Overclocking: What makes it limited? — Parallax Forums

Overclocking: What makes it limited?

Microcontrolled Posts: 2,461
edited 2012-01-08 16:24 in General Discussion
Hi, basically, I want to know what limits overclocking. Temperature? Voltage? If I were to overclock a chip, which of these would have to be adjusted in order to not destroy the chip?

Thanks,
Microcontrolled

Comments

  • Bits Posts: 414
    edited 2012-01-07 07:54
    The impurities in the silicone create resistance, etc. There's no such thing as pure silicone, so this makes for a natural speed limit. Also, design can affect speed.
  • Mike Green Posts: 23,101
    edited 2012-01-07 08:09
    Lots of things limit overclocking ... There are charts in the datasheet that show the relationships between operating voltage and maximum clock speed and between operating temperature and maximum clock speed.

    The amount of heat produced by the chip goes up the faster it goes (because it's the switching transitions that produce most of the heat and those happen more often).

    The amount of heat produced goes up as the supply voltage goes up and the chip is capable of running faster as the supply voltage goes up. The extra voltage shoves those little electrons around faster.
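
    As a rough rule of thumb, the dynamic (switching) power goes something like P ≈ a·C·V²·f, where a is the fraction of gates switching each cycle, C the switched capacitance, V the supply voltage and f the clock frequency, so raising the voltage and the clock together compounds the heat problem quickly.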

    You have to externally cool the chip (refrigerate it) as it generates extra heat. Heat tends to cause leakage of electrons from structures that should be insulators, tends to cause resistance to increase as the thermal vibrations interfere with electron movement, etc.

    The chip actually can run faster as it's cooled ... another reason to use external cooling.

    The structures on the chip (mostly the MOSFET gates) are built to withstand only a certain amount of voltage (about 4V in the case of the Prop I) without damage. As you raise the supply voltage, there's less and less margin between the logic levels and this physical limit. Eventually some gates get zapped and the chip stops working. One of the weak links seems to be the system clock multiplexor. It seems to be the first thing to fail when there are supply voltage surges across the chip.

    Some individual chips can run faster than others. There are slight variations in the chemistry and physics across the wafer and from wafer to wafer when the chip is manufactured. You can select for the faster chips by buying a bunch of them and running them through a test rig with an environmental chamber, an adjustable clock, and a fancy test program. I think the logic around the flags (C and Z) is the most clock sensitive and tends to give erroneous results first when overclocked.
  • Bits Posts: 414
    edited 2012-01-07 08:22
    I wish I could recall where I read that a CPU is the hottest device per square mm that man (*cough* woman) makes.

    I still say it all boils down to the semiconductor material being impure. Soon diamonds will replace silicone (a women's best friend, lol) and the speed limit will increase drastically.
  • mindrobots Posts: 6,506
    edited 2012-01-07 08:33
    ummmm.....siliCON, not siliCONE.......both may be hot and man and women's friends but their properties and uses are dramatically different! :smile:
  • ElectricAye Posts: 4,561
    edited 2012-01-07 08:46
    mindrobots wrote: »
    .....siliCON, not siliCONE....... their properties and uses are dramatically different! :smile:

    Don't feel bad. I've heard of Hollywood plastic surgeons getting those two materials mixed up, too.

    The result is that some movie stars awake in the hospital to find themselves stuck between a rock and a hard place.
  • mindrobots Posts: 6,506
    edited 2012-01-07 08:55
    Bah-dum-bum!

    Hey, folks, he's here all weekend. Try the veal, next time you're out!

    Seriously, you might want to read the Wikipedia article on overclocking; it points out a lot of the issues that Mr. Green addressed, and of course leads to an hour or two of lost time traded for gained knowledge.
  • Martin_H Posts: 4,051
    edited 2012-01-07 09:05
    Back in the late 80's I read some articles on the thermodynamics of computation. Basically IBM discovered that irrespective of the underlying technology, operations which destroy information create more heat than reversible operations. This linkage occurs because physical entropy and information entropy are linked. So in theory a computer instruction set that could run both ways (e.g. XOR) would generate much less heat than a normal instruction set that contained instructions like clear memory.
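
    For reference, the theoretical floor on the heat cost of destroying one bit (the Landauer limit) is roughly E = k·T·ln2, which at room temperature (k ≈ 1.38×10^-23 J/K, T ≈ 300 K) works out to about 2.9×10^-21 joules per bit erased; real logic dissipates many orders of magnitude more than that per operation, which is part of why reversible computing has stayed a curiosity.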

    This was thought to be an intellectual curiosity, and it was believed no such machine was possible because it would require an infinite number of states. This was conventional wisdom because physical and logical reversibility were thought to be linked. Well, research by Charles H. Bennett of IBM and Ed Fredkin of MIT implied they are not linked. Ed Fredkin is also the inventor of the Fredkin gate, which is a logically reversible gate.

    To my knowledge no computers have yet been built which reduce their waste heat via a reduction in information entropy created. But I imagine that it might become more relevant when we bump up against physical limits that bring an end to Moore's Law.
  • ElectricAye Posts: 4,561
    edited 2012-01-07 11:23
    Martin_H wrote: »
    ...operations which destroy information create more heat than reversible operations. This linkage occurs because physical entropy and information entropy are linked.....

    Amazing. That's more than a little freaky, in my humble opinion. That seems to suggest information is some kind of absolute thing rather than a relative thing requiring an observer to "get it". Or could this be an artifact of how "information" is defined in physical terms?
  • Microcontrolled Posts: 2,461
    edited 2012-01-07 11:32
    Thanks for the replies!
    My reason for this thread was because some time back I read an article, I don't know where, about someone who sped up his 2.1GHz processor to near 5GHz by using liquid nitrogen as a coolant. I started to consider just what the temperature has to do with the operation of a CPU, and I found this interesting video: http://www.youtube.com/watch?v=W8lS7KoOEeo

    I figured, why can't you just supercool a chip to triple its clock rate? Or better? What is the limitation on how much you can overclock? If you continue to cool the chip and raise the voltage, when will it stop increasing in speed?
    Also, I'm wondering if I could try this on a Propeller chip. If you gave it a 10MHz Xtal, stuck it in the freezer and applied 4.5V, would it speed up, or even survive?
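
    (In the usual 16x PLL mode that would mean 10 MHz × 16 = 160 MHz, double the rated 80 MHz, assuming the PLL would even lock that high.)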

    Thanks for the responses!
  • Martin_H Posts: 4,051
    edited 2012-01-07 12:26
    ElectricAye wrote: »
    Amazing. That's more than a little freaky, in my humble opinion. That seems to suggest information is some kind of absolute thing rather than a relative thing requiring an observer to "get it". Or could this be an artifact of how "information" is defined in physical terms?

    ElectricAye, it is freaky, and at first I thought like you did that information is an artifact of a physical system, so it was a byproduct of the physical process. But then I read more about quantum mechanics, the double slit, and quantum eraser experiments. In these experiments, having information about a physical system affects the outcome of the physics.

    There's way too much detail to do it justice here, so I suggest you read about them if this interests you. But the bottom line is that information plays a key role in many physical systems and it is a thing in its own right.
  • localroger Posts: 3,452
    edited 2012-01-07 12:45
    Bottom line: it's all ones and zeroes.

    But the ones and zeroes do go faster when immersed in liquid nitrogen.
  • ElectricAye Posts: 4,561
    edited 2012-01-07 15:02
    Martin_H wrote: »
    ...But the bottom line is that information ...is a thing in its own right.

    Wow, this seriously freaks me out. It seems to have all kinds of profound implications. It makes me wonder what the implications are to evolution, for example, especially the evolution of the brain. It seems to add some interesting flourishes to the old question of how mathematics seems so good at representing the natural world. Is mathematics merely an invention of ours or does it seem to tap into some "thing out there", the old arguments of Platonic forms and such. I'll definitely have to read up on this. Probably way over my head, I'm guessing.
  • frank freedman Posts: 1,983
    edited 2012-01-07 17:17
    Mike Green wrote: »
    Lots of things limit overclocking ... There are charts in the datasheet that show the relationships between operating voltage and maximum clock speed and between operating temperature and maximum clock speed.

    ...

    The chip actually can run faster as it's cooled ... another reason to use external cooling.

    Agreed that cooling a chip can help to a point. The other problem with trying to push a device too far beyond its rated speed has to do with the structures making up the device. The gates of all the logic in a chip are basically capacitors. The higher the frequency you drive a device at, the greater the capacitive effects will be. There is also a finite time for the channel under the gate to react to the field on the gate. So, even if you could supercool the device, you will still hit the point where all the parasitic capacitances and channel switching times cause the device to go unstable. That's part of what I understand to be the reason for ever-smaller geometries: reduction in parasitic capacitance effects, faster gate switching times, etc.
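
    Roughly speaking, each gate's delay behaves like an RC charging time, t ≈ R·C, where R is the driving transistor's on-resistance and C is the gate and wiring capacitance it has to charge; the longest chain of such delays between clocked elements sets the maximum clock. Cooling lowers R somewhat but does nothing for C, which is one reason shrinking the geometry buys far more speed than refrigeration does.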

    Devices have come a long way from the hand-drawn cells that were once digitized and replicated to create the logic for P-channel metal-gate ICs.

    Frank
  • Martin_H Posts: 4,051
    edited 2012-01-07 17:44
    ElectricAye wrote: »
    Wow, this seriously freaks me out. It seems to have all kinds of profound implications.... I'll definitely have to read up on this. Probably way over my head, I'm guessing.

    Indeed quantum mechanics could have profound impact if we had an interpretation that made sense. So read up on it and don't worry if you are confused. As Richard Feynman said, "I think I can safely say that nobody understands quantum mechanics."
  • frank freedman Posts: 1,983
    edited 2012-01-07 19:47
    Martin_H wrote: »
    Indeed quantum mechanics could have profound impact if we had an interpretation that made sense. So read up on it and don't worry if you are confused. As Richard Feynman said, "I think I can safely say that nobody understands quantum mechanics."

    Try the title "In Search of Schroedinger's Cat"

    Frank
  • SRLM Posts: 5,045
    edited 2012-01-07 21:48
    The replies given so far deal with the speed of a chip already made, with respect to the physical transistor layer of the hardware.

    But clock speed is also determined at design time, at the RTL or gate level, by something called "the critical path". Since a gate takes some non-zero, finite amount of time to switch from 1 to 0 or 0 to 1, your design will have to wait some amount of time for the input to a chain of gates to propagate through to the output. If you try to go faster, your gates will end up in indeterminate states.

    Since the device has a clock (it's not asynchronous) the computer has a state represented in "memory" (registers, caches, ram, etc.). The basic way a computer works is to take a number from memory, do a computation, and store it back into memory. Each of those portions takes some non-zero amount of time since they are made up of gates. On a simple computer, the clock will tick and then each of those stages will happen in sequence. The sum of those three stages is the critical path: the computer cannot go any faster than the time it takes to do those three things.

    This is a basic rule of all silicon devices, and cannot be eliminated.

    There are, however, optimizations that can be done, the most important of which is pipelining. Say you were to take the three stages ("take number from memory", "do computation", "store result") and make it so that each stage is 'chunked' off from the others into its own clock cycle*. Then the entire run-through will take three cycles total, but look! Now your critical path is 1/3 the size**, so your clock can go three times as fast. In addition, you can now "fill the pipeline" by pushing through a new computation on each clock cycle. So in our hypothetical processor we can be doing three computations at once: 1st (storing the result), 2nd (doing computation), 3rd (getting number). Thus we have increased throughput at no additional cost.

    The Propeller has a 2 (or 3?) stage pipeline; mainstream chips can have 30 or more stages.

    The critical path is one of the most important concepts in computer architecture, and if you understand this then everything else in computer architecture is easy. The critical path is 'critical' for determining clock speed.

    *This is done with registers between each stage to store intermediate results.
    **Ok, technically it's the size of the largest critical path of the stages, but it's just details...
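
    A quick numeric sketch of the same idea, with made-up stage delays (in nanoseconds), just to show the arithmetic:

    # Rough illustration of the critical-path idea; the delays are invented.
    stage_delays_ns = {"fetch": 4.0, "compute": 6.0, "store": 5.0}

    # Unpipelined: one clock period must cover all three stages back to back.
    unpipelined_period = sum(stage_delays_ns.values())        # 15 ns
    print("unpipelined f_max ~ %.1f MHz" % (1000.0 / unpipelined_period))

    # Pipelined: the period only has to cover the slowest single stage
    # (ignoring register overhead), because the stages overlap.
    pipelined_period = max(stage_delays_ns.values())          # 6 ns
    print("pipelined f_max ~ %.1f MHz" % (1000.0 / pipelined_period))

    The pipelined clock can only ever be as fast as the slowest stage, which is why real designs try to balance the stage delays.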
  • ElectricAye Posts: 4,561
    edited 2012-01-07 22:18
    Martin_H wrote: »
    ...As Richard Feynman said, "I think I can safely say that nobody understands quantum mechanics."

    I read Feynman's book QED some months ago, and if I remember correctly, his attitude was: if we can get the math to work out correctly, then we understand the system as well as possible. I know there are a lot of physicists who subscribe to that school of thought, but it always leaves me wondering whether or not it's sometimes just an elegant, if not merely convenient, curve fit.
    Try the title "In Search of Schroedinger's Cat"

    ...

    Thanks, I'll take a look at it next time I'm at the library. I'm fairly sure I've seen it on the shelf there.
  • Heater. Posts: 21,230
    edited 2012-01-08 05:32
    Electric Aye.
    ...it always leaves me wondering whether or not it's sometimes just an elegant, if not merely convenient, curve fit.

    Curve fit! Curve fit! What, I wasted years of my life studying physics and it's all only an exercise in curve fitting?

    Curve fitting normally goes like this:
    1) Take a whole bunch of measurements of some physical phenomenon or other and end up with, say, some kind of x-y plot of data points.
    2) Get some function for a curve you could conceivably run through those points in a nice smooth way. A straight line or a polynomial or whatever.
    3) Find a bunch of values for the coefficients of that function that make it a "good fit" to the data points you have collected.

    When done, you can now use that function to interpolate values of your phenomenon.
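
    As a throwaway sketch of that recipe (nothing physical here, just fitting a quadratic to made-up noisy points with numpy):

    import numpy as np

    # 1) "Measurements": synthetic noisy data along some unknown curve.
    x = np.linspace(0.0, 10.0, 50)
    y = 3.0 * x**2 - 2.0 * x + 1.0 + np.random.normal(scale=5.0, size=x.size)

    # 2) Pick a family of curves: here a 2nd-degree polynomial.
    # 3) Find coefficients that make it a "good fit" (least squares).
    coeffs = np.polyfit(x, y, 2)

    # Now the fitted function can interpolate values of the phenomenon.
    print("fitted coefficients:", coeffs)
    print("value at x = 4.5:", np.polyval(coeffs, 4.5))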

    Conceivably all of physics could be done like that. Let's say we had some universal function U that could be made to "fit" any and all measured phenomena just by finding appropriate coefficients for it. Then U is all we would ever need to describe any phenomena and make some kind of predictions from measurements.

    All of physics and science would just boil down to a search for the coefficients of U to use in any situation. No matter if U is not any meaningful description of anything. It can just be made to fit.

    But wait, that search for coefficients for U would be pretty hard. There are an infinite number of settings for them, and perhaps an infinite number of them. We would probably try to narrow the search by understanding something of the phenomena we are dealing with and the nature of how the function U behaves.

    That's no good. And besides, the function U does not carry any meaning; it's just a function that can be made to fit the data, like we can fit a polynomial to a bunch of random points. There is no satisfaction in that.

    No, it does not work like that. All that maths in physics is there for a reason. It's a model. It has descriptive and predictive power. That math is a language trying to communicate some understanding of what is going on.

    Perhaps only a model. But a model based on some kind of observation and reasoning. Not just a curve fit.

    OK, I have to admit that during my studies I have also had that "curve fit" feeling. That was normally the point at which I had ceased to understand what they were trying to say. :)
  • localroger Posts: 3,452
    edited 2012-01-08 06:42
    If you wanted to build a computer capable of simulating the observable universe, it would need about 10^80 bits of memory to store the position and velocity of every particle to the precision at which Heisenberg allows us to observe them. While that is a very large number, what I have always found interesting about it is that it's not infinity.

    Now it's obvious that the universe tells us a very consistent story about how it works, a story that we have learned to read in terms of math and measurement. And we assume that that story is the emergent property of a lot of particles interacting via a few simple rules.

    But parts of that story are, for want of a better word, crazy. The story about stars and planets and cosmic background radiation and gravity and electromagnetic waves is pretty sensible. But the story about quantum states, orbitals, quarks, and bosons is pretty ... different ... and really doesn't seem like it was told by the same guy who wrote the one about planets and stars.

    It is a theoretical possibility that the story told by the universe isn't emergent but is deliberate, just as the physics rules in a video game don't actually emerge from the behavior of atoms and molecules. In philosophical circles this is called the "simulation paradox." How do we know the universe isn't actually a deliberately constructed fake? The answer, if it can lie to us with sufficient verisimilitude, is that we wouldn't.

    But the significance is manifold. For one thing, a simulated universe could be simulated with a much smaller computer -- tens of orders of magnitude smaller. After all, it's only in vanishingly rare situations that we actually observe individual atomic-scale particles. Nearly everything in ordinary experience is the result of bulk processes. But more importantly, in a universe where the laws of physics are not emergent they could be changed. There might be exceptions. And if the universe is lying to us with sufficient verisimilitude then science would be helpless to investigate such things, because the fundamental assumption of the scientific method is that the behavior of the universe is consistent.

    And while the usefulness of science is undeniable, it is a clear fact that the majority of humans believe in a situation much better described by the simulation than the emergent reality. It's true that they don't understand all of the implications, but a universe where miracles are possible is a universe where the story science tells us is a lie.
  • Martin_H Posts: 4,051
    edited 2012-01-08 07:28
    Localroger, that is really Plato's Cave or René Descartes' evil demon. Sextus ideas were incorporated into science as methodological naturalism. Basically knowing how things appear to be is useful, even if that is ultimately not how they are.
  • ElectricAye Posts: 4,561
    edited 2012-01-08 10:22
    Heater. wrote: »
    ...
    1) Take a whole bunch of measurements of some physical phenomenon or other and end up with, say, some kind of x-y plot of data points.
    2) Get some function for a curve you could conceivably run through those points in a nice smooth way. A straight line or a polynomial or whatever.
    3) Find a bunch of values for the coefficients of that function that make it a "good fit" to the data points you have collected.....

    For sure, I would classify the above as a non-elegant curve fit.
    By contrast, F=ma might be considered an elegant curve fit. After all, F=ma only means something within a certain range of values. Outside that range, what does it mean?
    Heater. wrote: »
    ...That math is a language trying to communicate some understanding of what is going on...

    That certainly appears to be true. But that's where my nervous breakdown occurs. While it does seem to be a language, whose language is it? Is it something our human brains have devised to curve fit the cosmos, or does it actually have some deeper connection to the world "out there"? It's a debate that's gone on since Plato and probably even before then. It's baffling and mysterious. Why should brains that evolved during the last ice age have anything to say about the deep structure of the cosmos?
  • Heater. Posts: 21,230
    edited 2012-01-08 11:29
    I would still make a big distinction between:
    1) Curve fitting, (finding coefficients that make some function fit the facts regardless of any meaning attached to that function)
    2) Finding a function that describes the situation with some grounding in reasoning from observations.

    It's a wonderful debate. Often the mathematics exists and is thought about for ages with no practical use at all. Just an amusement for mathematicians if you will. Then one day it turns out that that maths is exactly what fits some physical phenomena (within some domain, as you point out). That does seem magical and mysterious.

    As for the human brain: perhaps having ideas about the deep structure of the cosmos is not actually much different from a chimp realizing he can smash a hard nut with a big rock. The fact that we think that way has allowed us to survive without any big teeth or claws or other defenses, and in all kinds of harsh environments. The fact that the thinkers survived causes us as a race to continue to think that way.
  • ElectricAye Posts: 4,561
    edited 2012-01-08 13:44
    Heater. wrote: »
    ...Perhaps having ideas about the deep structure of the cosmos is not actually much more different than a chimp realizing he can smash a hard nut with a big rock.....

    Yeah, but it seems to me that the idea that these nut-busting neural circuits can somehow extend to predicting the existence of antimatter via pencil and paper is justification for one of the following:

    1.) philosophical revolution/existential crises/psychological upheaval/ loss of faith in all things academic.

    2.) having another beer.

    3.) running naked through the woods and howling at the moon.

    4.) all of the above.
  • Heater. Posts: 21,230
    edited 2012-01-08 14:33
    ElectricAye.

    I have to relate a personal story pertaining to this mysterious mathematical/philosophical debate. Something that I found very deep, mysterious and even disturbing...

    I'm taking a 3-hour physics paper at university. It's the last paper after years of study. It's a general question paper; the questions are not related to any particular course we have been taking. It's the notorious "Paper 13". I only have to answer 3 questions, one from each of three sections: the easy section, the hard section and the impossible section. The pressure is on: having been diverted by the joys of youth for a long while, I was convinced I had failed pretty much everything else that year. A severe dose of flu does not help the situation.

    Well, section one was easy. The rest I looked at in blind terror, not understanding the questions let alone any possible answers. I scanned section two a hundred times; it was hopeless. I gave up. Feeling rather sick, I did not leave the examination hall, just sat there sharpening pencils and doodling.

    Then it happened... The solution to one of the section two questions jumped into my mind, an equation we were supposed to derive. I had not derived it. It just appeared in my mind. I KNEW in my guts that it must be right. But what to do? Writing down the equation is not a sufficient answer; you have to show your derivation.

    Well, after an hour of fiddling around with it I managed to connect it, working backwards, to the question. Then I wrote down the derivation in the expected order as my answer. It was right!

    To this day I wonder "How the hell did that happen?" or "Where did that come from?" or "What actually happened at that moment?". Somehow my "nut-busting neural circuits" or subconscious threw up the correct answer with no apparent rhyme or reason that I was aware of.

    I sometimes think that mathematicians have this experience all the time. That maths does not move from equation A to equation Z as presented in the text books. Rather, solutions pop into minds along with that "it must be right" feeling. The poor old mathematician then has to spend years finding the proof. Effectively working backwards.

    Sadly that has only happened to me that one time.

    It's enough to convince me that physics is not just "curve fitting" anyway.
  • ElectricAye Posts: 4,561
    edited 2012-01-08 15:19
    Heater. wrote: »
    ....

    I have to relate a personal story pertaining to this mysterious mathematical/philosophical debate. ....

    A fascinating account. If I remember correctly, that's how Einstein came up with one of his famous ideas. I can't remember which one, but scholars found his notes "working backwards" from the original concept. It was as though he saw the answer, then had to backtrack so his mathematics could essentially lead other people from what was already known to whatever this new idea was.

    Also, I think Dirac had a similar experience. And I think Feynman even laughs about this in his QED.

    I had a similar thing happen to me in college but it had to do with mechanical design. One night I had the solution to a problem pop into my mind even though I wasn't even thinking about it at the time. Strangest thing is this: I could not see the solution but I just knew that I knew the answer. It was as though my emotions knew the answer but my logical mind couldn't see it. It was similar to having somebody's name on the tip of your tongue. I kept trying to work on what I knew was the answer, doodling and sketching like crazy, but I couldn't get it. Then, finally, after a full night of struggling with it, I could visualize the solution in 3-D. It was obvious that my 2-D sketches were on to it, but the lack of a third dimension on my paper kept me stuck. After I told my graduate school colleagues how the thing worked, they assured me that it was physically impossible. They were having the same problem I had suffered when it came to visualizing it. So it wasn't until I built a quick 3-D model of it that it made any sense to them.

    But as to the debate about math being a curve fit or not: I confess that I'm still on the fence about that one. Sometimes I'm sure that mathematics must have some deep connection to nature. But at other times, I'm sure that if we were to ever encounter an alien intelligence, that alien organism wouldn't have a clue how our mathematical inventions do what they do.
  • Humanoido Posts: 5,770
    edited 2012-01-08 15:53
    There are parts to the human mind that we don't yet understand. When encountering a technically challenging problem at work that no one could solve, I merely went home, reviewed the data before bedtime, and dreamed the solution. When I awoke, I wrote down the complete solution. That's why I always kept a spiral bound notepad and pen at the side of the bed. In recent years, I have noticed this effect can be invoked by techniques of relaxation such as "mental Tai Chi" to open the mind. To those who are aware and can harness this technique, the overclocking mind results will be priceless. It can lead to more solutions, increased wages, advancement in the workplace, better decision results, and higher levels of creativity. I'm looking at incorporating an effect similar to this in a machine brain that can dream.
  • localroger Posts: 3,452
    edited 2012-01-08 16:24
    Martin_H wrote: »
    Sextus ideas were incorporated into science as methodological naturalism. Basically knowing how things appear to be is useful, even if that is ultimately not how they are.

    Well duh. I am driven by the idea that so much of what bothers us is just backwash. Higgs Boson? Really?

    The whole atomic bomb thing has conditioned us to think physics is the most importantist thang evar. That's even what got my father into the lab where I grew up and got my own grounding in the reality of it all. But the most important thing in Dad's lab wasn't actually the Californium nuclear source, at least to me. It was the HP2100A computer.