-640K- 2^256 Bytes of Memory is More than Anyone Would Ever -Need- Get

By Sergey Ignatchenko

Overload, 20(112):14-15, December 2012


How fast can computers get? Sergey Ignatchenko provides us with some upper limits.

Disclaimer: as usual, the opinions within this article are those of ‘No Bugs’ Bunny, and do not necessarily coincide with the opinions of the translator or the Overload editor. Please also keep in mind that translation difficulties from Lapine (like those described in [Loganberry04]) might have prevented providing an exact translation. In addition, both the translator and Overload expressly disclaim all responsibility for any action or inaction resulting from reading this article.

There is a famous misquote, commonly and erroneously attributed to Bill Gates: “640K of memory is all that anybody with a computer would ever need.” Apparently, Gates himself has denied ever saying anything of the kind [Wired97]. Reportedly, he went even further, saying “No one involved in computers would ever say that a certain amount of memory is enough for all time.” [Wired97] Well, I, ‘No Bugs’ Bunny, am involved in computers, and I am saying that while there can be (and actually is) a desire to get as much memory as possible, physics will certainly get in the way and will restrict any such desire.

Moore’s Law vs Law of Diminishing Returns

What goes up must come down
~ proverb

There is a common perception in the computer world that all the current growth in hardware will continue forever. Moreover, even when such growth is exponential, it is still expected to continue forever. One such example is Moore’s Law; originally (as early as 1965, see [Moore65]) Moore was referring to doubling the complexity of integrated circuits every year for the next 10 years, i.e. until 1975 (!). In 1975, Moore adjusted his prediction to doubling complexity every two years [Moore75], but again didn’t go further than 10 years ahead in his predictions. As it happens, Moore’s Law has stood for much longer than Moore himself predicted. It has been a great thing for IT and for everybody involved in IT, there is no doubt about it. With all the positives of these improvements in hardware, there is one problem with such a trend though – it has led to the perception that Moore’s Law will stand forever. Just one recent example – in October 2012, CNet published an article arguing that this trend will continue for the foreseeable future [CNet12]; in particular, they quoted the CTO of Analog Devices, who said: “Automobiles and planes are dealing with the physical world. Computing and information processing doesn't have that limitation. There's no fundamental size or weight to bits. You don't necessarily have the same constraints you have in these other industries. There potentially is a way forward.”

There is only one objection to this theory; unfortunately, the objection is that the theory is completely wrong. In general, it is fairly obvious that no exponential growth can continue forever; still, such considerations alone cannot tell us how long the growth will continue. In practice, to get any reasonable estimate, we need to resort to physics. In 2005, Moore himself said: “In terms of size [of a transistor] you can see that we’re approaching the size of atoms which is a fundamental barrier, but it’ll be two or three generations before we get that far – but that’s as far out as we’ve ever been able to see.” [Moore05] Indeed, 22nm technology already has transistors which are just 42 atoms across [Geek10]; and without going into very different (and as yet unknown) physics, one cannot possibly go below 3 atoms per transistor.
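As a quick sanity check of that ‘42 atoms across’ figure, here is a back-of-envelope sketch in Python; the silicon lattice constant of roughly 0.543nm is my assumption, not a figure from [Geek10], and the exact count depends on crystal orientation:

# Rough check of the '42 atoms across' claim for 22nm technology.
feature_nm = 22.0
lattice_nm = 0.543   # assumed silicon lattice constant
print(feature_nm / lattice_nm)   # ~40.5, in line with 'about 42 atoms'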

Dangers of relying on exponential growth

Anyone who believes exponential growth can go on forever in a finite world is either a madman or an economist.
Kenneth Boulding, economist

Around 2000, Moore’s Law was commonly (re)formulated in terms of CPU frequency doubling every 2 years (it should be noted that this is not Moore’s formulation, and that he shouldn’t be blamed for it). In 2000, Intel made a prediction that by 2011 there would be 10GHz CPUs out there [Lilly10]; as we can see now, this prediction has failed miserably: currently there are no CPUs over 5GHz, and even the only 5GHz one – POWER6 – is not produced by Intel. Moreover, even IBM, which did produce the POWER6 at 5GHz, capped its next-generation POWER7 CPU at 4.25GHz. With modern Intel CPUs, even the ‘Extreme Edition’ i7-3970X is a mere 3.5GHz, with temporary Turbo Boost up to 4GHz (see also an extremely enthusiastic article in PC World, titled ‘New Intel Core I7 Extreme Edition chip cracks 3GHz barrier’ [PCWorld12]; the only problem is that it was published in 2012, not 2002). In fact, Intel CPU frequencies have decreased since 2005 (when the Pentium 4 HT 672 was able to sustain a frequency of 3.8GHz).

One may say, “Who cares about frequencies with all the cores around?” – and while there is some point in such a statement (though there are many tasks out there where performance-per-core is critical, and increasing the number of cores won’t help), it doesn’t change the fact that back in 2000 nobody expected that in just 2 years all CPU frequency growth would hit a wall, and that frequencies would stall, at least for a long while.
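To see just how far off the exponential extrapolation went, here is a minimal sketch; the 1.5GHz starting point at the end of 2000 (roughly the first Pentium 4) is my assumption for illustration:

# Clock speed under 'doubling every two years', versus reality.
base_ghz, base_year = 1.5, 2000   # assumed starting point
for year in (2002, 2005, 2011):
    extrapolated = base_ghz * 2 ** ((year - base_year) / 2)
    print(f"{year}: {extrapolated:.0f}GHz")
# Prints 3GHz for 2002, 8GHz for 2005 and 68GHz for 2011 –
# against a real-world plateau of roughly 3.5-5GHz.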

It is also interesting to observe that while there is an obvious physical limit to frequencies (300GHz is already commonly regarded as the border of the infra-red optical range, with obviously different physics involved), the real limit came much earlier than the point where optical effects start to kick in.

Physical limit on memory

The difference between stupidity and genius is that genius has its limits.
Albert Einstein

As we’ve seen above, exponential growth is a very powerful thing in the physical world. When speaking about RAM, we’ve got used to the address bus width (and address space) doubling once in a while, so after the move from 16-bit CPUs to 32-bit ones (which happened for mass-market CPUs in the mid-80s) and the more recent move from 32-bit CPUs to 64-bit ones, many started to expect that 128-bit CPUs will be around soon, and then 256-bit ones, and so on. Well, it might or might not happen (it is more about waste and/or marketing, see also below), but one thing is rather clear – 2^128 bytes is an amount of memory which one cannot reasonably expect in any home device, with physics being the main limiting factor. Let’s see – one cubic cm of silicon contains around 5×10^22 atoms. It means that even if every memory cell is only 1 atom large, it will take 2^128×8/(5×10^22) cm^3 of silicon to hold all that memory; after calculating it, we’ll see that 2^128 bytes of memory will take approximately 54 billion cubic metres (or 54 cubic kilometres) of silicon. With other (non-silicon-based) technologies (such as HDDs) the numbers will be a bit different, but the amount of space necessary to store such memory will still run to cubic kilometres – and this is under the absolutely generous assumption that one atom is enough to implement a memory cell.
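For those who want to check the arithmetic, here is the same calculation as a few lines of Python (chosen here simply because it handles 2^128 natively; the figures are the ones from the paragraph above):

# Silicon volume needed for 2^128 bytes at one atom per bit.
bits = 2**128 * 8                # 2^128 bytes, 8 bits each
atoms_per_cm3 = 5e22             # atoms in 1 cm^3 of silicon
volume_cm3 = bits / atoms_per_cm3
volume_m3 = volume_cm3 / 1e6     # 10^6 cm^3 per m^3
volume_km3 = volume_m3 / 1e9     # 10^9 m^3 per km^3
print(f"{volume_m3:.3g} m^3 = {volume_km3:.3g} km^3")
# ~5.4e+10 m^3, i.e. ~54 cubic kilometres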

To make things worse, if we’re speaking about RAM sizes of 2^256 bytes, we’ll see that implementing them even with 1 atom/cell will take about 10^78 atoms. Earth as a planet is estimated to contain only 10^50 atoms, so it would take ten billion billion billions of planets like Earth to implement a mere 2^256 bytes of memory. The solar system, with its 10^57 atoms, still won’t be enough: the number we’re looking for is close to the number of atoms in the observable universe (which is estimated at 10^79–10^80). In other words – even if every memory cell could be represented by a single atom, we would need 1 to 10% of all the stars and planets which we can see (with most of them being light years away) to implement 2^256 bytes of memory. Honestly, I have serious doubts that I will live until such a thing happens.
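And the same sanity check for 2^256 bytes, again under the generous one-atom-per-bit assumption:

# Atoms needed for 2^256 bytes at one atom per bit, compared with
# rough estimates for the Earth and the observable universe.
atoms_needed = 2**256 * 8         # ~9.3×10^77 atoms
earth_atoms = 10**50              # rough estimate for the whole planet
universe_atoms = 10**80           # upper estimate, observable universe
print(f"Earths needed: {atoms_needed / earth_atoms:.3g}")             # ~9.3e+27
print(f"share of the universe: {atoms_needed / universe_atoms:.1%}")  # ~0.9%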

On physics and waste of space

Architecture is the art of how to waste space.
Philip Johnson

It should be noted that the analysis above is based on two major assumptions. First, we are assuming that our understanding of physics does not change in a drastic manner. Obviously, if somebody finds a way to store terabits within a single atom, things will change (it doesn’t look likely in the foreseeable future, especially taking the uncertainty principle into account, but strictly speaking, anything can happen). The second assumption is that, when speaking about address space, we are implicitly assuming that address space is not wasted. Of course, it is possible to use as much as a 1024-bit address space to address a mere 64K of RAM, especially if such an address space is allocated in a manner similar to the allocation of IPv4 addresses in the early days (“here comes IBM, let’s allocate them a small portion of the pool – just a class A network, or 1/256 of all IP addresses”). If there is a will to waste address space (which can be driven by multiple factors – from the feeling that the space is infinite, as was the case in the early days of IPv4 addresses, to the marketing appeal of selling CPUs based on the perception that a 128-bit CPU is better than a 64-bit one just because the number is twice as big) – there will be a way. Still, our claim that ‘2^256 bytes of memory is not practically achievable’ stands even without this second assumption. In terms of the address bus (keeping in mind that an address bus is not exactly the same as an address space, and still relying on the first assumption above), it can be restated as ‘a 256-bit address bus is more than anyone would ever need’.

References

[CNet12] Stephen Shankland, ‘Moore’s Law: The rule that really matters in tech’, CNet, Oct 2012, http://news.cnet.com/8301-11386_3-57526581-76/moores-law-the-rule-that-really-matters-in-tech/

[Lilly10] Paul Lilly, ‘Where are Intel’s 10GHz Processors Hiding?’, Maximum PC, 2010, http://www.maximumpc.com/article/news/where_are_intels_10ghz_processors_hiding

[Loganberry04] David ‘Loganberry’, ‘Frithaes! – an Introduction to Colloquial Lapine!’, 2004, http://bitsnbobstones.watershipdown.org/lapine/overview.html

[Moore65] Gordon Moore, ‘Cramming more components onto integrated circuits’, Electronics Magazine, 1965

[Moore75] Gordon Moore, ‘Progress In Digital Integrated Electronics’, IEEE speech, 1975

[Moore05] Manek Dubash, ‘Moore’s Law is dead, says Gordon Moore’, TechWorld, 2005, http://news.techworld.com/operating-systems/3477/moores-law-is-dead-says-gordon-moore/

[PCWorld12] ‘New Intel Core I7 Extreme Edition chip cracks 3GHz barrier’, PC World, Sep 2012, http://www.pcworld.com/article/261873/new_intel_core_i7_extreme_edition_chip_cracks_3ghz_barrier.html

[Wired97] John Katz, ‘Did Gates Really Say 640K is Enough For Anyone?’, Wired, 1997

Acknowledgement

Cartoon by Sergey Gordeev from Gordeev Animation Graphics, Prague.





