
Koomey's law

From Wikipedia, the free encyclopedia


Computations per kWh, from 1946 to 2009

Koomey's law describes a long-term trend in the history of computing hardware: the number of computations per joule of energy dissipated has been doubling approximately every 1.57 years. This trend has been remarkably stable since the 1950s (R² of over 98%) and has been somewhat faster than Moore's law. Jonathan Koomey articulated the trend as follows: "at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half."[1]
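
The trend can be written compactly as an exponential (a restatement of the doubling claim above, not a formula taken from the cited paper): with t₀ a reference year,

$$\frac{\text{computations per joule at year } t}{\text{computations per joule at year } t_0} \;\approx\; 2^{(t-t_0)/1.57}.$$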


Implications

The implication of Koomey's law is that the amount of battery needed for a fixed computing load will fall by a factor of 100 every decade.[2] As computing devices become smaller and more mobile, this trend may be even more important than improvements in raw processing power for many applications. Energy costs are also becoming an increasing factor in the economics of data centers, adding to the importance of Koomey's law.
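
The factor of 100 follows from the doubling time quoted above (a back-of-the-envelope restatement, not a figure taken from the sources): a decade contains roughly 10/1.5 ≈ 6.7 doublings at the "every year and a half" rate, so

$$2^{10/1.5} \approx 2^{6.7} \approx 100,$$

while the more precise 1.57-year doubling time gives a factor closer to $2^{10/1.57} \approx 80$.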

History

Koomey was the lead author of the article in IEEE Annals of the History of Computing that first documented the trend.[1] At about the same time, Koomey published a short piece about it in IEEE Spectrum.[3]

The trend was further discussed in MIT Technology Review,[4] in a post on the "Economics of Information" blog,[2] and by The Economist online.[5]

The trend was previously known for digital signal processors, where it was named "Gene's law" after Gene Frantz, an electrical engineer at Texas Instruments, who had documented that power dissipation in DSPs had been reduced by half every 18 months over a 25-year period.[6][7]

The end of Koomey's law

By the second law of thermodynamics and Landauer's principle, irreversible computing cannot continue to be made more energy efficient forever. As of 2011, computers had reached about 0.00001% of the efficiency allowed by the Landauer bound.[8] Assuming that the energy efficiency of computing continues to double every 1.57 years, the Landauer bound will be reached around 2048. Thus, after about 2048, Koomey's law can no longer hold for conventional (irreversible) computing.
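
The 2048 date is a straightforward extrapolation. A minimal sketch of the arithmetic in Python, assuming the roughly 0.00001% (i.e. 10⁻⁷) starting efficiency and the 1.57-year doubling time quoted above:

    import math

    # Assumed inputs, taken from the figures quoted in this section rather than
    # from the underlying sources: 2011 efficiency is ~1e-7 of the Landauer bound,
    # and efficiency doubles every 1.57 years (Koomey's law).
    fraction_of_landauer_2011 = 1e-7
    doubling_time_years = 1.57

    doublings_needed = math.log2(1 / fraction_of_landauer_2011)  # ~23.3 doublings
    years_needed = doublings_needed * doubling_time_years        # ~36.5 years
    print(round(2011 + years_needed))                            # prints 2048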

Landauer's principle, however, does not apply to reversible computing. Even then, computational efficiency is still bounded, by the Margolus–Levitin theorem. Under that bound, Koomey's law has the potential to remain valid for about 125 years.[citation needed]
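
For reference, the Margolus–Levitin theorem states that a system with average energy E above its ground state needs at least πħ/(2E) seconds per elementary operation, capping the rate at

$$\frac{2E}{\pi\hbar} \approx 6\times 10^{33}\ \text{operations per second per joule};$$

the 125-year figure quoted above corresponds to extrapolating Koomey's law toward this limit instead.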


References

  1. ^ a b Koomey, Jonathan; et al. (March 29, 2010). "Implications of Historical Trends in the Electrical Efficiency of Computing". IEEE Annals of the History of Computing. 33 (3): 46–54. doi:10.1109/MAHC.2010.28. ISSN 1058-6180.
  2. ^ a b "Is Koomey's Law eclipsing Moore's Law?". Economics of Information blog.
  3. ^ Koomey, J. G. (February 26, 2010). "Outperforming Moore's Law". IEEE Spectrum.
  4. ^ Greene, Kate (September 12, 2011). "A New and Improved Moore's Law". MIT Technology Review.
  5. ^ "Computing power – A deeper law than Moore's?". The Economist online. October 10, 2011.
  6. ^ Farncombe, T.; Iniewski, K. (2013). Medical Imaging: Technology and Applications. §1.7.4 Power Dissipation, pp. 16–18. CRC Press.
  7. ^ Frantz, G. (2000). "Digital signal processor trends". IEEE Micro. 20 (6): 52–59.
  8. ^ "Tikalon Blog by Dev Gualtieri". Tikalon.com. Retrieved July 2, 2015.
