April 19th marks 50 years since Gordon Moore first penned what is now known as "Moore's Law."
It's the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit has doubled approximately every two years.
In many ways, it's a bit of a mangling of Koomey's Law, which states that the number of computations per joule of energy dissipated has been doubling approximately every 1.6 years.
Koomey's Law has the appeal of being defined in terms of basic physics, rather than in terms of a particular technological artefact. Hence I personally prefer Koomey's Law, even though Moore's Law is far, far more famous.
There's another interesting aspect to Koomey's Law: it hints at an answer to the question "for how long can this continue?" ... with the hinted answer being "until about 2050."
By about 2050, computations will require so little energy that they will run up against a fundamental thermodynamic constraint: Landauer's Principle, which sets a minimum dissipation of k·T·ln 2 per bit of information erased.
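The Landauer bound is simple arithmetic, and so is the extrapolation. A minimal sketch in Python, assuming room temperature (300 K), Koomey's reported doubling time of about 1.57 years, and a present-day efficiency figure that is purely illustrative (not a measured value):

```python
import math

# Landauer's Principle: erasing one bit dissipates at least k_B * T * ln(2).
K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T_ROOM = 300.0       # assumed operating temperature, K

landauer_j_per_bit = K_B * T_ROOM * math.log(2)
print(f"Landauer limit at 300 K: {landauer_j_per_bit:.3e} J per bit")
# roughly 2.87e-21 J per bit

# Koomey-style extrapolation: how many doublings until an assumed
# current efficiency hits the Landauer ceiling?
DOUBLING_TIME_YEARS = 1.57      # Koomey's reported doubling time
current_ops_per_joule = 1e14    # hypothetical figure, for illustration only

ceiling_ops_per_joule = 1.0 / landauer_j_per_bit
doublings_left = math.log2(ceiling_ops_per_joule / current_ops_per_joule)
years_left = doublings_left * DOUBLING_TIME_YEARS
print(f"Doublings remaining: {doublings_left:.1f} (~{years_left:.0f} years)")
```

Plugging in a better-sourced baseline efficiency moves the projected date, which is exactly why estimates cluster around mid-century rather than landing on a single year.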
For levity, some might be inclined to pen "the number of silly people causing distraction by predicting the end of Moore's Law doubles every eighteen months," and call it Hruska's Law.
It's probably well worth becoming familiar with Rock's Law (the cost of a semiconductor fab doubles roughly every four years), which is where POET offers the foundries some real economic payback.
GLAL,
R.
