Gordon Moore's famous law about the doubling of transistor density and power every two years will not only end, it could bring economic disaster in its wake, the respected scientist Michio Kaku predicts in a new book, Physics of the Future.
Kaku sets out the crunch moment as the point at which ultraviolet light can no longer be tuned to etch ever smaller circuits on to silicon wafers, which on current trends will arrive less than a decade from now. From that moment on, Moore's Law will gradually break down, and the effects will be economic as well as technological.
He argues that the computing industries depend on a conveyor belt of new products, each roughly doubling the power of its equivalent from a year or two earlier. Without Moore's Law to propel this rise in computing power, the upgrade culture will grind to a halt and consumer interest will wane.
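The scale of that conveyor belt is easy to underestimate. As a rough, illustrative sketch (the 1971 starting figure of 2,300 transistors for Intel's first microprocessor is a commonly cited baseline; the strict two-year doubling is an idealisation, not a claim about any real chip):

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under an idealised two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Fifty years of strict doubling turns thousands into tens of billions.
for year in (1971, 1991, 2011, 2021):
    print(year, f"{transistors(year):,.0f}")
```

Run over five decades, the projection climbs from a few thousand transistors to tens of billions, which is why the end of the doubling, if Kaku is right, would be felt so sharply.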
"Around 2020 or soon afterward, Moore's law will gradually cease to hold true and Silicon Valley may slowly turn into a rust belt unless a replacement technology is found," says Kaku in an extract published on the Salon.com website.
"Transistors will be so small that quantum theory or atomic physics takes over and electrons leak out of the wires. At that point, according to the laws of physics, the quantum theory takes over," says Kaku, invoking one of science's most feared laws, the Heisenberg uncertainty principle.
His point is stark. Once the most basic unit of computing work - the electron with a measurable behaviour inside a wire - becomes uncertain, as it surely will at these scales, the silicon age is over. Any smaller and science has no way of knowing where an electron is in order to put it to work in a transistor.
Kaku's pronouncements on the limits of Moore's Law are nothing new; the law's lifespan has been argued over almost since Moore first mooted it in the 1960s. In 2005, Moore himself saw problems in applying the exponential to today's computing environment, although Intel executives continue to make optimistic pronouncements in public.
Kaku's thesis is interesting, however, in focusing on the economic consequences of the law's demise and the extent to which high-tech companies, and whole economies, are vulnerable.
He reminds us of how much the world has come to depend on computing power that is now taken utterly for granted. For example, the primitive chip inside a birthday greetings card has more processing power than the Allied armies had at their disposal in 1945.
"Hitler, Churchill, or Roosevelt might have killed to get that chip. But what do we do with it? After the birthday, we throw the card and chip away," he says.
Arguments can be raised against his pessimism, if not his physics. The first is that while the basic unit of computing power might stop advancing due to physical barriers, these units could be deployed in parallel to do more useful work. The world will need to think of ways to deploy this basic unit of power more efficiently, which it tends not to need to do today because of Moore's Law itself. That will buy some time.
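The parallelism argument can be made concrete with a toy sketch: if single-core speed were frozen, throughput on divisible workloads could still scale by spreading the work across cores. A minimal illustration using Python's standard library (the prime-counting task is an arbitrary stand-in for any embarrassingly parallel job):

```python
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    """Deliberately naive primality count up to `limit` (stand-in workload)."""
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(limit) if is_prime(n))

if __name__ == "__main__":
    chunks = [20_000] * 4  # four equal chunks of work
    # Serial: one core grinds through every chunk in turn.
    serial = [count_primes(c) for c in chunks]
    # Parallel: the same chunks spread across however many cores exist.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(count_primes, chunks))
    assert serial == parallel  # identical answers, less wall-clock time
```

The catch, which is the article's point, is that this only helps for work that divides cleanly; it buys time rather than restoring the free exponential ride.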
Further out, there's also the wildcard of quantum computing, a design for performing calculations that seeks to harness the principles that for Kaku are disturbingly close to sinking the computing age for good. If such a vision is to leave the science lab where it has been stuck for some years, it will of course still have to overcome Heisenberg's tricky measurement paradox first.
If commercial quantum computing does come to pass, some believe a much bigger problem than fundamental physics will start to afflict the human obsession with building ever more complex computers, namely: which problems will such powerful devices solve? Quantum computers might be perfectly suited to solving the deepest conundrums of the universe, but perhaps not to driving the 2050s equivalent of an iPod.
This story, "Death of Moore's Law Will Cause Economic Crisis" was originally published by Techworld.com.