As the trend famously codified by Intel co-founder Gordon Moore—that the number of transistors on an integrated circuit would double every two years—seems to be flagging, one top engineer suggests that it is time to rethink chip design to buy performance increases.
Instead of just focusing on reducing chip size and cranking up processor speeds, engineers need to look at making tweaks, or possibly even changing fundamental microprocessor architecture, to ensure chips are faster and cheaper to produce, said Robert Colwell, director of the Microsystems Technology Office at DARPA (Defense Advanced Research Projects Agency), in a talk on Monday at the Hot Chips conference in Stanford, California.
Colwell dismissed arguments that Moore’s Law will continue to hold, and said engineers should give serious thought to the design and economics of chip making.
“I pick 2020 as the earliest ... when we can call it dead. That’s only seven years away,” Colwell said. “I’m thinking 7-nanometers. You can talk me into 2022, you might even be able to talk me into 5-nanometers. But you’re not going to talk me into 1-nanometer ... I think physics dictates against that.”
Moore's Law Clarification
There have been different interpretations of Moore’s Law, with the most common being that the number of transistors on a chip will double every two years, making chips faster. But Colwell tried to clarify the definition, saying that in his 1965 paper Gordon Moore focused more on the economics of chip making: the cost per transistor, which would drop with scaling.
“What it really is ... if you’re going to integrate a lot of components on a chip, there’s an optimal place on that curve where you should do that. You can go beyond the optimal point, but it will cost you in terms of price per component,” Colwell said, adding that there’s a sweet spot where maximum profit can be eked out, assuming sales are proportional to the number of chips made.
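That cost curve can be illustrated with a toy model (the numbers and the simple exponential yield formula below are hypothetical, not from Colwell's talk or Moore's paper): fixed per-die overhead is amortized across more components as integration rises, while yield falls as dies grow more complex, so cost per component bottoms out at some optimal level of integration.

```python
import math

OVERHEAD = 100.0   # fixed cost per die, arbitrary units (assumed)
DEFECTS = 0.001    # per-component failure rate (assumed)

def cost_per_component(n):
    """Cost per component for a die integrating n components."""
    yield_rate = math.exp(-DEFECTS * n)   # simple exponential yield model
    # Amortized overhead per component, inflated by the yield loss.
    return (OVERHEAD / n) / yield_rate

# Sweep integration levels to find the cheapest point on the curve.
optimum = min(range(100, 5000, 100), key=cost_per_component)
print(optimum)  # → 1000: the "sweet spot" on this toy curve
```

Below the optimum, overhead dominates; above it, yield losses dominate, which is the "it will cost you in terms of price per component" effect Colwell describes.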
It is true that beyond a certain geometry it will be hard to make chips smaller, but Colwell said economics, not physics, would ultimately end Moore’s Law. The day chip makers can’t get a return on the billions invested in making chips smaller is the day Moore’s Law will break. Instead of waiting for chip economics to crash, innovation should start now.
Chips now have billions of transistors, and the ability to push clock speeds and performance will reach its limit. After the silicon engine stalls, small tricks and incremental tweaks such as power gating and turbo modes will improve chips for a short time. But chip designers should start early: an effort like changing fundamental chip design could help both before and after Moore’s Law ends.
One approach to consider might be separating the instruction set architecture, microarchitecture, circuits, functional blocks and other parts now integrated onto a chip, and tweaking them for specific applications, Colwell said.
“I think the end of Moore’s Law opens the door to designing special-purpose things again,” Colwell said, adding that in the 1970s, one could make specialized floating point arrays with vector processors. DARPA is doing research in the areas of quantum computing, nanotechnology and distributed computing, he said.
Replacements for silicon
Researchers at universities and chip companies are also looking at new materials to replace silicon, as well as advanced manufacturing technologies. But Colwell said the new technologies are far from practical implementation, and chip makers will have to rely on technologies like CMOS, which has no practical replacement in sight.
“CMOS is really good stuff,” Colwell said. “There are [only] two or three [new technologies] that are promising at all. It’s just hard to beat CMOS.”
Some ways to buy performance in the meantime could be through the use of new materials, photonics, optics and 3-D stacking, in which transistors are placed on top of each other.
Outside of the computer industry, the auto industry will feel the biggest impact of the end of Moore’s Law, Colwell said. Innovations in cars over the last 30 years, such as navigation systems, antilock brakes and guidance systems, have all been driven by semiconductors.
“I think that’s really cool but all of it is based on computers. If we stall out, what are they going to do differently from generation to generation?” Colwell said. “I think they have been living off the electronics for the last 20 to 30 years, and if we don’t continually feed them huge increases, it’s not clear what they will do next.”
Colwell also threw a dart at his former employer, Intel, where he was the chief architect for all Pentium chips.
“Intel is terrible at anticipating. They don’t look down the road and say ‘five years from now the rules will be different, I need to react today, I’m going to put some bets on the table’. There’s some of that, but not a lot. But what they are really good at is reacting,” said Colwell.