Intel Is Dead on the Desktop, Says ARM Co-Founder
Editor's note: The headline of this story has changed. It originally, incorrectly named AMD instead of ARM. We regret the error.
Intel is doomed, Hermann Hauser has claimed in an interview with the Wall Street Journal. If you don't know who Hauser is, he happens to be one of the co-founders of ARM--possibly Intel's most dangerous foe in the semiconductor marketplace, when also-rans like AMD and VIA are removed from the equation.
Because of his background, Hauser isn't the most impartial of sources. However, he does offer a couple of compelling arguments. His first is that ARM licenses chip designs to third parties, which means Intel competes not against ARM itself but against every ARM chip maker in the world. Effectively, Intel is fighting a clone army--perhaps ironically, not entirely unlike the world of desktop computing in the 1980s, which killed off any computer that wasn't a clone of an IBM PC with an Intel processor inside.
Secondly, Hauser sees the history of computing as occurring in waves, starting with the mainframe in the 1950s. Each wave has brought to the surface various companies which then sank without a trace (DEC? Apollo? Sun?). Hauser reckons the age of the desktop computer is now at an end and the next wave will be largely mobile in nature.
Hauser says that "now it's the mobile architecture that is going to be the main computing platform at least on the terminal side," possibly making an oblique reference to cloud computing. Commentators suggest ARM is positioning itself at both ends of the cloud computing space, in portable devices and in the data center alike, so this would make sense.
The truth is perhaps less black and white. As soon as any company gains market dominance, people begin to say that its days are numbered, and the downfall of the Wintel monopoly has been forecast for some time.
Intel has indeed lost significant ground to ARM chips, and Microsoft faces equally annoying competition from the likes of Google's Android, which is climbing onboard practically every computer that isn't a desktop PC or server.
However, the chief problem is one of irrelevance. As time goes on, Microsoft and Intel are starting to matter less. We can blame open source, which has consistently proved that computing architecture is of less importance than we might think: Linux doesn't care whether it's running on PowerPC, x86/x86-64, ARM, or whatever you can find. Android is, of course, based on Linux.
Android is also subtly shifting our understanding of the purpose of an operating system. Android is a means to an end for Google. The better Android is and the more it lets us do, the more of our data Google can potentially get access to. And data is Google's raison d'être. By way of comparison, Windows is an end in itself--a dead end. Microsoft gains little benefit from Windows other than the income from software licenses, which is starting to sound like a very old-fashioned way of thinking in this age of mobile devices and data clouds.
Microsoft and Intel have nothing to worry about in the short term. While Hauser is probably right that we're moving to a world of mobile devices, nothing has come along yet that replaces either the desktop or laptop computer in terms of outright usability. The mobile device du jour is the tablet, which is great for simple and fun tasks, but trying to create a detailed presentation on one can be classed as a form of torture.
The future may lie in hybrid devices, and in particular an old mobile computing favorite: the docking station. I've little doubt that, right now in labs across Silicon Valley, various experimental designs mixing tablets, laptops, and desktop computers are undergoing development. The screen component will be the brains of the unit, and will effectively be a tablet computer that can be detached and carried around. For more in-depth work, users will be able to snap it back into the laptop base unit and utilize a touchpad and keyboard.
That's my vision of the future, anyway. Remember that you heard it here first.