CPUs: The More Cores the Merrier
Within a few short years, AMD Athlon X2 and Intel Core 2 Duo CPUs will feel decidedly quaint, because multicore technology is just getting started. Jerry Bautista, director of technology management at Intel's Microprocessor Technology Lab, says his lab has already built prototype chips with eight cores. "Up to eight works well for productivity applications. But thousands of cores are possible. The trick is finding what's practical," he says. While dual-CPU Xeon systems (with a total of four cores) have been around for a while, this kind of power is finally coming to the masses. Both Intel and AMD plan to introduce quad-core chips: Intel's Kentsfield will arrive by year-end, while AMD's, known as K8L, is slated for mid-2007. And AMD's 4x4 platform, a dual-socket system built around two Athlon 64 FX chips, is due in time for the holidays.
The real limit on multicore technology is software, as programs must be fundamentally redesigned to take advantage of parallel processing on a large scale. Splitting a task across two or even four processor cores is relatively easy, but splitting it into dozens or hundreds of pieces is another matter entirely. Many workloads contain steps that depend on the results of earlier steps, and that serial portion caps the speedup additional cores can deliver, no matter how many you add.
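The contrast shows up in just a few lines of code. Below is a minimal sketch (function names are hypothetical) of a task that splits cleanly across workers, next to one that resists splitting because each step depends on the last. For simplicity it uses Python threads; CPU-bound Python code would need a process pool to actually occupy multiple cores.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    # Each worker sums its own slice independently of the others:
    # the "embarrassingly parallel" case.
    return sum(chunk)

def parallel_sum(data, workers=4):
    # Split the input into roughly equal chunks, one per worker,
    # then combine the partial results.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

def running_total(data):
    # Each step needs the previous step's result, so this loop
    # cannot be carved into independent pieces, regardless of
    # how many cores are available.
    totals, acc = [], 0
    for x in data:
        acc += x
        totals.append(acc)
    return totals
```

The first function gets faster as workers are added; the second is stuck at the speed of one core.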
Still, even with eight cores, expect to see dramatic performance improvements in complex programs, from games to search technologies. Simon Hayhurst, Adobe's director of product management for digital video and audio, says that most of Adobe's video applications already have elements that can make use of many cores, because previous work optimizing programs for hyperthreaded CPUs also works on multicore CPUs. Says Hayhurst, "The beauty of this approach is that we can write one piece of code that is hyperthreaded, which will scale up or down to multiple cores. We can soak up many more cores than are available today."
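Hayhurst's point, that one code path can scale up or down with the hardware, can be sketched as a worker pool sized to whatever the machine reports. The frame-rendering function here is a hypothetical stand-in, not Adobe's actual code.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame_number):
    # Hypothetical stand-in for per-frame video work.
    return frame_number * frame_number

def render_all(frames):
    # Size the pool to whatever the machine offers: the same code
    # runs on a hyperthreaded single core or an eight-core chip,
    # soaking up more cores as they become available.
    workers = os.cpu_count() or 1
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_frame, frames))
```

Nothing in the calling code changes when the core count does; only the pool size queried at run time.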
The great leap in simultaneous processing capability is also likely to improve artificial intelligence. According to Intel's Bautista, "A video game's AI will be indistinguishable from what a person would do": computer-controlled opponents will take cover and track the player organically rather than following an established script. He adds that such intelligence will extend to other applications, as well: "You will be able to search through thousands of photos and videos for people, certain backgrounds, or even specific facial expressions," he says.
Of course, such performance advances will have to be achieved within realistic parameters. Intel's single-core CPUs experienced increasingly serious power-consumption and heat problems, speeding the demise of that architecture. Smaller, more efficient cores will continue to provide a better overall power profile than a single megachip. That's certainly positive news for notebooks, which have historically lagged behind desktop machines in performance due to heat and power constraints.
AMD chief technical officer Phil Hester notes that mobility will be a major driver for the company over the next several years, and that the company's acquisition of graphics purveyor ATI will be key to this strategy. "In the 1980s, the 286 and 386 had separate math coprocessors. Eventually that was integrated into the CPU. The same thing will happen to 3D graphics...in the post-Vista time frame," he says. According to Hester, we can also expect power management to improve to the point where, someday, a device the size of a PDA will be capable of delivering a PC-caliber graphics experience.
The major stumbling blocks to more powerful computers, says Bautista, are elsewhere on the motherboard: Memory bandwidth must grow dramatically to keep up with the CPU, and even hard-disk input/output will have to handle faster data transfers. If you want more-realistic online gaming, "even your broadband connection may need to scale," says Bautista.
And what of Moore's Law, the observation that the number of transistors on a chip doubles roughly every two years (often quoted as 18 months, with processing power keeping pace)? "It's still alive and well," says Bautista. He also thinks that parallel processing, which splits a workload among many cores, makes it more likely to continue. "Multiple, smaller cores are easier to build, and there's no end in sight as the manufacturing process continues to shrink. All the stars are aligned right now," he says.
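The doubling rule behind Moore's Law is simple compound growth, sketched here with the doubling period left as a parameter, since it is variously quoted as 18 months or two years.

```python
def projected_transistors(start_count, years, doubling_period_years=2.0):
    # Doubling every `doubling_period_years` compounds exponentially:
    # after `years` years, the count has doubled (years / period) times.
    return start_count * 2 ** (years / doubling_period_years)
```

Under the two-year figure, a chip shipping with 300 million transistors would be projected to carry about 2.4 billion six years later.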