NVidia Promises 'Cinematic Computing'
The promise of cinematic computing is coming closer to reality with the announcement of new graphics chips from industry leaders NVidia and ATI Technologies.
NVidia's GeForce 6800 chips were formally unveiled Tuesday on San Francisco's Nob Hill at a "Graphics to Drench Your Senses"-themed party. The daylong event featured flashy demos of upcoming games and graphics effects that leverage the new processor's capabilities, including Microsoft's DirectX 9.0 Shader Model 3.0 feature set.
The new processors represent "the biggest graphics generational gap we've ever done," says Dan Vivoli, NVidia's executive vice president of marketing.
"We're only seeing the tip of the iceberg in what these programmable GPUs [graphics processing units] can do," says Jen-Hsun Huang, NVidia's president and chief executive officer.
ATI is expected to release rival chips of its own later in April.
The DirectX 9.0 technology is a collection of APIs (application programming interfaces) that let games and other multimedia programs take advantage of the full capability of advanced graphics chips. Shader Model 3.0 will allow graphics designers to realistically portray human characteristics such as flesh tones to create some of the most compelling video games and movies yet, says Jon Peddie, president of Jon Peddie Research in Tiburon, California.
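The per-pixel programs Peddie alludes to boil down to a small function that runs once for every pixel on screen, turning surface and lighting inputs into an output color. The sketch below illustrates the simplest such computation, Lambertian diffuse shading, in plain Python; it is not DirectX or Shader Model 3.0 code, and the function names are illustrative.

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def shade_pixel(normal, light_dir, base_color):
    """Lambertian diffuse shading: the kind of per-pixel arithmetic
    a pixel shader evaluates for every pixel in a frame."""
    n = normalize(normal)
    l = normalize(light_dir)
    # Brightness is the cosine of the angle between the surface normal
    # and the light direction, clamped so back-facing light adds nothing.
    ndotl = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(c * ndotl for c in base_color)

# A surface facing the light is fully lit; one edge-on to it goes dark.
print(shade_pixel((0, 0, 1), (0, 0, 1), (1.0, 0.5, 0.25)))  # → (1.0, 0.5, 0.25)
print(shade_pixel((1, 0, 0), (0, 0, 1), (1.0, 0.5, 0.25)))  # → (0.0, 0.0, 0.0)
```

Longer, more flexible versions of exactly this kind of routine — layered with flow control and texture lookups — are what Shader Model 3.0 lets designers run in hardware.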
"If the graphics are good enough, you don't see the graphics. You get directly involved in the story," Peddie says.
The new chips are a stepping stone on the path to cinematic computing: the ability to create sophisticated images on an everyday PC. Today, the makers of movies such as Finding Nemo require expensive, high-performance graphics workstations to create their images.
NVidia's launch event included a before-and-after demonstration by Epic Games Founder and President Tim Sweeney. He used the company's Unreal game engine to showcase some of the surface and shadow detail enhancements the GeForce 6800 enables. Shadows looked more subtle and realistic, radiating naturally from (and on) game characters, and surface features had a noticeably more realistic texture.
Electronic Arts' Mark Skaggs played a clip of real-time renderings from EA's latest development, Lord of the Rings: Battle for Middle Earth. The game showed an impressively cinematic level of detail, with enough action on screen to rival a movie.
A number of other game developers also contributed to the launch event, and all agreed that the GeForce 6800 allows them to offer significantly more dramatic action in their games. Cevat Yerli, CEO and president of Crytek, showed some enhancements programmed for the current version of his company's Far Cry. He noted that the 6800's enhanced graphics power allows game "mods" to be developed quickly. That means current versions of games can be enhanced easily, without waiting for major version revisions.
These kinds of developments will help mitigate a major complaint of gamers: It has historically taken far too long for software to take advantage of advances in hardware.
NVidia says it added GDDR3 (graphics double data rate 3) memory and additional floating-point engines to the new generation of graphics chips to reach a much higher level of performance than prior generations could handle. The company indicates that its superscalar architecture and second shader unit allow it to double pixel operations per cycle, and says the new engine also allows greater programming flexibility. NVidia also added an on-chip video processor as a dedicated engine for encoding and decoding MPEG and HD video.
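The claim of doubled pixel operations per cycle is a throughput statement: with two shader units issuing in parallel, the same workload retires in half the cycles. A toy model of that arithmetic, using illustrative numbers of our own rather than NVidia's:

```python
def cycles_needed(pixel_ops, ops_per_cycle):
    """Cycles required to retire a workload at a given issue rate
    (ceiling division, since a partial cycle still costs a cycle)."""
    return -(-pixel_ops // ops_per_cycle)

workload = 1_000_000  # pixel operations in a frame (illustrative figure)
print(cycles_needed(workload, 1))  # one op per cycle  → 1000000
print(cycles_needed(workload, 2))  # two ops per cycle → 500000
```

Doubling the issue rate halves the cycle count, which is where the generational performance claim comes from.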
"NVidia has vastly improved performance over the GeForce FX series with the GeForce 6 series GPU," writes Brent Justice, an editor with the hardware enthusiast Web site HardOCP, in a review of the technology posted on that site.
The new generation of graphics processors is powerful enough to serve in supercomputers, Peddie says. Some universities and laboratories already use older generations of graphics chips in supercomputing clusters because of their superior number-crunching ability, he says.
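The number-crunching advantage Peddie describes comes from data parallelism: a graphics chip applies the same arithmetic to thousands of values at once. A staple example from scientific computing is the SAXPY operation (y = a·x + y), sketched below in plain serial Python; on a GPU, every element of the result would be computed simultaneously. This illustrates the pattern only, and is not code for any NVidia or ATI product.

```python
def saxpy(a, x, y):
    """SAXPY (y = a*x + y), a basic building block of scientific computing.
    Each output element depends only on its own pair of inputs, so all
    elements can be computed in parallel; here we loop serially."""
    return [a * xi + yi for xi, yi in zip(x, y)]

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))  # → [12.0, 24.0, 36.0]
```

It is this independence between output elements that lets clusters built from graphics chips outpace conventional processors on such workloads.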
Most users who don't play PC games have no need for that level of performance, but the PC gaming community is willing to pay top dollar for advances in graphics technology that enhance the gaming experience. Gamers also tend to influence the purchasing decisions of less tech-savvy friends and family members, so companies like NVidia and ATI compete ferociously for the top spot in the graphics processor market.
NVidia and ATI also compete to have their technology included in the console gaming market. NVidia had been the supplier for Microsoft's Xbox console, but ATI recently won a contract to supply the chips for the next generation of that platform. The two companies are jousting to see whose chip will wind up in Sony's PlayStation 3.
Most of the graphics chips will be used in graphics cards for PCs from mainstream companies like Hewlett-Packard and Gateway as well as from PC companies that cater to gamers such as Falcon Northwest Computer Systems and Voodoo Computers, NVidia says.
The cards will also be available for purchase separately, and they are expected to be available in 45 days, NVidia says.
Tom Krazit of the IDG News Service, Yardena Arar of PC World, and Rex Farrance of PC World contributed to this report.