ATI Radeon HD 2900 XT
At a Glance
ATI Radeon HD 2900 XT
ATI's first DirectX 10 graphics card.
AMD's $449 ATI Radeon HD 2900 XT resides in a product no-man's-land of sorts. Though the card breaks all kinds of new ground for AMD/ATI, on today's games it's merely equal to (and in some cases slower than) cards using the 320MB version of nVidia's GeForce 8800 GTS, a less-expensive board that nVidia launched a few months ago.
The Radeon HD 2900 XT is the flagship of a new line of ATI boards with top-to-bottom DirectX 10 support, and it's packed with impressive technology, including a 512-bit memory interface; 320 unified stream processors, which can handle any type of shader you throw at them; and high-quality video processing with full encode and decode support for high-definition video.
But the benchmark story isn't particularly compelling: Almost any member of the current generation of graphics boards will handle today's games just fine. Unless you're looking for full antialiasing at 60 frames per second on your 1920-by-1200, 23-inch LCD, you'll be fine with even a midrange ATI or nVidia graphics board. As a result, high-end graphics boards are tough to differentiate.
The Radeon HD 2900 XT lagged just a bit behind the GeForce 8800 GTS cards we've tested. When we bumped Half-Life 2 up to 1600 by 1200 resolution with antialiasing enabled, one significant difference became apparent: The Radeon achieved just 91 frames per second, compared with 116 fps for a comparably priced GeForce 8800 GTS board and 124 fps for a top-of-the-line GeForce 8800 GTX model.
Results in Battlefield 2, Quake 4, and Far Cry showed the HD 2900 XT and the 8800 GTS running neck and neck. In Splinter Cell: Chaos Theory, however, the HD 2900 XT managed to pull ahead a bit, running the game at 51 fps at 1600 by 1200 resolution with antialiasing, versus 45 fps for the GeForce 8800 GTS board.
The real battleground for these new boards will take shape later this year, when the first DX10 games start to appear. That's when we'll see who has really done a better job of building a unified shader architecture that's ready for the future of 3D gaming.