Nvidia clarifies: No specific GTX 970 driver to improve memory allocation performance planned
Important update 1/29/15: The Nvidia employee who said the company was looking into a GTX 970 driver that would "tune what's allocated where to further improve performance" has updated his post to remove the claim after it was covered by several publications, including PCWorld, PC Gamer, and PC Perspective. As that changes the entire thrust of this article, its headline has been updated to reflect the new information. We still stand by our recommendation of the GTX 970, and you can read a summary of the memory allocation firestorm here.
The Nvidia GeForce account and Nvidia employees have clarified the situation on Twitter, as well.
@DeadSanto123 We are always improving performance through drivers but there are no plans for an update specifically for the GTX 970.— NVIDIA GeForce (@NVIDIAGeForce) January 29, 2015
@Mauledbyajer Miscommunication on our part. Any improvements we make in our drivers are designed to help all GeForce cards.— Manuel Guzman (@ManuelGuzman) January 30, 2015
This post originally ran at 2:30 PM Eastern on 1/28/15 with the headline "Nvidia plans GeForce GTX 970 driver update for memory performance concerns." The original post follows in its entirety.
There’s trouble a-swirling in graphics land. To make a long story short, Nvidia was recently forced to admit that the way the GeForce GTX 970 handles its 4GB memory allocation is… unorthodox, to say the least. The GPU actually taps into two separate memory pools: a primary, full-speed segment of 3.5GB, and a secondary, far slower 512MB segment. In cases where games need more than 3.5GB of video memory, some users saw stuttering and frame rate wonkiness as the GPU accessed the slower 512MB segment.
The vast majority of users are unlikely ever to bump into the issue, as it should be a problem limited largely to situations where you’re gaming at extremely high resolutions and/or with anti-aliasing settings cranked. But an Nvidia representative says the company is working to minimize the issue regardless.
Writing in the GeForce forums, an Nvidia employee and moderator going by “PeterS” said the following (emphasis mine):
“It sucks because we're really proud of this thing. The GTX970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.”
PeterS provided more detail in a follow-up comment:
“Actually I'm not sure as that's not a simple issue with just one cause. Card memory is not just used for the frame buffer, plenty of driver stuff gets loaded into it as well. We're looking at sticking as much of that stuff as possible into the 0.5GB space to leave the rest available.”
Essentially, Nvidia’s trying to shove all the background crap into the secondary 512MB segment in order to leave as much free space for actual games in the main 3.5GB space. In the meantime, PeterS has offered to help try to get refunds or exchange credits for deeply disgruntled GTX 970 owners, though he cautions that the offer basically means he’ll talk to your graphics card manufacturer on your behalf if they’re giving you a rough time about a return related to memory allocation concerns.
I wouldn’t recommend most gamers jump the gun on that, however. Nvidia messed up here, but the GTX 970 is still a beastly card that offers tremendous bang for your buck, and gaming at 4K resolution—where frame buffer memory issues would be a much larger concern—wouldn’t be very feasible with a single GTX 970 anyway.
Simply put, average gamers aren't likely to push games past 3.5GB of memory usage unless they're doing really unusual things. People who purchased two or more GTX 970s for a high-resolution SLI setup might want to weigh their options, however. (Guru3D’s initial testing of the issue suggests the actual performance drop when the GTX 970 utilizes more than 3.5GB of memory is minimal.)
Speaking of options, AMD representatives were quick to try to tempt unhappy Nvidia owners over to Team Radeon.