Windows 10's DirectX 12 graphics performance tested: More CPU cores, more oomph

The latest DirectX 12 test shows cores and clock speed matter more than Hyper-Threading, and it also caused a minor dustup with Nvidia.


More cores are better, but clock speed helps too

In many ways, the Ashes of the Singularity benchmark validates my tests of five months ago: With DirectX 12, the more CPU cores, the better. But unlike that early March preview of DX12, clock speed also seemed to matter this time around. Going from four cores at 1.7GHz to four cores at 3.9GHz gives you a nice bump from 36 fps to 51 fps.

That’s a roughly 40 percent jump in frame rate from clock speed alone, which is very significant.

Contrast that with what happened in my earlier 3DMark DirectX 12 feature test. I simulated a Pentium G3258 with two cores and no Hyper-Threading running overclocked at 4.9GHz. The result in 3DMark was only slightly faster than when I simulated a 3.5GHz Core i3-4330 with two cores and Hyper-Threading turned on. In the synthetic 3DMark, Hyper-Threading made big contributions to performance. You can read the original story or just peep this chart of my 3DMark testing.

Earlier tests of DirectX 12 performance using 3DMark’s feature test favored more CPU cores and Hyper-Threading over high clock speeds.
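
As an aside for tinkerers: you can approximate the core-count side of this kind of CPU simulation on your own machine by restricting a process to a subset of logical processors (clock-speed changes still require the BIOS or overclocking tools). Below is a minimal, hypothetical C++ sketch using the Windows affinity API; the 0x3 mask is an assumption that logical processors 0 and 1 are the two you want.

```cpp
// affinity_sketch.cpp -- restrict the current process to two logical CPUs.
// Illustrative only: which logical processors map to which physical cores
// varies by system, so check your topology before trusting the results.
#include <windows.h>
#include <cstdio>

int main() {
    // Bits 0 and 1 select logical processors 0 and 1 (mask 0x3).
    DWORD_PTR mask = 0x3;
    if (!SetProcessAffinityMask(GetCurrentProcess(), mask)) {
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("Process pinned to logical CPUs 0 and 1.\n");
    // Any workload run from here on is scheduled only on those two CPUs,
    // loosely imitating a dual-core chip.
    return 0;
}
```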

Good news for AMD

In the Ashes of the Singularity benchmark, clock speeds have far more impact than they did in 3DMark's feature test, while Hyper-Threading has a minimal performance impact. Oxide's developers told me they suspect the reason Hyper-Threading didn't knock it out of the ballpark in their new game engine is the shared L1 cache design of Intel's CPUs.
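
That shared-cache theory is easy to picture: two hyper-threads on one physical core split that core's L1 data cache, so two cache-hungry threads can end up evicting each other's data instead of doubling throughput. The hypothetical C++ micro-benchmark below sketches the effect; it assumes logical processors 0 and 1 are siblings on one core while 0 and 2 sit on different cores, which is a common Intel layout but worth verifying on your own machine.

```cpp
// ht_contention.cpp -- rough illustration of two threads competing for
// one core's L1 cache when pinned to sibling hyper-threads.
#include <windows.h>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Each thread repeatedly walks its own 24KB buffer. Two such working
// sets don't fit together in a typical 32KB L1 data cache, so sibling
// hyper-threads keep evicting each other's lines.
static void walk(std::vector<char>& buf, int iters) {
    volatile char sink = 0;
    for (int i = 0; i < iters; ++i)
        for (size_t j = 0; j < buf.size(); j += 64)  // 64-byte cache lines
            sink += buf[j];
}

static double timed_run(DWORD_PTR cpu_a, DWORD_PTR cpu_b) {
    std::vector<char> a(24 * 1024), b(24 * 1024);
    auto start = std::chrono::steady_clock::now();
    std::thread ta([&] {
        SetThreadAffinityMask(GetCurrentThread(), DWORD_PTR(1) << cpu_a);
        walk(a, 20000);
    });
    std::thread tb([&] {
        SetThreadAffinityMask(GetCurrentThread(), DWORD_PTR(1) << cpu_b);
        walk(b, 20000);
    });
    ta.join(); tb.join();
    return std::chrono::duration<double>(
        std::chrono::steady_clock::now() - start).count();
}

int main() {
    // Assumption: CPUs 0/1 are hyper-thread siblings; 0/2 are separate cores.
    std::printf("same core:      %.2fs\n", timed_run(0, 1));
    std::printf("separate cores: %.2fs\n", timed_run(0, 2));
    return 0;
}
```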

With Hyper-Threading a yawner and high clock speeds a big bonus, AMD's budget-priced chips are pretty much set up as the dark horse CPUs for DirectX 12, if this single benchmark test is indicative of what we can expect to see from DirectX 12 overall, of course.

In fact, Oxide’s developers said their internal testing showed AMD’s APUs and CPUs having an edge since they give you more cores than Intel for the money. AMD’s design also doesn’t share L1 cache the way Intel’s chips do. 

It’s no secret that AMD gives you more cores per buck, but that hasn’t mattered in the past against Intel’s more efficient CPUs.

AMD gives you more cores for your money

The numbers really add up when you factor in AMD's cost per core. An AMD FX-8350 gives you eight cores (with some shared resources) for $165. That won't even buy you a quad-core from Intel: The cheapest quad-core Intel sells is the 3.2GHz Core i5-4460 for $180, and that Haswell chip doesn't even have Hyper-Threading turned on, nor can it be overclocked.
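
Do the division and the gap is stark: $165 spread across eight cores works out to roughly $21 per core for the FX-8350, while $180 across four cores is $45 per core for the Core i5-4460, more than twice as much.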

Oxide developers told me their internal testing with the Ashes of the Singularity benchmark showed 8-core AMD CPUs giving even the high-end Core i7-4770K a tough time. 

But don’t take this to mean AMD’s suddenly in the pole position. When I asked Oxide and Stardock officials what the ultimate CPU for Ashes of the Singularity is, the choice was Intel’s 8-core monster, the Core i7-5960X.

Still, this is finally some good news for AMD’s CPU division, which all but the most die-hard fanboys would agree has been in third place against Intel’s CPUs for years now. Intel’s CPUs, to be frank, have been so good that they compete more with each other than with AMD's counterparts. AMD’s CPUs have failed to trounce their Intel equivalents even when they outnumber them in cores.

As a consumer, you won’t be able to use Ashes of the Singularity to test DX12 performance until the game is released sometime next year—but there is one option if you want to try it sooner. Oxide and Stardock say those who want to play with the test early can buy the $50 Founder's Edition of the game, which will grant early access to the benchmark in a week or so.

But controversy!

All of this seems like a solid first step toward a DirectX 12 test using a real game. Oxide officials said the reason they let the media demo the test first was to celebrate multi-core CPU gaming, which is finally supported in the new Windows 10-exclusive API.
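
For the technically curious, the multi-core gaming Oxide is celebrating refers to how DirectX 12 structures work submission: where DX11 funneled most rendering through a single immediate context, DX12 lets every CPU thread record its own command list, which the engine then submits in one batch. The skeletal C++ sketch below shows the shape of that pattern; it's illustrative only, with device and queue creation, synchronization, and all error handling omitted.

```cpp
// dx12_parallel_record.cpp (fragment) -- one command list per thread.
// Hypothetical sketch: assumes 'device' and 'queue' were created earlier.
// Link with d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordAndSubmit(ID3D12Device* device, ID3D12CommandQueue* queue,
                     unsigned threadCount) {
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(threadCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(threadCount);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < threadCount; ++i) {
        // Each thread needs its own allocator and command list; neither
        // is free-threaded, so nothing here is shared between workers.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&, i] {
            // Record this thread's slice of the frame's draw calls here.
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submit every recorded list in a single batch on the main thread.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```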

Nvidia officials didn’t seem to think the test was all that, though, and in an unexpected move pretty much trashed it as a measurement tool.

“We do not believe [Ashes of the Singularity] is a good indicator of overall DirectX 12 gaming performance,” the company said in its guidance on using the new test as a benchmark.

Nvidia said the pre-beta benchmark has a bug that affects MSAA performance, which makes it unreliable. To put an even finer point on it, company spokesman Brian Burke told PCWorld: “We believe there will be better examples of true DirectX 12 performance.”

Translated for gamers: That’s pretty much a “shots fired!” moment.

Nvidia said a bug in the MSAA support in the new benchmark invalidates it for graphics testing.

Nvidia went on to say that the lead it has held over AMD’s Radeon graphics drivers in DirectX 11 should carry over to DirectX 12.

“Gamers and press have seen GeForce DX11 drivers are vastly superior to Radeon’s. We’ve worked closely with Microsoft for years on DirectX 12 and have powered every major DirectX 12 public demo they have shown,” Nvidia said. “We have the utmost confidence in our DX12 drivers and our architecture’s ability to perform in DX12. When DX12 games arrive, the story will be the same as it was for DX11.”

Oxide officials soon fired back in a blog post, saying the alleged bug in MSAA is no bug at all, and that Ashes of the Singularity is a perfectly valid DirectX 12 benchmark.

“We assure everyone that is absolutely not the case,” said Oxide’s Dan Baker in the post. “Our code has been reviewed by Nvidia, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs have had access to our source code for over a year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months. Fundamentally, the MSAA path is essentially unchanged in DX11 and DX12. Any statement which says there is a bug in the application should be disregarded as inaccurate information.”
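
The “D3D12 validation system” Baker refers to is presumably Microsoft’s debug layer, which intercepts API calls and flags incorrect usage before they reach the driver. Enabling it is nearly a one-liner, done before device creation; a minimal sketch:

```cpp
// validation_sketch.cpp -- turn on the D3D12 debug layer (link d3d12.lib).
// Must be called before the device is created for validation to apply.
#include <d3d12.h>
#include <wrl/client.h>

bool EnableD3D12Validation() {
    Microsoft::WRL::ComPtr<ID3D12Debug> debug;
    // Fails gracefully if the SDK debug layers aren't installed.
    if (FAILED(D3D12GetDebugInterface(IID_PPV_ARGS(&debug))))
        return false;
    debug->EnableDebugLayer();  // incorrect API usage now gets flagged
    return true;
}
```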

What’s this all about?

If you can’t read between the lines, let me do it for you: Nvidia just launched preemptive missiles to let anyone who sees tests of a Radeon outperforming a GeForce card by even a little know it’s the test that’s busted, not their drivers. With DirectX 12 as the new undiscovered country for gamers, the company doesn’t want to get off on the wrong foot with the notion that AMD has better drivers.

For its part, Oxide said there’s no reason for anyone, hardware vendors or gamers alike, to freak out. DirectX 12 is a new API and everything is in flux.

“Immature drivers are nothing to be concerned about. This is the simple fact that DirectX 12 is brand-new and it will take time for developers and graphics vendors to optimize their use of it. We remember the first days of DX11,” Baker wrote in the blog. “Nothing worked, it was slower than DX9, buggy and so forth. It took years for it to be solidly better than previous technology. DirectX 12, by contrast, is in far better shape than DX11 was at launch. Regardless of the hardware, DirectX 12 is a big win for PC gamers.”

I’d like to point out that Baker is right in some regard. DirectX 12 is a reset for all hardware parties, and it’s going to take at least a few months, if not years, to determine what hardware will be best for Ashes of the Singularity and other DirectX 12 games going forward.
