Nvidia GeForce RTX 2080 and RTX 2080 Ti review: Changing the game

These graphics cards are built for the future. What does that mean today?

Page 9 of 10

Turing tomorrow: Ray tracing and DLSS performance

Now for the tricky part. The very reason the GeForce RTX 2080 and RTX 2080 Ti exist is for a new wave of games—ones powered by ray tracing and augmented by machine learning. A huge amount of the Turing GPU’s die is dedicated to the tensor and RT cores required for those newfangled technologies. But even though Nvidia has more than 25 games lined up to support ray tracing or Deep Learning Super Sampling, few to none of those will be available when the RTX 20-series cards launch.

The first ray traced games won’t be available until about a month later, when the Windows 10 October 2018 update rolls out with support enabled for Microsoft’s DirectX Raytracing API (which underpins Nvidia’s RTX ray tracing endeavors). Deep Learning Super Sampling leverages the Turing GPU’s tensor cores and doesn’t need the DirectX Raytracing API, but there’s still no word on when games will roll out with DLSS support. Nvidia didn’t respond when we asked. Fortunately, it sent over some technology demos to let us see the new features in action firsthand, though we couldn’t take video of the experiences.

First up: The stunning Star Wars Reflections real-time ray tracing demo, built in Unreal Engine, which debuted at the RTX 20-series reveal. Yes, we ran this demo directly on our own machine. Here’s the trailer:

It proved just as impressive in reality at 1440p and 4K resolution, though not quite as seamless. The demo was created to run at a cinematic 24 frames per second. You could unlock the frame rate, but it hovered around that mark on both the GeForce RTX 2080 and RTX 2080 Ti. What did change was the visual quality.

While the video above looks smooth and sharp, running it on my local rig with the RTX cards exhibited some “graininess,” the visual noise associated with real-time ray traced graphics. This Quake 2 video shows an extreme version of the effect. It’s nowhere near as intense in the Star Wars demo—not even close—but it is there. The graininess almost looks like a film grain effect, but it’s more pronounced on the GeForce RTX 2080, which has roughly a third fewer RT and tensor cores than the RTX 2080 Ti. (Turing uses RT cores to calculate ray tracing paths, then uses the machine learning-focused tensor cores to de-noise the resulting visuals.)

The noisiness is most noticeable in the white Stormtrooper armor. It makes the edges of that armor appear jagged, as if it hasn’t had proper anti-aliasing applied, especially where the armor sits against a dark background. The jagged-edge effect mostly went away on the GeForce RTX 2080 Ti, and the demo looks beautiful overall despite these criticisms.

You know what doesn’t look good, though? Running the Star Wars demo on the GTX 1080 Ti, which lacks the dedicated ray tracing hardware of the RTX 20-series. It does still run—the DirectX Raytracing API has a fallback path for all DirectX 12-capable GPUs—but horribly. Horribly. It hit between 7 and 13 frames per second, depending on the scene, with extreme graininess and jagged edges. Watching it legit made me nauseous. If you want to run ray traced games right now, you need a GeForce RTX 2080 or 2080 Ti.

The Infiltrator demo. (Image: Nvidia)

Nvidia also sent over a pair of demos showing off the potential of its Deep Learning Super Sampling technology. We covered the technology in our Turing GPU deep-dive, but to keep it short, DLSS runs a neural network trained on Nvidia’s Saturn V supercomputer on the Turing GPU’s tensor cores to apply an anti-aliasing effect similar to (or better than, with DLSS 2x) temporal anti-aliasing, but at a significantly lower performance hit.

How much lower? Nvidia sent along DLSS-equipped 4K versions of Unreal’s Infiltrator demo and the Final Fantasy XV benchmark (which will be made available to the public on September 20). You can run them with either TAA or DLSS active, although DLSS works only with RTX graphics cards imbued with tensor cores. Cue the FRAPS benchmarking tool!

TAA vs. DLSS performance results. (Chart: Brad Chacos/IDG)

Whoa. That’s a massive 39 percent performance uptick across the board with no loss in visual quality, and a much better experience overall. The Infiltrator demo chugs and stutters with TAA enabled when the hero’s under fire by a chaingun toward the end, for example, and again when he loads the rocket launcher attachment onto his gun. With DLSS enabled, those scenes become perfectly smooth. That universal 39 percent performance uptick is even more impressive when you consider that the RTX 2080 crams in roughly a third fewer tensor cores than the RTX 2080 Ti.
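For anyone who wants to reproduce this math against their own benchmark runs, the uplift figure is just the DLSS frame rate measured against the TAA baseline. The fps values below are made up for illustration; they are not our benchmark numbers:

```python
def pct_uplift(baseline_fps, new_fps):
    """Percentage performance gain of new_fps over baseline_fps."""
    return (new_fps / baseline_fps - 1) * 100

# Hypothetical example: a demo averaging 38.6 fps with TAA
# and 53.7 fps with DLSS works out to roughly a 39 percent uplift.
print(round(pct_uplift(38.6, 53.7), 1))  # → 39.1
```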

It’s seriously impressive stuff, and I’m very excited to see if this sort of performance improvement proves sustainable outside of canned demos. DLSS is coming to a slew of games, including heavy hitters like Ark: Survival Evolved, PlayerUnknown’s Battlegrounds, Hitman 2, Shadow of the Tomb Raider, and Hellblade: Senua’s Sacrifice. The only question is when.

Next page: Should you buy GeForce RTX?
