
We tested Nvidia’s RTX 2080 and 2080 Ti. Are they a worthy upgrade?

A new line of graphics cards comes along only every couple of years. But cards as ambitious as Nvidia’s RTX 2080 and 2080 Ti? Those are rare birds indeed.

After an entire year of anticipation, we finally have these two new, powerful GPUs loaded into our systems. But the first thing we wanted to test wasn’t the fancy ray-tracing abilities or AI-powered anti-aliasing. No, no. Here’s the question we sought to answer: Do they actually deliver a substantial improvement in performance, worth their high price tag? The answer may surprise you. And disappoint you.


On your marks

For all of our initial testing, we wanted to use the same system we test every other graphics card on — and that’s our monster 12-core Threadripper 1920X system, which includes 32GB of RAM and an Asus 4K gaming monitor. We popped in the GeForce RTX 2080 and 2080 Ti, both Founders Edition straight from Nvidia, and got right to benchmarking.

We started our tests with 3DMark, as always, to get an apples-to-apples sense of how these graphics cards compare with others. The immediate results? Well, we can say with certainty that these new cards are faster than the previous generation, and the RTX 2080 Ti is definitely the most powerful graphics card ever made. With a score of 20,210, it posted the highest result for a single GPU we’ve ever recorded.

[infogram-responsive id="722592e1-b8f4-4ba3-a2e9-9b9a656e72c5" title="Nvidia RTX 2080 and 2080 ti 3DMark"]

But as the numbers poured in, we started to see what would be confirmed through game tests later: This just isn’t the performance spike Nvidia boasted of. 

If we take a look at the performance jump from the GTX 980 to the GTX 1080, it’s an impressive 33 percent increase in Fire Strike. Going from the 1080 to the 2080 we see only an 11 percent increase, which places it behind the 1080 Ti in terms of ranking. With the significant increase in CUDA cores, faster GDDR6 memory, redesigned cooling solution, increased price, and Nvidia’s hype machine at full tilt, we expected a bit more of a jump in performance over its predecessor.

That said, the CPU you pair with your RTX card can make a big difference. When we tested the 2080 with an Intel Core i9-9900K and an AMD Threadripper 1920X, we noted an eight percent improvement in 3DMark Fire Strike using the Intel CPU. If you’re going to spring for one of these new cards, it may make sense to upgrade your processor too, to make sure you’re getting the most from it.

Fortunately, the 2080 Ti fares a bit better. With a 23 percent increase over the GTX 1080 Ti in 3DMark, that’s closer to what we’d expect in a standard generational jump. Unfortunately, you’ll have to pay up for that increase. The 2080 Ti carries a $400 premium at $1,200 over the 2080’s $800 price tag, and that’s $500 more at launch than the 1080 Ti cost when it was introduced. 
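The generational-uplift percentages quoted throughout are simple ratios of benchmark scores. Here’s a minimal sketch of the math; the GTX 1080 Ti baseline below is an assumed figure chosen only to illustrate the calculation, not a score from our testing:

```python
def uplift(new_score: float, old_score: float) -> float:
    """Percentage improvement of a new benchmark score over an old one."""
    return (new_score - old_score) / old_score * 100

# The RTX 2080 Ti's recorded 3DMark score from this review:
rtx_2080_ti = 20210
# Hypothetical GTX 1080 Ti baseline, assumed for illustration:
gtx_1080_ti = 16430

print(f"{uplift(rtx_2080_ti, gtx_1080_ti):.0f}%")  # prints "23%"
```

The same formula explains the earlier figures: a 33 percent jump from the 980 to the 1080, but only 11 percent from the 1080 to the 2080.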

In-game testing

Our standard suite of test games includes Civilization VI, Battlefield 1, Deus Ex: Mankind Divided — and Fortnite, for kicks. 4K is still the holy grail when it comes to gaming, but we started our testing at 1440p to see how the 2080 and 2080 Ti could handle lower resolutions. If you’re playing games at 1080p or lower, there’s no need to drop this much cash on a graphics card. A GTX 1060 or 1070 should do you just fine.

The framerates we got lined up well with the precedent set by the 3DMark scores. In terms of game performance, the RTX 2080 comes in above the GTX 1080 — and just behind the 1080 Ti. In 1440p, games like Battlefield 1 look beautiful and play smoothly on the RTX 2080, and it will even take full advantage of your high-end 144Hz panel. Same story for Fortnite, where we averaged 142 frames per second. That’s around a 20-25 percent increase over framerates delivered by the GTX 1080, but a bit behind what the GTX 1080 Ti can pump out. It’s not the massive leap forward we’d hoped for, but it’s on par for a new generation of GPUs. 

[infogram-responsive id="76748f16-ae31-47c3-8e8f-e83ad15bf9d9" title="Nvidia RTX 2080 and 2080ti gaming performance"]

When jumping up to 4K, we are happy to report the RTX 2080 can handle almost every game at this resolution with settings maxed. Civilization VI, Battlefield 1, and Fortnite easily cleared the 60 FPS hurdle, and that should be indicative of most modern games you currently play. Deus Ex: Mankind Divided presented a bit of a stumbling block in 4K, but it’s an outlier in terms of how it’s optimized.

Again, those are positive numbers. They just don’t quite match the hype of Nvidia’s CEO and marketing team — and thanks to the price of the card, certainly don’t bring 4K gaming access to the masses. 

In a second round of testing, we compared how the RTX 2080 performed when paired with different CPUs. Although there was a difference between a system running an Intel Core i9-9900K and one running an AMD Threadripper 1920X, the 2080 performed well in each case. It managed more than 60 FPS in Assassin’s Creed Odyssey at 1440p with Ultra detail settings, and even in the always-intensive Deus Ex: Mankind Divided, at 1440p with High settings, it reached as much as 90 FPS on average.

The RTX 2080 Ti, on the other hand, has some raw power that we haven’t seen before. We’re seeing a similar 20 to 30 percent framerate increase in games like Deus Ex and Battlefield 1. It’s a bit more exciting when you’re stepping into uncharted territory.

Regardless of the resolution, the 2080 Ti has a ton of power at its disposal. Games like Battlefield 1 and Fortnite feel almost wasted on it, topping out at over 150 FPS. When you see some of those beautiful environments in Battlefield 1 rendered in brilliant, smooth 4K, it’s hard to be disappointed.

If you’ve been waiting around for a graphics card to properly match that huge 4K monitor sitting on your desk, the 2080 Ti is as close as you can get. It still dipped below 60 FPS in Deus Ex, with an average framerate of 49 FPS, but this chip makes nearly every game we tried look like a walk in the park. The same could be said of the 1080 Ti, already a very powerful chip, but the 2080 Ti takes things one step further.

But should you buy one?

There are still larger questions about how these cards will perform with RTX enabled. It’s hard to imagine you won’t see a somewhat significant dip, given the amount of extra processing that has to happen. 

In the meantime, the RTX 2080 and 2080 Ti are hard to make a conclusive call on. A lot of the potential that lies dormant in these GPUs can’t be fully taken advantage of yet. Nvidia isn’t the first tech company to ask its fans to buy into a dream before its time, but at a dramatic price increase over where its new GPUs have debuted in the past, it’s asking a lot.


The RTX 2080 Ti, in particular, showed some noteworthy performance gains that make it a really solid upgrade. The RTX 2080 is harder to recommend, even if you pair it with a solid new CPU to take full advantage of its performance. The 1080 Ti currently sells for the same price, or in some cases as much as $100 less, making it an equally viable purchase.

In terms of ray tracing and new AI capabilities, these might be the most advanced graphics cards ever made. They might be the foundation for an entirely new era of game visuals. We’ve seen the demos, and there’s no debating it’s impressive. Who knows? Maybe every game in 2025 will be RTX-enabled, and you’ll be happy you sprang for the 2080 rather than the 1080 Ti — seven years later.

To sum it up: If you want the absolute best performance that can be had today and have bought into Nvidia’s vision of the future, the 2080 Ti is your best bet — although there are some concerns over its reliability. Otherwise, your gaming rig and the games you own now will be better served by an upgrade to a last-generation card like a 1080 or 1080 Ti. In fact, one of the best things about the introduction of this next generation might just be that last gen’s cards are more affordable than ever.

Luke Larsen
Luke Larsen is the Senior editor of computing, managing all content covering laptops, monitors, PC hardware, Macs, and more.