
I have an RTX 3090, and I still don’t play games in 4K

The push for higher resolutions rages on, and 4K has quickly become the standard for a high-end gaming PC. When I was fortunate enough to find an RTX 3090 in stock, a card I knew could handle 4K, I figured I had to upgrade my monitor to match.

But after using a 4K monitor for a few months, I’m already back to 1440p. Here’s why I don’t plan on returning to 4K.


The RTX 3090 situation

Logo on the RTX 3090 graphics card.
Jacob Roach / Digital Trends

I’ve already drawn the ire of tired gamers hunting for a graphics card during the GPU shortage, but I still need to set the scene. I own an RTX 3090. It’s not a review sample, I didn’t get it for free, and I didn’t buy it through some strange connection. I saved up for nearly a year, waited patiently as my attempts to buy a graphics card were thwarted by scalpers, and eventually waited in line for nearly four hours at a local Best Buy restock.


I’m lucky, even considering how much time I spent trying to hunt down a graphics card. And although I’d never spend $1,600 on a graphics card normally, that price didn’t seem too bad when stacked up against scalper rates. I consider it an investment — hopefully I won’t have to go through this process again next generation.

The RTX 3090 is the most capable 4K graphics card currently on the market. And even with the RTX 3090 Ti looming, it’ll likely stay that way until the next generation arrives. I spent a lot of money on my RTX 3090, and I wanted to finally upgrade from 1440p to 4K. But even with the most power you can get from a GPU right now, 4K still doesn’t feel worth it.

Get your performance in check

An RTX 3090 graphics card installed inside a gaming PC.
Jacob Roach / Digital Trends

Consoles screw everything up. In the final years of last-gen consoles, 4K gaming was all the rage. Of course, the Xbox One and PlayStation 4 aren’t powerful enough for real 4K, so games made for those platforms use some upscaling method to approximate 4K. Most PC games don’t have that luxury.

There are tools to help with this, such as Nvidia Image Scaling and AMD Radeon Super Resolution, but for the most part, your PC renders every pixel at whatever resolution you select. And 4K is a lot of pixels: about 8.3 million, while 1440p sits significantly lower at around 3.7 million and Full HD is just over 2 million.
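If you want to sanity-check those figures yourself, the math is nothing more than width times height. Here's a quick back-of-the-envelope sketch in Python, using the standard 16:9 resolutions rather than anything specific to my own monitors:

resolutions = {
    "4K (3840 x 2160)": (3840, 2160),
    "1440p (2560 x 1440)": (2560, 1440),
    "Full HD (1920 x 1080)": (1920, 1080),
}

for name, (width, height) in resolutions.items():
    # Total pixels rendered each frame is simply width times height.
    print(f"{name}: {width * height / 1_000_000:.2f} million pixels")

# Prints roughly 8.29, 3.69, and 2.07 million -- so 4K asks the GPU to
# shade about 2.25 times as many pixels per frame as 1440p.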

That’s tough on PCs, even ones packed with an RTX 3090. My personal rig has an Intel Core i9-10900K, the RTX 3090, and 32GB of memory. In my main game, Destiny 2, I hover between 70 and 80 frames per second (fps) at 4K with my RTX 3090. In demanding AAA titles like Assassin’s Creed Valhalla, it just barely skirts past 60 fps.

A lost sector in Destiny 2.
Destiny 2 is my main game, so performance here is what I care about most.

Those results aren’t bad, but they’re not what I expected from what is supposed to be the most performant GPU on the market. Bumping down to 1440p is much more forgiving — 95 fps in Assassin’s Creed Valhalla, and a locked 144 fps to match my refresh rate in Destiny 2.

Simply put, 4K is still too demanding, even for top-of-the-line hardware. The good news is that you can have your cake and eat it, too. By understanding the pixel density of your monitor, you can improve performance without giving up much visual fidelity.

Forget resolution — talk pixel density

The Eve Spectrum 4K Gaming Monitor.
Niels Broekhuijsen / Digital Trends

Resolution is the name of the game with monitors, but you should always consider it in the context of pixel density, which helps determine how large pixels are physically for a given screen size. Take a 4K 55-inch TV and a 4K 32-inch monitor, for example. Both have the same number of pixels, so the pixels on the 55-inch TV will be larger.

The lower the pixel density, the easier it is to make out individual pixels. We want high pixel density, where the individual pixels are smaller and therefore harder to pick out. Raising the resolution increases pixel density, while a larger screen lowers it, so two displays at different resolutions can end up with a similar pixel density.

In my case, I moved from a 32-inch 4K display to a 27-inch 1440p one. A 32-inch 4K display has a pixel density of about 138 pixels per inch (PPI). A 27-inch 1440p display, meanwhile, has a pixel density of around 109 PPI.
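Those PPI figures come from a simple formula: the diagonal resolution in pixels divided by the diagonal screen size in inches. Here's a minimal Python sketch of that calculation, plugged in for my two monitors:

import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixels per inch = diagonal length in pixels / diagonal length in inches.
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

print(round(ppi(3840, 2160, 32)))  # 32-inch 4K: ~138 PPI
print(round(ppi(2560, 1440, 27)))  # 27-inch 1440p: ~109 PPI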

A comparison of common resolutions.
4K includes far more pixels than 1440p, but the density depends on the size of your display.

That’s a bit lower, but remember that 4K equals about 8.3 million pixels, while 1440p only has about 3.7 million pixels — less than half. Sure, my 27-inch 1440p display doesn’t have the same pixel density as my old 32-inch 4K one, but it’s damn close considering the pixel gap between the two resolutions.

Also consider that most TVs have far lower pixel density than monitors. A 65-inch 4K TV only has a pixel density of about 68 PPI. You sit closer to a computer monitor, but the point remains: You really don’t need a 4K monitor for great image quality.

Technical bits aside, the point about pixel density is that it’s absolutely vital to consider the screen size for a given resolution. In the case of my two monitors, the 27-inch 1440p one looks nearly as sharp as my old display because it’s smaller. Understanding pixel density allows you to achieve the image quality you want without just grabbing the highest resolution you can afford.

Still not prime time for 4K

A man games on a PC gaming monitor with LED lights.
Kerkez / Getty Images

Native 4K still puts the latest hardware back in its place. Only a small number of graphics cards can even manage 4K — the RTX 3090 and 12GB RTX 3080 among them — in the most demanding titles, and the trade-off in performance compared to 1440p usually isn’t worth it. Take into account pixel density, and you can have image quality and performance without compromising.

With more than double the pixels of 1440p, 4K still offers more detail at the same screen size. But that extra detail rarely matters at common monitor sizes, and for gaming, performance is still king.

Jacob Roach
Lead Reporter, PC Hardware