Though originally crafted as a new feature for TVs, it now seems that 4K (aka Ultra HD) may make its way to consumers more quickly by hitching a ride on desktop monitors and notebook PCs. ASUS, Dell, Toshiba and ViewSonic showed new Ultra HD products at CES, and while they do carry a premium, some, like the $699 Dell P2815Q monitor, are reasonably attainable.
You may soon find yourself with a choice between 1080p, 1440p, and 4K. Does the higher resolution result in pixel-dense bliss, or should you wait for the tech to mature?
More pixels
Let’s start with the basics: what is 4K? You might think that it’s a resolution which in some way includes the number “4,000,” but you’d be wrong. 4K is named as such for the same reason Mustangs are sold with a “5.0” badge when the engine is really 4.95 liters: because it sounds better, especially from a marketing standpoint.
The real resolution of a 4K display is typically 3,840 x 2,160. There are other formats, like the 4,096 x 2,160 resolution commonly used for digital cinema projection, but consumer devices almost always stick to 3,840 x 2,160, which is also known as 2160p.
Add up the pixels and you’ll end up at an astounding 8,294,400. By comparison, 1080p adds up to 2,073,600, and 1440p comes to 3,686,400. This means 4K offers more than twice the pixels of the highest-resolution desktop monitors previously available for a Windows PC. Even a MacBook Pro 15 with Retina has “only” 5,184,000 pixels; more than three million fewer!
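If you want to check the math yourself, it’s one multiplication per resolution. A quick Python sketch (the Retina figure assumes the MacBook Pro 15’s 2,880 x 1,800 panel):

```python
# Quick sanity check of the pixel math above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "Retina MBP 15": (2880, 1800),
    "4K / 2160p": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
# 1080p: 2,073,600 pixels
# 1440p: 3,686,400 pixels
# Retina MBP 15: 5,184,000 pixels
# 4K / 2160p: 8,294,400 pixels
```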
Packing in the extra dots sends pixel density skyward. The pair of 15.6-inch Ultra HD laptops shown by Toshiba at CES 2014, for example, served up almost 280 pixels per inch. While some tablets already exceed that, they’re also meant to be used more intimately. A large laptop will usually be no closer than two or three feet from the user, so individual pixels become practically invisible.
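Pixel density is simple to compute: divide the diagonal pixel count by the diagonal screen size in inches. A minimal sketch, assuming the Toshiba laptops use standard 3,840 x 2,160 panels:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 15.6)))  # ~282 ppi: a 15.6-inch Ultra HD panel
print(round(ppi(1920, 1080, 15.6)))  # ~141 ppi: the same panel size at 1080p
```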
And that’s the technology’s headline feature. Content designed for 4K looks almost impossibly crisp on an appropriate monitor or laptop. Games hardly need anti-aliasing because the “jaggies” between pixels become almost unnoticeable. And fonts are rendered as if they were printed on paper and then stuck behind the screen.
Mo’ pixels, mo’ problems
More pixels mean sharper images and a better PC experience overall. So 4K is all good, no matter how you slice it, right?
Wrong.
Here’s the problem: computers don’t work like televisions. When a television is fed a source below its native resolution, it uses a scaling algorithm to compensate for the difference. HDTV makers have spent a lot of time researching the best way to accomplish this, and televisions ship with respectably quick processors (usually based on the ARM architecture) to deliver fast, high-quality scaling.
Monitors have no such hardware, and just do what they’re told. If a program says its main menu should be 500 pixels tall, you’re getting a menu that’s 500 pixels tall. So, as pixel count increases, that menu becomes smaller, and smaller, and smaller, until it’s so small it can hardly be used. Redmond, we have a problem!
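To put numbers on it, here’s a rough sketch of how that hypothetical 500-pixel menu shrinks as density climbs (the 24-inch 1080p monitor is just an illustrative baseline):

```python
# Physical height of a fixed 500-pixel menu at different pixel densities.
MENU_PX = 500  # the hypothetical menu height from the example above

for label, density in [("24-inch 1080p (~92 ppi)", 92),
                       ("15.6-inch 4K (~282 ppi)", 282)]:
    print(f"{label}: {MENU_PX / density:.1f} inches tall")
# 24-inch 1080p (~92 ppi): 5.4 inches tall
# 15.6-inch 4K (~282 ppi): 1.8 inches tall
```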
All of the 4K laptops we saw at CES suffered from almost unusably small icons, even with scaling set to the max.
Gamers will also run into problems. Most modern games will launch at 4K, but smooth play is another matter. Any increase in pixel count results in a not-quite-linear but still substantial increase in GPU load. Want to play Metro 2033 at 4K and max detail? No problem! You’ll just need four Nvidia GTX Titans. Good luck coming up with that kind of cash, unless you’re Bruce Wayne’s cousin or something.
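As a rough rule of thumb (real-world scaling is messier, since geometry and CPU work don’t grow with resolution), the GPU’s pixel-fill workload tracks pixel count:

```python
# Rough relative fill workload vs. 1080p. Ignores geometry and CPU costs,
# which is why real-world performance scaling is "not quite linear".
base = 1920 * 1080

for name, (w, h) in [("1440p", (2560, 1440)), ("4K", (3840, 2160))]:
    print(f"{name}: {w * h / base:.1f}x the pixels of 1080p")
# 1440p: 1.8x the pixels of 1080p
# 4K: 4.0x the pixels of 1080p
```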
And then there’s the issue of 4K content. We saw some beautiful demonstration videos at CES, but they were simply that: demos. There are few 4K movies or shows available, and while the infrastructure is beginning to develop, you’ll have to wait a few years before 4K content becomes widely available.
Should you go 4K?
Shoppers in the market for a Windows laptop, particularly one with a screen smaller than 17 or 18 inches, should hold off on anything above 1080p if desktop use is important. The Windows Metro interface scales fairly well, but the desktop does not, and older software can be a real pain to work with if your eyesight is less than perfect.
Desktop monitors are larger and easier on the eyes, but most users will still need to run Windows at its maximum scaling preset, 150% of normal size, to feel comfortable. That setting makes icons large enough to read, but don’t expect them to look crisp. Some browsers also have scaling issues, even on a desktop monitor; Google Chrome is particularly bad.
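That 150% preset effectively shrinks your workspace: divide the native resolution by the scale factor, and a 4K panel behaves like a very crisp 1440p one. A quick sketch of the arithmetic:

```python
# Effective (logical) workspace after Windows DPI scaling.
native = (3840, 2160)
scale = 1.5  # Windows' 150% preset

effective = tuple(int(d / scale) for d in native)
print(effective)  # (2560, 1440): roughly the workspace of a 1440p monitor
```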
There are some people who might want a 4K desktop monitor, however. Anyone looking to edit extremely high-resolution images or video will love the extra space. Hardcore gamers with deep wallets will also enjoy the astoundingly crisp visuals of Ultra HD, though, as mentioned earlier, expensive hardware is required to play the most demanding games at high detail.
Even if you have a reason to fall for 4K, you’ll need to check that your computer can support it. Only recent video cards can output at such a high resolution. Nvidia has a product page that can tell you if your GPU is compatible; AMD does not, so you’ll have to look up your particular card, but generally speaking you’ll need a mid-range Radeon 7000 series card or better. As for integrated graphics, only Intel’s HD 4000/5000 series can handle Ultra HD; older Intel GPUs cannot.
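Raw bandwidth is part of the reason older hardware can’t keep up. A back-of-the-envelope estimate of an uncompressed 4K signal (ignoring blanking intervals and link-encoding overhead, so real requirements run somewhat higher):

```python
# Rough uncompressed bandwidth for a 4K video signal at 24 bits per pixel.
def gbps(width, height, bits_per_pixel, refresh_hz):
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(f"4K @ 30 Hz: {gbps(3840, 2160, 24, 30):.1f} Gbit/s")  # ~6.0, HDMI 1.4 territory
print(f"4K @ 60 Hz: {gbps(3840, 2160, 24, 60):.1f} Gbit/s")  # ~11.9, needs DisplayPort 1.2
```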
Conclusion
Our recommendation is to wait unless you have a specific need for more pixels. As stunning as 4K monitors may be, we expect them to drop in price significantly over the next year, and current software support isn’t good enough to justify opening your wallet just yet.