
Kids born today won’t know what a pixel is, and that’s a dream come true

The pixel is a cultural icon. Wander down any street and you’re likely to see it not only in old LCD screens lining storefronts but also in logos, advertisements, and even fashion. “Pixel art,” the intentional use of lo-fi, pixelated graphics, is virtually the default among indie games, and even Digital Trends’ own 404 page features 8-bit characters on a pixelated sky-scape.

Users have become accustomed to the pixel, and most have forgotten it’s an artifact of limited graphics technology. The pixel was never desired; it exists only because of the particular limitations of computer displays and graphics renderers. Over the past three decades, countless minds have tried to tame unsightly pixelation, also known as aliasing, in numerous ways.


The fight has been long, but the forces of progress are winning. Pixels are dying. It’s entirely possible that a baby born today will never see one, and a child born a decade from now will probably never know of their existence unless he or she decides to learn computer science. Let’s take a moment to reflect on the pixel before it’s laid to rest in the graveyard of obsolescence.

The jagged edge

Pixels have existed since the beginning of computer graphics, and for many early computer scientists, they represented a serious problem. While their existence didn’t necessarily hobble interface design for early mainframes and the first home PCs, they were a major obstacle for anyone seeking to push the limits of realistic computer graphics.

Lucasfilm was an early pioneer in the field. Its computer division, which was eventually sold to Steve Jobs and renamed Pixar, searched desperately for ways to render graphics detailed enough to be used alongside miniatures in Star Wars.

What’s in one pixel could be a city.

Robert Cook, Pixar’s former Vice President of Software Development, was there from the beginning, and remembers the challenge well. “The basic problem,” he explained, “is you’re trying to make an image with a lot of detail, and you only have so many pixels.”

This inevitably forces computers to make a difficult decision. Multiple objects might inhabit the space of a single pixel, yet only one can be shown – so which one should it be? “What’s in that one pixel could be a city,” said Cook, “but the computer has to pick one color.” Early computers, with limited pixels and no way to combat aliasing, were forced to dramatically simplify images. The result was coarse, jagged graphics that looked nothing like reality.

Foiled by the pixelation of computer-generated graphics, Star Wars’ producers turned to real-life miniatures instead, like this recreation of the Death Star. Starwars.com

Those “jaggies” were particularly nasty in objects oriented at a diagonal to the pixel grid, and they precluded the use of computer graphics for most special effects until the problem was solved.

That proved a long, difficult road. Computer graphics never contributed significantly to the original Star Wars trilogy, which relied on a complicated dance of miniatures through Return of the Jedi’s epic final battle. Lucasfilm, refocusing on its core entertainment business and unhappy with the results of the Computer Division, sold it in 1986 to Steve Jobs, who renamed the company Pixar after its star product, a $135,000 anvil of processing power called the Pixar Image Computer.

A new hope

While the Pixar Image Computer was technically stunning, it wasn’t a commercial success, and it didn’t represent the company’s passion. Many of its employees wanted to use computer graphics to create entertainment, even art. This included former Disney animator John Lasseter, who was hired by the Lucasfilm Computer Division to breathe life into its graphics.

PC users expect razor-sharp image quality and despise softness, even if aliasing is the result.

Everyone knew, though, that even an animator of Lasseter’s skill couldn’t produce a compelling scene from computer graphics if jaggies remained an issue. Pixels don’t appear natural, they obscure the detail of a scene, and in motion, objects snap cleanly from one pixel to the next, removing the motion blur that makes film seem realistic.

The geeks at Lucasfilm tried to tackle the problem in a number of ways. Eventually a hardware engineer at the company, Rodney Stock, came up with an idea, which Rob Cook refined into a fix for aliasing: randomness.

“The jaggies come from the samples all being lined up on a grid,” Cook explained. “If you add some randomness, you break up the patterns.” Adding randomness to aliased portions of an image introduces uneven noise that, unlike patterns of perfectly stepped pixels, doesn’t seem unusual to the human eye.
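Cook’s description maps neatly onto how a renderer samples a scene. The sketch below is a rough Python illustration of that general idea, not Pixar’s actual algorithm: a made-up diagonal edge stands in for real geometry, and each pixel is filled either from one sample dead-center on the grid or from the average of several randomly jittered samples.

```python
import random

def scene(x, y):
    """The ideal, infinitely detailed image: a hard diagonal edge.
    Anything above the line y = 0.6 * x is white (1.0), the rest is black (0.0)."""
    return 1.0 if y > 0.6 * x else 0.0

def render(width, height, samples_per_pixel=1, jitter=False, seed=0):
    """Sample the scene into a grid of pixel values.

    One centered sample per pixel (the default) is classic point sampling:
    every pixel comes out pure black or pure white, producing the familiar
    staircase along the edge. With jitter=True and several samples, each
    pixel averages randomly placed samples instead, so edge pixels land on
    fractional greys; the regular stair-step pattern breaks up into slight
    noise that the eye tolerates far better.
    """
    rng = random.Random(seed)
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for _ in range(samples_per_pixel):
                if jitter:
                    sx, sy = px + rng.random(), py + rng.random()
                else:
                    sx, sy = px + 0.5, py + 0.5
                total += scene(sx, sy)
            row.append(total / samples_per_pixel)
        image.append(row)
    return image

if __name__ == "__main__":
    aliased = render(12, 8)                                     # hard 0/1 steps
    smoothed = render(12, 8, samples_per_pixel=16, jitter=True)
    for row_a, row_b in zip(aliased, smoothed):
        print(" ".join(f"{v:.0f}" for v in row_a), " | ",
              " ".join(f"{v:.2f}" for v in row_b))
```

Run it and the left column shows the jagged, all-or-nothing edge, while the right shows the same edge softened into noisy in-between values, which is essentially the trade Cook describes.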

Rob Cook in 2010. Deborah Coleman / Pixar

Randomness did more than just serve as effective anti-aliasing. It also helped blend computer effects with film and created a blur effect when applied to multiple frames of motion, addressing numerous problems with one tidy solution. While there were alternative anti-aliasing techniques for 3D film, they proved too computationally intense and didn’t produce superior results, leaving random sampling to reign as king.

Bringing it home

Solving the problem of pixels on the average home PC is not the same as solving it in film, however. Computer-generated movies are expected to replicate the nature of film, including its imperfections. A little noise or blur is not just acceptable, but desirable.

The Windows desktop, and computer interfaces in general, are a different animal. Users expect razor-sharp image quality, despise softness and frown upon noise. The ideal font is pixel-perfect, high-contrast, fine yet readable, standing out boldly from its surroundings. Even computer gamers expect a very specific experience and often look down on motion blur as an artifact or distracting visual add-on rather than a desirable effect. Games that pay homage to film, such as the original Mass Effect (which implemented a “film grain” filter), catch flak from those who prefer a sharper experience, even at the cost of aliasing. Pixels are preferable to noise.

Mass Effect

Given the choice, though, users prefer to have the best of both worlds: razor-sharp image quality and smooth edges. A number of techniques have been developed to deliver this, with varying success. Windows uses ClearType, a sub-pixel anti-aliasing technology designed specifically for fonts. Apple uses font smoothing along with tight guidelines for the standards of art assets used by developers, particularly with its Retina displays. And games use numerous tactics, from multi-sample anti-aliasing, which only smooths the edges of polygons, to temporal anti-aliasing, which smooths all aliased edges while drawing on data from multiple frames.

These efforts have gradually eroded the pixel’s prominence, making unsightly, jagged edges less common, but they’re not a complete solution. Aliasing is a tough problem to solve, particularly when compute power is limited, as it so often is with home PCs. And there’s always a trade-off between sharpness and smoothness. Turning Apple’s text smoothing up a few notches in the command line can make aliasing very difficult to detect, but it also results in soft, fuzzy fonts that aren’t at all like the crisply printed text in a book.

The visual limit

Anti-aliasing was not the only solution to jaggies considered in the early days of computer graphics. Researchers also looked into rendering images with resolutions as high as 8,000 pixels on a side, which made individual pixels too small for the human eye to detect. Lucasfilm itself commissioned several high-resolution renders of its X-Wing fighter by the graphics group of a company called Information International, Inc. One of these highly impressive renders found itself on the cover of Computer magazine.

Yet this technique was soon abandoned for a number of reasons. It was insanely computationally intense, which meant a single frame effectively cost thousands of dollars, and increasing the resolution did nothing to solve the motion blur issue that plagued computer graphics. Though effectively lacking visible pixels, the render didn’t look real, and for Lucasfilm, deep in the production of Star Wars, that was an unforgivable sin.

Upscaling low-resolution content is an issue that’ll persist for years.

The failure of early high-resolution renders obscured the usefulness of high pixel counts for decades, but the past five years have brought resolution back to the spotlight. Apple’s first iPhone with Retina sparked the trend, and it’s quickly spread to other devices – for good reason.

Tom Peterson, Director of Technical Marketing at Nvidia, told us that packing in extra pixels really does render them invisible. “As the pixel density gets really high, it reaches the threshold of what the human eye can observe. We call that the visual limit.” A display that exceeds the visual limit looks less like a display and more like a printed page, albeit one that glows.

What is the visual limit? It’s best described in terms of pixels per degree of vision, a metric that changes based on the size of a display and the observer’s distance. The golden number is 50 PPD, a figure that many modern smartphones easily exceed. 4K monitors don’t quite meet that goal, but 5K displays like the iMac with Retina and Dell UP2715K do, and a 65-inch 4K television can also hit the magic number if viewed from six feet away.
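For the curious, the arithmetic behind that figure is simple enough to sketch. The snippet below is an illustration of the general geometry, not a formula from Nvidia or Apple: it estimates pixels per degree from a display’s resolution, diagonal size, and viewing distance, and the example setups mirror the cases mentioned above (the 50 PPD threshold is the article’s number).

```python
import math

def pixels_per_degree(horizontal_pixels, vertical_pixels,
                      diagonal_inches, viewing_distance_inches):
    """Approximate pixels per degree of vision for a flat display."""
    # Physical width of the panel, derived from its diagonal and aspect ratio.
    aspect = horizontal_pixels / vertical_pixels
    width_inches = diagonal_inches * aspect / math.hypot(aspect, 1.0)
    pixels_per_inch = horizontal_pixels / width_inches
    # Inches covered by one degree of vision at this viewing distance.
    inches_per_degree = 2 * viewing_distance_inches * math.tan(math.radians(0.5))
    return pixels_per_inch * inches_per_degree

# A 65-inch 4K TV viewed from six feet: roughly 85 PPD, comfortably past 50.
print(round(pixels_per_degree(3840, 2160, 65, 72)))
# A 27-inch 5K monitor at a typical two-foot desk distance: roughly 91 PPD.
print(round(pixels_per_degree(5120, 2880, 27, 24)))
```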

This is not to say that reaching the visual limit immediately eliminates aliasing. Upscaling low-resolution content is an issue that’s likely to persist until pixel-dense displays become the norm. Windows is currently struggling to curtail this issue because it must maintain compatibility with numerous applications, some of which may be over a decade old and are no longer actively supported by their developers.

“Applications have some work to do using rendered fonts,” Tom Peterson explained, “because a lot of them are using bitmapped fonts.” They scale poorly, as they are “pixelated text images.” Ideally, fonts should be vector-based, making them a collection of lines and angles that can scale easily. This is why the text in Windows 8.1’s interface looks brilliant at any resolution, but the text in desktop applications often appears soft and blurred.
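A toy example makes the bitmap half of that point concrete. The glyph below is hypothetical data rather than any real font format: enlarging a bitmapped letter just repeats its pixels, so every source pixel becomes a visible block, whereas a vector font would simply re-run its rasterizer at the new size and emit fresh, resolution-appropriate pixels.

```python
# A 5x7 bitmapped "L" stored directly as pixels.
BITMAP_L = [
    "X....",
    "X....",
    "X....",
    "X....",
    "X....",
    "X....",
    "XXXXX",
]

def upscale_nearest(bitmap, factor):
    """Stretch a bitmap glyph by repeating pixels. The blocky result (or a
    blurry one, once the system smooths it) is why scaled desktop text looks
    so much worse than vector-rendered interface text."""
    enlarged = []
    for row in bitmap:
        wide_row = "".join(ch * factor for ch in row)
        enlarged.extend([wide_row] * factor)
    return enlarged

for line in upscale_nearest(BITMAP_L, 3):
    print(line)
```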

Still, this is a solvable problem, and one that developers will be pressured to fix as pixel densities continue to surge. Users who spend hard-earned money on an upgraded display will want to see the benefits, and are sure to avoid software that refuses to modernize.

We’re here today to mourn the pixel

Many new devices have already exceeded the visual limit, and while computers have lagged behind mobile devices, they’re beginning to catch up. Any 15-inch laptop with a 4K display easily exceeds the limitations of the human eye, and any 27-inch monitor with 5K resolution does the same. These panels will only decrease in price over time, just as 1080p did; within a few years they’ll be in everyday notebooks and monitors anyone can afford.

That will be the final nail in the pixel’s coffin. With televisions already heading to 4K (and beyond) there will no longer be any device on store shelves without the density needed to render pixels invisible at a typical viewing distance. Resolution itself will start to lose its meaning; users will simply be concerned with whether a display does, or doesn’t, appear as sharp as a photograph. Pixels will fade out of popular knowledge, and further advancements in sharpness will exist only for marketing. Programmers, artists and others who deal with digital images will continue to acknowledge pixels, but most users, even enthusiasts and videophiles, will have little reason to care.

Just as today’s youth can’t remember a world without the Internet, children born five years from now won’t remember a world in which the pixel exists. To them, displays will have always appeared as crisp as a window, and pixel art will be nostalgia for an era only their parents remember.
Their world will not be so different from our own. But it’ll look a hell of a lot smoother.

Matthew S. Smith