
The worst GPUs of all time: loud, disappointing, uninspired

When you look at some of the best graphics cards of today, it’s easy to forget that Nvidia and AMD (and more recently, Intel) weren’t always the only players in the GPU game. While both AMD and Nvidia have committed their fair share of GPU blunders, they’re not the only two brands behind some of the worst GPUs of all time.

Let’s take a look at some of the graphics cards that will make you appreciate the current GPU landscape, even the cards that qualify as borderline mistakes. (Hello, RTX 4060 Ti.) These are the GPUs that got it terribly, terribly wrong, even though each brought something interesting or innovative to the table.


We’re focused on the past here, mainly on brands that have faded from the limelight, so make sure to check out our other roundups for more modern examples.


Intel740

The Intel i740 graphics card on a table.
Vintage3D

Arc Alchemist wasn’t Intel’s first venture into discrete GPUs, and neither was the DG1. Long before either of those projects came the Intel i740, and it’s a GPU that makes all of Intel’s other attempts look just that much better.

In the mid to late 1990s, Intel jumped on the 3D graphics acceleration bandwagon. The burgeoning PC gaming market was really starting to bring into focus just how much 3D graphics would matter in the future. Perhaps this was what tempted Intel to stray out of its primary domain — which, even back then, was making some of the best processors — and try to make discrete GPUs.

The Intel740, also known as i740, was released in early 1998. It was a 350nm GPU that relied on the now long-forgotten AGP interface, which looked promising compared to PCI (mind the difference — not PCIe) back in those days. In fact, it was one of the first GPUs to utilize AGP, which later proved to play a part in its downfall.

It clocked in at a modest 66MHz and carried 2-8MB of VRAM across a 64-bit bus. Those specs sound laughable by today’s standards, and even back then they fell short. The amount of VRAM was lower than what some of Intel’s competitors offered, and the AGP interface was supposed to make up for it: the i740 was designed to store textures in system RAM and fetch them over AGP rather than keep them in local memory. In practice, that approach ate into main RAM and tied up the processor, hurting both CPU and GPU performance.

Despite a lot of hype, the Intel740 fell flat. It was meant to be a capable 3D renderer, but it often struggled, producing artifacts and poor image quality instead. Word of its weak performance spread quickly. Intel mostly targeted pre-built PC manufacturers (OEMs) with this GPU, but gaming enthusiasts knew to stay away, and the i740 was soon forgotten.

The graphics market was volatile back then and evolved rapidly, so a flop like that must have been a real setback for Intel. After a couple more attempts at discrete GPUs, however, it switched its focus to integrated graphics, where it found success in the years that followed.

S3 ViRGE

The S3 ViRGE graphics card.
Retronn

Before we settled into the current landscape of AMD, Nvidia, and Intel, the GPU market had a few more names vying for attention. One such company was S3, which rose to fame very quickly in the early-to-mid 1990s. Much like Intel, S3 capitalized on the 3D graphics boom and designed graphics chips that offered 3D acceleration. In the end, the S3 ViRGE became known as a “3D decelerator,” and is now remembered as one of the worst GPUs of all time.

Upon launch, the S3 ViRGE was marketed as the “world’s first integrated 3D graphics accelerator.” It was, indeed, one of the first such chipsets designed for the mainstream market. It came with 2MB to 4MB of SDRAM on a 64-bit bus and ran at a 55MHz core clock. It could render both 2D and 3D graphics and offered resolutions of up to 800 x 600 in 3D. While it did a decent enough job in 2D, it failed to impress in 3D, and 3D was the chip’s entire selling point.

When faced with relatively simple 3D rendering, the S3 ViRGE was actually a little faster than the best CPU-based solutions of the time. However, when it came to the increasingly complex rendering required by 3D games, including tasks like bilinear filtering, the GPU proved slower than software rendering (which essentially meant using the CPU for graphics). This is what earned it the mocking nickname of “world’s first 3D decelerator,” since users often preferred to turn off 3D acceleration and just use the CPU instead.

Word of the chip’s poor 3D performance quickly got around, and the rapid shift from 2D to 3D in the gaming market didn’t help. S3 attempted to fix what went wrong with later GPUs, such as the ViRGE/DX and ViRGE/GX, but it faced fierce competitors in Nvidia, ATI (later acquired by AMD), and 3dfx. Ultimately, S3 couldn’t compete in the growing 3D market, although it kept making chips for the midrange segment.

Nvidia GeForce FX 5800

Nvidia's GeForce FX 5800 GPU.
Anandtech

Meet the GeForce FX 5800 Ultra, the first (and only?) GPU that Nvidia made a spoof video about. Yes, Nvidia itself made a two-minute video mocking this GPU, but only after the card had hit the market and become known as the “Dustbuster” of graphics cards.

Nvidia had big plans for the FX series. It was meant to be the big leap into the DirectX 9 era, which was a significant transition for PC gaming. This GPU came at a time when Nvidia was already a market leader, although ATI Technologies was close behind with its Radeon line. Nvidia’s stumble with the FX series was an unexpected setback, but as we now know, ATI’s (and later AMD’s) advantage didn’t last long, and Nvidia now controls the majority of the market, perhaps to the detriment of PC gamers.

The FX 5800 Ultra was manufactured on a 130nm process and ran a 500MHz core clock alongside 500MHz GDDR2 memory (1GHz effective). It carried 128MB of that memory across a 128-bit interface. Nvidia decked it out with its CineFX architecture to enhance cinematic rendering and built it to be efficient at DirectX 9 shader processing.
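If you’re curious what those memory numbers add up to, here’s a quick back-of-the-envelope calculation of theoretical peak memory bandwidth. It’s just a sketch based on the specs quoted above (500MHz double-data-rate memory on a 128-bit bus) and the generic clock x transfers-per-clock x bus-width formula, not an Nvidia-published figure:

```python
# Rough peak-memory-bandwidth math for the FX 5800 Ultra's quoted specs
# (500MHz GDDR2, double data rate, 128-bit bus). Generic formula, not an
# official Nvidia number.

def peak_bandwidth_gbs(memory_clock_mhz: float, bus_width_bits: int, transfers_per_clock: int = 2) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    bytes_per_second = memory_clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8)
    return bytes_per_second / 1e9

print(peak_bandwidth_gbs(500, 128))  # ~16.0 GB/s
```

That works out to roughly 16GB/s of theoretical bandwidth, which helps explain why the 256-bit Radeon 9700 Pro was such an uncomfortable comparison for Nvidia at the time.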

Video: NVIDIA GeForce FX 5800 fan noise (NV30 FlowFX cooler)

On paper, it sounded great. In reality, it was decidedly not. It did well enough in DirectX 8 games but struggled with certain DX9 titles, and ATI’s Radeon 9700 Pro was an enticing alternative that didn’t have the same issues. However, the main problem with the FX 5800 Ultra was the noise.

Nvidia implemented an innovative cooling solution in this GPU called the FX Flow. This was meant to keep the GPU, which normally ran hot, at a comfortable temperature even during heavy gaming. However, the tiny fan that powered the contraption had to run at a really high speed in order to keep up. The result was some of the loudest noise a consumer GPU was ever known to produce.

Nvidia didn’t stick to this cooling model for long. Most of its partners reverted to traditional cooling methods for the FX 5900 XT and 5900 Ultra, and we haven’t seen anything like it ever since.

3dfx Voodoo Rush

Voodoo Rush graphics card.

3dfx was once a formidable rival to Nvidia and ATI. It rose to fame in the mid-1990s, and like several other GPU makers of that time, it rode the wave of 3D graphics until it crashed and burned. Nvidia eventually bought most of its assets in 2000. While the company’s decline can’t be attributed to a single card, it had some interesting solutions that ended up failing in the mainstream market, and the 3dfx Voodoo Rush GPU is perhaps one of the most recognized examples.

The Voodoo Rush chipset was a follow-up to the company’s initial product, the Voodoo1. It integrated 2D and 3D acceleration into a single card, pairing the 3D capabilities of Voodoo graphics with a 2D core from other manufacturers. That’s right, I’m talking about a dual-chip configuration here.

The GPU served up 6MB of EDO DRAM, a maximum core clock speed of around 50MHz, and support for the Glide API, Direct3D, and OpenGL, as well as a maximum resolution of 800 x 600 in 3D applications. It sounded promising on paper, but once the actual product was out and people could test it, several problems started to show through the hype.

For one, it was a massive card with a tendency to heat up, but the main issue lay in the architecture: the dual-chip setup added up to performance that was often worse than the original Voodoo1 in 3D games. Compatibility problems and visual artifacts weren’t uncommon, and once those issues came to light, reviewers and users alike turned their backs on this GPU.

The poor reception of the Voodoo Rush wasn’t what ultimately sealed 3dfx’s fate, though. The company went on to produce more GPUs, including the (also controversial) Voodoo 5 6000, which came with its own external power adapter. The end result was pretty funny to look at, so let’s hope Nvidia doesn’t get a similar idea for one of its next-gen behemoth flagships.

Monica J. White