
After reaping the rewards, Nvidia turns its back on cryptocurrency

Nvidia has just shared an interesting opinion on cryptocurrencies. According to the chipmaker, crypto doesn’t “bring anything useful for society.” The computational power of the best graphics cards is better spent elsewhere, says Nvidia.

What’s a better use for Nvidia GPUs? Like many others right now, Nvidia seems to be all about AI, and it appears to have a special interest in chatbots like ChatGPT and Bing Chat.

A cryptocurrency mining rig built from computer graphics cards. Getty Images

As reported by The Guardian, Nvidia seems to be distancing itself from the cryptocurrency sector. Michael Kagan, Nvidia’s chief technology officer, said that powering AI workloads is a more worthwhile use of Nvidia GPUs than crypto mining.


The opinion that crypto doesn’t bring much good to the world is shared by many people (and disputed by many, too). However, this is a hot take coming from Nvidia, which has benefited tremendously from cryptocurrency mining. During the height of the last crypto bull market, when mining was still possible, Nvidia undoubtedly reaped the fruits of crypto in the form of GPU sales.


These days, mining may not be dead, but it’s very difficult to make it worth the time, the money, and the power consumption. That’s largely because Ethereum, the second-most popular cryptocurrency, has switched to proof of stake and can no longer be mined.

Nvidia never went out of its way to promote using its consumer GPUs for crypto mining, but miners still bought up the graphics cards in massive quantities. This was especially a problem during the GPU shortage, when most cards ended up either in the hands of crypto miners or with scalpers who would then resell them at a huge markup.

To its credit, Nvidia did try to prevent crypto miners from grabbing too many of its gaming GPUs. Many of its RTX 30-series graphics cards were made drastically less efficient at mining Ethereum thanks to the Lite Hash Rate (LHR) limiter, which constrained the GPUs’ mining capabilities. However, miners found ways around it, and during the bull market, mining was profitable even with LHR in place, which is why the cards sold out regardless.

Workers transfer cryptocurrency mining rigs at a cryptocurrency farm that includes more than 3,000 mining rigs in Dujiangyan, in China’s southwestern Sichuan province. STR/AFP via Getty Images

These days, crypto is struggling and miners no longer have much use for Nvidia’s GPUs, which may be why Nvidia finally feels free to share its stance.

“I never believed that [crypto] is something that will do something good for humanity. You know, people do crazy things, but they buy your stuff, you sell them stuff. But you don’t redirect the company to support whatever it is,” said Kagan.

Kagan sees a lot more potential for good in training chatbots like ChatGPT. Since Nvidia’s hardware typically handles AI workloads better than AMD’s, many companies turn to its data center GPUs. The first version of ChatGPT was reportedly trained with the help of around 10,000 Nvidia graphics cards, and rumor has it that future versions will require far more. Microsoft is said to have bought tens of thousands of A100 GPUs for that purpose.

With crypto mining unlikely to make a huge comeback, Nvidia is looking to the future. The company clearly recognizes the potential of ChatGPT, and unlike AMD, it’s ready in time for the AI boom.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…