
Elon Musk reportedly will blow $10 billion on AI this year

Elon Musk at Tesla Cyber Rodeo. (Digital Trends)

Between Tesla and xAI, Elon Musk’s artificial intelligence ambitions have cost some $10 billion this year to bring training and inference compute capacity online, according to a Thursday post on X (formerly Twitter) by Tesla investor Sawyer Merritt.

“Tesla already deployed and is training ahead of schedule on a 29,000 unit Nvidia H100 cluster at Giga Texas – and will have 50,000 H100 capacity by the end of October, and ~85,000 H100 equivalent capacity by December,” Merritt noted.


“By the end of this year, Elon Musk’s companies (Tesla & xAI) will have brought online roughly $10 billion worth of training compute capacity in 2024 alone,” he added in the October 29 post.

Tesla also revealed its Cortex AI cluster in August, which will be used to train the company’s Full Self-Driving system and pairs 50,000 Nvidia H100 GPUs with another 20,000 Dojo AI chips developed by Tesla itself.

xAI, for its part, began assembling its Colossus supercomputer in July at its Gigafactory of Compute, located in a former Electrolux production facility in Memphis, Tennessee. Musk claims Colossus is “the most powerful AI training cluster in the world,” as it runs on 100,000 Nvidia H100 GPUs, and he has promised to double that capacity in short order with another 50,000 H100 and 50,000 H200 GPUs in the coming months. The system came online in September and has since been tasked with building the “world’s most powerful AI by every metric by December of this year,” most likely Grok 3. xAI has not disclosed how much Colossus cost to build, though Tom’s Hardware estimates that the company has spent at least $2 billion on GPUs alone.

The $10 billion figure covers both companies, roughly in line with what Musk claimed in April that Tesla alone would spend this year on AI compute. “Tesla will spend around $10 billion this year on combined training and inference AI, the latter being primarily in car,” he posted at the time. “Any company not spending at this level, and doing so efficiently, cannot compete.”

By that measure, Musk’s AI efforts are already falling behind deep-pocketed rivals like Microsoft, OpenAI, and Google. In July, for example, analysts estimated that OpenAI would spend around $7 billion on AI compute, while losing around $5 billion on other operating costs. However, the company announced in early October that its latest round of investment funding totaled $6.6 billion at a $157 billion post-money valuation. “The new funding will allow us to double down on our leadership in frontier AI research, increase compute capacity, and continue building tools that help people solve hard problems,” the company wrote in its announcement post.

Per a report from Reuters on Thursday, both Microsoft and Meta are spending freely to build out their respective AI compute capabilities. Microsoft is reportedly spending as much capital each quarter as it used to spend in an entire year prior to 2020. The company also reported that its capital spending rose by more than 5% in its latest fiscal quarter, to $20 billion, and said it expects to spend even more in the following quarter. Meta, meanwhile, has spent as much capital in each quarter of 2024 as it did in a full year as recently as 2017.

As for Google, it reportedly spent $13 billion on capital expenditures in Q3 2024, a 63% increase over the same period last year. What’s more, the company has dropped some $38 billion into compute infrastructure since the start of the year, an 80% jump from the first three quarters of 2023. Suddenly, $10 billion between a pair of companies and a handful of projects seems almost quaint.

Andrew Tarantola