
Nvidia is renting out its AI SuperPod platform for $90K a month

Nvidia is looking to make artificial intelligence development more accessible by giving researchers an easier way to access its DGX supercomputers. The company announced that it will launch a subscription service for its DGX SuperPod, pitching it as an affordable way to gain entry into the world of supercomputing.

The DGX SuperPod is capable of 100 petaflops of AI performance, according to the company, and when configured with 20 DGX A100 systems, it’s designed for large-scale AI projects.


Despite the company’s marketing spin, affordable is still relative: the new SuperPod subscription will cost $90,000 per month when it launches this summer.


The Nvidia Base Command Platform will be powered by its DGX computers, with NetApp providing storage. Nvidia also announced that it is working with Amazon Web Services and Google Cloud to offer instances in the cloud. The company claims this hybrid setup will let developers schedule jobs on-premises or in the cloud.


By relying on the cloud and a subscription model, AI researchers need only a much smaller on-premises footprint, and Nvidia is billing the new service as part of its effort to democratize artificial intelligence work. In the consumer space, Nvidia similarly relies on the cloud through its GeForce Now service, bringing the power of its graphics technology to gamers who may be unable to buy, run, or afford a discrete GPU setup of their own.

For reference, Nvidia’s AI-powered DGX-2 supercomputer launched at a price of $399,000 and was billed as the world’s largest GPU, while the newer, more powerful DGX A100 starts at $199,000 and is capable of 5 petaflops of AI performance.
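Those figures invite a back-of-the-envelope comparison. The sketch below uses only the numbers quoted in this article (list prices and per-system performance) and ignores real-world costs like power, cooling, storage, and staffing, so treat the break-even point as a rough illustration rather than a procurement analysis.

```python
# Rough comparison of the SuperPod subscription vs. buying the hardware,
# using only the figures quoted in the article.
DGX_A100_PRICE = 199_000         # USD, starting price per DGX A100
DGX_A100_PFLOPS = 5              # AI petaflops per DGX A100
SUPERPOD_SYSTEMS = 20            # DGX A100 systems in the SuperPod config
SUBSCRIPTION_PER_MONTH = 90_000  # USD per month for the SuperPod subscription

total_pflops = SUPERPOD_SYSTEMS * DGX_A100_PFLOPS
purchase_cost = SUPERPOD_SYSTEMS * DGX_A100_PRICE
break_even_months = purchase_cost / SUBSCRIPTION_PER_MONTH

print(f"Aggregate AI performance: {total_pflops} petaflops")       # 100
print(f"Outright hardware cost:   ${purchase_cost:,}")             # $3,980,000
print(f"Subscription break-even:  ~{break_even_months:.0f} months")  # ~44
```

By list price alone, the subscription only overtakes an outright purchase after roughly three and a half years, which is why Nvidia frames it as a low-commitment way to test drive the platform rather than a long-term replacement for ownership.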

The company claims this new subscription model allows you “to experience the best of Nvidia software and hardware that’s easy for you to use” without any contractual commitments. It’s designed as a way to test drive Nvidia’s solutions.

Nvidia’s subscription-based model will also include its BlueField-2 data processing unit, or DPU, with every DGX.


“A new type of processor, designed to process data center infrastructure software, is needed to offload and accelerate the tremendous compute load of virtualization, networking, storage, security, and other cloud-native AI services,” Nvidia CEO Jensen Huang said of his company’s DPU earlier this year when the company unveiled its plans for BlueField-3. “The time for BlueField DPU has come.”

Developers will have access to Nvidia’s AI Enterprise software, an open-source stack that the company has integrated into a coherent platform with a special focus on enterprise support. The AI Enterprise software also features deep integration with VMware’s vSphere. Customers will also have access to Nvidia Omniverse Enterprise software.

A specific launch date was not announced, but the company said that all this will be coming this summer.

In data centers, Nvidia is looking to expand the Arm ecosystem beyond mobile. The company announced that, beginning next year, it will focus on bringing the Arm architecture to data centers. Given that much AI work is already handled by the GPU, Nvidia claims the traditional CPU’s role will shift to that of a data orchestrator rather than a heavy-compute engine.

The company hopes to leverage its proposed acquisition of Arm to help transform the data center into an energy-efficient platform for AI workloads that’s just as powerful. Nvidia executives had hinted at these Arm ambitions earlier this year when the company announced its Grace processor.

Be sure to follow Digital Trends for all the latest news from Computex.

Chuong Nguyen
Silicon Valley-based technology reporter and Giants baseball fan who splits his time between Northern California and Southern…