
Nvidia’s DGX A100 system packs a record five petaFLOPS of power


At its virtual GPU Technology Conference, Nvidia launched its new Ampere graphics architecture and, with it, the most powerful GPU the company has ever made: the A100, the largest 7nm chip built to date. Eight of those GPUs power the new DGX A100 system, which offers 5 petaFLOPS in a single node and the ability to handle 1.5TB of data per second.


Of course, unless you’re doing data science or cloud computing, this system isn’t for you. The purpose of the DGX A100 is to accelerate hyperscale computing in data centers, working alongside traditional servers. In fact, the United States Department of Energy’s Argonne National Laboratory is among the first customers of the DGX A100. It will leverage the system’s advanced artificial intelligence capabilities to better understand and fight COVID-19.


“Nvidia is a data center company,” Paresh Kharya, Nvidia’s director of data center and cloud platforms, told the press in a briefing ahead of the announcement. That statement is a far cry from the gaming-first mentality Nvidia held in the old days. Still, Nvidia noted that there was plenty of overlap between this supercomputer and its consumer graphics cards, like the GeForce RTX line. An Ampere-powered RTX 3000 is reported to launch later this year, though we don’t know much about it yet.

The DGX A100 is now the third generation of DGX systems, and Nvidia calls it the “world’s most advanced A.I. system.” The stars of the show are the eight A100 GPUs with third-generation Tensor Cores, which together provide 320GB of HBM2 memory at 12.4TB per second of bandwidth. And while HBM memory is found on the DGX, that implementation won’t be found on consumer GPUs, which are instead tuned for floating-point performance.
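For a rough sanity check, those headline numbers fall straight out of the per-GPU specs. The short Python sketch below assumes Nvidia’s published A100 figures of 40GB of HBM2, roughly 1.6TB per second of memory bandwidth, and a 624-teraFLOPS FP16 Tensor Core peak with sparsity per GPU; it’s simple arithmetic rather than a benchmark.

# Back-of-the-envelope check of the DGX A100 headline figures.
# The per-GPU numbers are assumed from Nvidia's public A100 spec sheet.
GPUS_PER_NODE = 8
HBM2_PER_GPU_GB = 40            # A100 40GB model
BANDWIDTH_PER_GPU_GBS = 1555    # roughly 1.6TB/s per GPU
FP16_TENSOR_TFLOPS = 624        # peak FP16 Tensor Core rate with sparsity

total_memory_gb = GPUS_PER_NODE * HBM2_PER_GPU_GB                    # 320 GB
total_bandwidth_tbs = GPUS_PER_NODE * BANDWIDTH_PER_GPU_GBS / 1000   # ~12.4 TB/s
total_pflops = GPUS_PER_NODE * FP16_TENSOR_TFLOPS / 1000             # ~5 petaFLOPS

print(f"{total_memory_gb} GB HBM2, {total_bandwidth_tbs:.1f} TB/s, {total_pflops:.1f} PFLOPS")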


The system also uses third-generation NVLink and six NVSwitches to make for an elastic, software-defined data center infrastructure, according to Nvidia CEO Jensen Huang, along with nine Nvidia Mellanox ConnectX-6 HDR 200Gb-per-second network interfaces.

Each A100 can be partitioned into as many as seven GPU instances, and each instance gets its own dedicated resources: memory, cores, memory bandwidth, and cache. Each instance behaves like a stand-alone GPU with its own share of compute and memory. Nvidia claimed that every single workload will run on every single GPU to swiftly handle data processing, which is a key piece of functionality for building elastic data centers. The entire setup is powered by Nvidia’s DGX software stack, which is optimized for data science workloads and artificial intelligence research.
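That partitioning feature is what Nvidia ships as Multi-Instance GPU (MIG), and on an A100 it is driven through the standard nvidia-smi tool. The Python sketch below is a minimal illustration only: it assumes an A100 with MIG-capable drivers, root access, and the 1g.5gb profile name used on the 40GB card, and the available profile names may differ on other configurations.

import subprocess

def run(cmd):
    # Print and execute one nvidia-smi invocation, stopping on errors.
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Enable MIG mode on GPU 0 (takes effect after a GPU reset).
run(["nvidia-smi", "-i", "0", "-mig", "1"])

# List the GPU instance profiles the driver exposes on this card.
run(["nvidia-smi", "mig", "-lgip"])

# Carve the GPU into seven of the smallest instances, each with its own
# compute instance (-C), memory slice, and cache.
run(["nvidia-smi", "mig", "-cgi", ",".join(["1g.5gb"] * 7), "-C"])

Once the instances exist, each one is enumerated as its own device, so a container or framework can be pinned to a single slice without seeing the rest of the card.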

All of this power won’t come cheap. Despite a starting price of $199,000, Nvidia says the DGX A100’s performance makes it an affordable solution. In fact, the company said that a single rack of five of these systems can replace an entire data center’s worth of A.I. training and inference infrastructure. By Nvidia’s math, the DGX solution uses 1/20th the power and occupies 1/25th the space of a traditional server setup at 1/10th the cost.
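Taking Nvidia at its word, that comparison is easy to restate in numbers. Only the $199,000 list price and the five-systems-per-rack figure come from the announcement; the baseline server cost below is simply back-solved from the 1/10th-cost claim.

SYSTEM_PRICE = 199_000        # DGX A100 starting price
SYSTEMS_PER_RACK = 5          # Nvidia's rack-replacement scenario

rack_cost = SYSTEMS_PER_RACK * SYSTEM_PRICE   # $995,000 for the rack
implied_server_cost = rack_cost * 10          # ~$10M of replaced infrastructure

print(f"One DGX A100 rack: ${rack_cost:,}")
print(f"Implied cost of the A.I. servers it replaces: ~${implied_server_cost:,}")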

While the DGX A100 can be purchased starting today, some institutions have already been using the system to accelerate A.I.-powered solutions and services in fields ranging from healthcare to space research and energy consumption. The University of Florida, for example, is using it to create an A.I.-focused curriculum.

If none of that sounds like enough power for you, Nvidia also announced the next generation of the DGX SuperPod, which clusters 140 DGX A100 systems for an insane 700 petaFLOPS of compute. This performance is equivalent to thousands of servers.

Luke Larsen
Luke Larsen is the Senior Editor of Computing, managing all content covering laptops, monitors, PC hardware, Macs, and more.