Nvidia announced the upcoming release of the Jetson Orin Nano, a system-on-module (SOM) that will power the next generation of entry-level AI and robotics, during its GTC 2022 keynote today.
Nvidia says the new version delivers up to 80x the performance of the $99 Jetson Nano. The original was released in 2019 and has served as a bare-bones entry point into the world of AI and robotics, particularly for hobbyists and STEM students. This new version looks to seriously up the power.
A system-on-module (also referred to as a computer-on-module) packs a microprocessor, memory, and input/output (I/O) interfaces onto a single board, which typically plugs into a carrier board. It’s not the same thing as a system-on-a-chip (SOC), mind you: an SOM is board-based and has room for extra components, and it can even include an SOC. In short, an SOM is a ready-to-use computing solution, but it’s not a full computer.
With the technicalities out of the way, let’s talk about Nvidia’s latest development: the Jetson Orin arrives with six Orin-based production modules built to handle AI and robotics applications at an affordable price. Among them is the Nvidia Jetson Orin Nano.
Despite being the smallest Jetson SOM form factor, the Jetson Orin Nano can handle up to 40 trillion operations per second (TOPS) of AI workloads. Performance hits new heights with the AGX Orin, which serves up 275 TOPS to handle advanced autonomous machines.
Nvidia’s Jetson Orin comes with an Ampere-based GPU, an Arm-based CPU, and multimodal sensor support. It’s also form-factor and pin-compatible with Nvidia’s Orin NX modules, and full emulation support will let Nvidia’s customers design around multiple Jetson modules. Other perks include support for multiple concurrent AI application pipelines, complete with fast inputs and outputs.
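To make “multiple concurrent AI application pipelines” a bit more concrete, here is a minimal, hypothetical CUDA sketch in which two streams stand in for two independent pipelines sharing the module’s GPU. It uses only the standard CUDA runtime, not Nvidia’s Jetson-specific SDKs (JetPack, DeepStream), and the kernel is a placeholder for real inference work.

```cuda
// Illustrative sketch only: two CUDA streams model two independent
// application pipelines sharing one GPU. Not Nvidia's Jetson SDK code.
#include <cuda_runtime.h>
#include <stdio.h>

__global__ void scale(float *data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;  // placeholder for real inference work
}

int main(void) {
    // Report the GPU the module exposes (an Ampere-class GPU on Jetson Orin).
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    printf("GPU: %s, compute capability %d.%d\n", prop.name, prop.major, prop.minor);

    const int n = 1 << 20;
    float *a, *b;
    cudaMallocManaged(&a, n * sizeof(float));  // unified memory keeps the sketch short
    cudaMallocManaged(&b, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 1.0f; }

    cudaStream_t s1, s2;
    cudaStreamCreate(&s1);
    cudaStreamCreate(&s2);

    // Each stream models one "pipeline"; the GPU can overlap their execution.
    scale<<<(n + 255) / 256, 256, 0, s1>>>(a, n, 2.0f);
    scale<<<(n + 255) / 256, 256, 0, s2>>>(b, n, 0.5f);

    cudaStreamSynchronize(s1);
    cudaStreamSynchronize(s2);

    cudaStreamDestroy(s1);
    cudaStreamDestroy(s2);
    cudaFree(a);
    cudaFree(b);
    return 0;
}
```

In a real deployment, each stream would typically be fed by its own camera or sensor pipeline, with the GPU overlapping their work rather than running them back to back.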
The Jetson Orin Nano modules will be available in two variants: one with 8GB of memory and up to 40 TOPS, and one with 4GB of memory and up to 20 TOPS. In terms of power consumption, the SOM needs next to nothing: the former draws between 7 and 15 watts, while the latter needs just 5 to 10 watts.
Nvidia foresees the modules being used by a wide variety of customers, from engineers working on edge AI applications to Robot Operating System (ROS) developers. The low price point, starting at just $199, will make the technology accessible to a wider range of users. Nvidia cites Canon, John Deere, Microsoft Azure, and more as early adopters of Jetson Orin Nano.
“With an orders-of-magnitude increase in performance for millions of edge AI and ROS developers today, Jetson Orin is the ideal platform for virtually every kind of robotics deployment imaginable,” said Deepu Talla, vice president of Nvidia’s embedded and edge computing division.
Nvidia claims that the Jetson Orin Nano will offer up to 80x the performance of the previous-generation Jetson Nano. That’s a massive step up at a reasonable price. The modules will be available starting in January 2023.