
You can now moonwalk on the moon with Nvidia’s A.I. and ray tracing tech

Apollo Moon Landing with RTX Technology; courtesy of Nvidia

In addition to launching 10 new RTX Studio laptops targeting creators at SIGGRAPH, Nvidia also announced some of the work its research teams have been doing in artificial intelligence, augmented reality, and computer graphics. On the 50th anniversary of the Apollo 11 lunar landing, Nvidia showcased how ray tracing technology from its RTX graphics cards can visually enhance the images NASA captured 50 years ago. At SIGGRAPH, Nvidia’s effort to commemorate Apollo 11 goes a step further, giving space fans the chance to superimpose themselves into a short video clip, as if they were astronauts Neil Armstrong and Buzz Aldrin, using A.I. and the power of ray tracing to render the video in real time.

“What we’re doing is using artificial intelligence to aim a camera at people just in their street clothes, to be able to do 3D pose estimation,” Nvidia’s Rob Estes explained. “We know where they are in 3D space, and we know how their limbs are moving, so we’re drawing that using ray tracing and placing you, with your movement, as an astronaut in the scene.”


Doing the moonwalk on the moon

Hollywood visual effects studios have done something similar for years using green screens and motion-capture actors wearing suits dotted with markers to replicate limb movement, but ray tracing and A.I. take this a step further by eliminating the need for specialized equipment and actors. “You could do this anywhere you can film somebody,” Estes said of the benefits of A.I.-enabled ray tracing in this lunar example. Essentially, SIGGRAPH attendees can film a short video montage of themselves in Aldrin’s and Armstrong’s spacesuits doing the moonwalk, as if they were on the moon as part of the Apollo 11 mission.
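Nvidia hasn’t published the pipeline behind the demo, but the core idea of 3D pose estimation — recovering where a joint sits in space from camera footage — can be sketched with a simple pinhole-camera back-projection. Everything below (function name, intrinsics, the assumption that a network has already estimated per-joint depth) is illustrative, not Nvidia’s actual code.

```python
# Hypothetical sketch: lift a 2D joint detection to a 3D point using a
# pinhole camera model, assuming per-joint depth comes from a network.

def backproject(u, v, depth, fx, fy, cx, cy):
    """Convert a pixel coordinate (u, v) with an estimated depth
    (in meters) into a 3D point in camera space."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Toy intrinsics for a 640x480 camera (illustrative values).
fx = fy = 500.0
cx, cy = 320.0, 240.0

# A "wrist" joint detected at pixel (420, 300), 2 meters from the camera.
wrist = backproject(420, 300, 2.0, fx, fy, cx, cy)
print(wrist)  # -> (0.4, 0.24, 2.0)
```

Repeating this for every joint in every frame yields the moving 3D skeleton that the ray tracer can then re-dress as an astronaut.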


“We’re going to have a mock-up of the lunar surface and the lunar lander, and we’re going to let them see what it would be like for them to be on the moon,” he said, with all the lighting effects and rendering performed in real time. “This is very leading-edge research, and nobody has done this before. You’re combining A.I. and ray tracing in a way that has many, many practical benefits.” These benefits aid not only Hollywood and its visual effects teams but also designers and researchers trying to solve hard problems with pose estimation, Nvidia added.

Foveated rendering with prescription glasses


Nvidia wants to make rendering for augmented and virtual reality applications appear more realistic and crisp, and it is applying its foveated rendering technology to accomplish this. Nvidia’s researchers have added support for prescription lenses, a first for the industry. Though this is still in the early research stage, Nvidia envisions a day when wearers of prescription eyeglasses won’t need separate glasses for their augmented reality devices.

Nvidia is working with several types of displays to build VR or AR displays that match the prescription glasses you may have. “This is a big deal,” Estes said. “You’ve never been able to see 20/20 before because there just wasn’t the visual acuity for these displays.”

In the same way that ray tracing can bring life-like cinematic effects to video games, foveated rendering can make AR scenes and images appear more realistic with better resolution by conserving graphics power. Rather than rendering the entire scene, foveated rendering allows creators to just render the middle part of the scene in high fidelity, which is where your eyes typically focus, and the peripheral areas can be rendered in less detail to save on GPU power.
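The budget-saving logic can be sketched as a shading-rate decision per screen tile: tiles near the gaze point are rendered at full rate, and tiles farther away are shaded more coarsely. The thresholds and rates below are made-up illustrative values, not Nvidia’s.

```python
# Hypothetical sketch of foveated shading-rate selection: tiles near the
# gaze point get full resolution; farther tiles are shaded more coarsely.
import math

def shading_rate(tile_center, gaze, inner=0.1, outer=0.3):
    """Return samples-per-pixel for a tile, based on its distance from
    the gaze point (screen coordinates normalized to 0..1)."""
    d = math.dist(tile_center, gaze)
    if d < inner:
        return 1.0    # full rate in the fovea
    if d < outer:
        return 0.5    # half rate in the mid-periphery
    return 0.25       # quarter rate in the far periphery

gaze = (0.5, 0.5)
print(shading_rate((0.5, 0.52), gaze))  # fovea -> 1.0
print(shading_rate((0.9, 0.9), gaze))   # far periphery -> 0.25
```

Summing the rates over all tiles gives a rough sense of the GPU savings: most of the screen sits in the periphery, so most pixels are shaded at a quarter of the cost.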

“So we’re doing work to make sure that we’re tracking where your gaze is, and applying this so that you get faster frame rates and better graphics in augmented reality, or this could be applied to virtual reality as well,” said Estes.

GauGAN, the A.I. artist, is freed

Nvidia GauGAN researchers; photo courtesy of Nvidia

In the past, Nvidia showed how its GauGAN drawing tool can turn even the art-challenged among us into artists, allowing anyone to draw complex, lifelike landscapes with just a few simple strokes by leveraging artificial intelligence. The A.I.-drawn scenes, Nvidia revealed, have been used as backdrops in some big-name Hollywood movies, with studios overlaying elements to set the overall mood for a clip.

Rather than using A.I. to identify elements of a scene, like a dog or a cat, GauGAN, as its name implies, uses a generative adversarial network, or GAN. When the user draws a horizon line to mark where the ocean meets the sky, GauGAN begins to “draw” the scene with clouds and waves; the user can then add rocks and sea cliffs to the seascape, and GauGAN renders it with proper shadows and reflections. So instead of identifying objects in an image, the GAN fills in the image with a realistic creation and rendering of the scene, Nvidia explained.
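The user-facing input to a tool like GauGAN is essentially a semantic label map: a grid where each cell carries a class like “sky,” “sea,” or “rock,” which the generator network then turns into a photorealistic image. The real model is a deep network; the toy below only illustrates the label-map structure that brush strokes produce, with made-up labels and helper names.

```python
# Hypothetical sketch of the input side of a GauGAN-style tool: brush
# strokes build a semantic label map, which the GAN would then render.

def make_label_map(width, height, horizon):
    """Build a label map: 'sky' above the horizon row, 'sea' below it."""
    return [["sky" if row < horizon else "sea" for _ in range(width)]
            for row in range(height)]

def paint(label_map, cells, label):
    """Simulate a brush stroke: relabel the given (row, col) cells."""
    for row, col in cells:
        label_map[row][col] = label

scene = make_label_map(8, 6, horizon=3)
paint(scene, [(4, 2), (4, 3), (5, 2), (5, 3)], "rock")  # a rock in the sea
print(scene[0][0], scene[4][2], scene[5][7])  # -> sky rock sea
```

A single stroke thus changes labels, not pixels; it is the generator’s job to resynthesize waves, shadows, and reflections consistent with the new map.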

Given the popularity of the tool, Nvidia revealed that it will expand access to GauGAN to everyone. Estes said there are currently no plans to monetize GauGAN, despite its creative work in the movie industry; the company is simply relishing the joy other people get from using it.

Chuong Nguyen
Silicon Valley-based technology reporter and Giants baseball fan who splits his time between Northern California and Southern…