
A.I. helps scientists inch closer to the ‘holy grail’ of computer graphics

Fur Real - Scientists Improve Computer Rendering of Animal Fur

Computer scientists at the University of California, San Diego, and UC Berkeley devised a way to make animals in movies and video games more realistic by improving the look of computer-generated fur. It might not sound like much, but the researchers call photorealistic fur a “holy grail” of computer graphics.


“Creating photorealistic … characters has long been one of the holy grails of computer graphics in film production, virtual reality, and for predictive design,” Ravi Ramamoorthi, a professor of computer science at UC San Diego, who worked on the project, told Digital Trends. “Realistic rendering of animal fur is a key aspect to creating believable animal characters in special effects, movies, or augmented reality.”

To do so, they leveraged artificial intelligence to better capture the way light bounces between the individual fibers of an animal’s pelt, which has a surprisingly significant effect on realism.

Existing models were designed to depict human hair rather than animal fur. While human hair and fur both contain an interior cylinder called a medulla, the medulla in fur is much larger than in hair and scatters light in an unusual way. Most existing models haven’t accounted for the medulla’s role in this complex light scattering.

But the team from UC San Diego and UC Berkeley turned to a concept called subsurface scattering and employed an A.I. algorithm to lend a hand.

“A key innovation is to relate fur rendering to subsurface scattering, which has earlier been used for things like clouds or human skin,” Ramamoorthi said. “There are many techniques to render subsurface scattering efficiently, but the parameters are completely different physically from those used to describe fur reflectance. We have introduced a simple neural network that relates them, enabling one to translate a fur reflectance model to comparable subsurface parameters for fast rendering.”
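The idea Ramamoorthi describes can be sketched as a small neural network that takes fur reflectance parameters as input and outputs equivalent subsurface scattering parameters. The sketch below is a purely illustrative assumption: the parameter names (medulla radius, roughness, absorption; albedo, mean free path), network shape, and untrained random weights stand in for the team’s actual trained model, which is not described in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    """Random weights standing in for a trained layer (illustrative only)."""
    return rng.standard_normal((n_in, n_out)) * 0.5, np.zeros(n_out)

# 3 hypothetical fur parameters -> 16 hidden units -> 2 subsurface parameters
W1, b1 = init_layer(3, 16)
W2, b2 = init_layer(16, 2)

def fur_to_subsurface(fur_params):
    """Forward pass mapping fur reflectance params to subsurface params."""
    h = np.maximum(0.0, fur_params @ W1 + b1)       # ReLU hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # sigmoid keeps outputs in (0, 1)
    return out  # e.g. [scattering albedo, normalized mean free path]

# Example input: medulla radius fraction, cuticle roughness, absorption
fur = np.array([0.6, 0.2, 0.1])
albedo, mean_free_path = fur_to_subsurface(fur)
```

Once such a mapping is learned, a renderer can feed any fur material through it and reuse existing fast subsurface scattering code paths instead of simulating light transport inside each fiber.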

In terms of speed, Ramamoorthi said his team’s model can generate more accurate simulations ten times faster than the current state-of-the-art models. They shared their findings last week at the SIGGRAPH Asia conference in Thailand.

The new model has future potential in fields from virtual reality to video games, but Ramamoorthi seemed most enthusiastic about its current use for special effects in films.

“Our fur reflectance model is already used, for example, in Rise of the Planet of the Apes, which was nominated for a visual effects Oscar this year,” he said.

Dyllan Furness
Former Digital Trends Contributor
Dyllan Furness is a freelance writer from Florida. He covers strange science and emerging tech for Digital Trends, focusing…