Meet the company resurrecting dead celebrities and digitally cloning living ones

Picture the scene. You’re a struggling actor in your twenties, auditioning for the role of a lifetime in a major Hollywood movie. You nailed the first audition, and the casting director has since summoned you back twice to audition again. At the last callback, Steven Spielberg — the movie’s director — was even in the room. Your agent tells you it’s come down to you and one other actor. Then you get the call. The news isn’t good. The other person got it. “Who did they give the role to?” you ask, trying to conceal your abject disappointment. “Let me check,” your agent says, putting you on hold. Her voice comes back on the line. “They gave it to 1955-era James Dean,” she tells you.

Impossible, right? Only, it’s very much not. Anyone who has been keeping their eyes open at the movies for the past few years (and, frankly, why waste the price of a ticket by shutting them?) will have seen the resurgence of certain actors who don’t appear to belong in 2018.

The point at which everyone realized that something was going on may well have been 2016’s Rogue One: A Star Wars Story, which dusted off Peter Cushing, the legendary British actor who passed away in 1994, for one more cinematic hurrah. Since then, we’ve seen the digital “de-aging” of Michael Douglas, Robert Downey Jr., and Samuel L. Jackson in assorted Marvel movies, Arnold Schwarzenegger in Terminator Genisys, Orlando Bloom in two of the Hobbit movies, Johnny Depp in Pirates of the Caribbean: Dead Men Tell No Tales, and more. In 2018, the movie industry is more in love with digitally recreating the past than it is with back-patting award ceremonies and power lunches at West Hollywood’s Soho House.

This was not the start of it, of course. In Ridley Scott’s 2000 movie Gladiator, actor Oliver Reed’s scenes were completed using a digitally constructed face mapped onto a body double after the actor passed away during filming. In that instance, however, it was less a feature than a workaround: a way to finish the movie without reshooting Reed’s entire performance with another actor. A similar fix appeared early in the third season of HBO’s The Sopranos, after actress Nancy Marchand — who played Tony Soprano’s overbearing mother — died in 2000. Her final scene in the show is a weird, unsettling mixture of awkward CGI footage and audio pulled from old episodes.

Things have come a long way. No longer a hacked-together workaround, digital recreations of actors are now convincing enough to front million-dollar ad campaigns. In the U.K., the likeness of actress Audrey Hepburn was digitally revived to sit on a bus on the Amalfi Coast, eating a Galaxy chocolate bar. In the States, Dior created a star-studded ad campaign in which Marilyn Monroe, Grace Kelly (Princess of Monaco), and Marlene Dietrich appeared on screen with Charlize Theron to hawk perfume. This was, shall we say, the tipping point.

New opportunities arise

It’s into this space that visual effects companies such as Digital Domain have begun to carve out a name for themselves. Located in Playa Vista, on the west side of Los Angeles, California, Digital Domain has been working in the digital life business since the 2010s. It has worked on major Hollywood movies and in the music industry, where it was famously responsible for bringing the late Tupac Shakur to Coachella in 2012.

You can think of its work a bit like that famous “It’s alive!” scene from 1931’s Frankenstein — only instead of resurrecting the dead by pulling levers in an underground gothic laboratory, the work is done by clicking a mouse a bunch of times in a trendy LA edit suite. At the end of the day, the results are the same, though: all the undead celebrities you could wave a flaming torch at. Or, at least, younger-looking living ones.

“If we miss slight details of the body, no one really notices; but change a smile by a few millimeters, and suddenly it no longer looks like the person.”

“A whole series of technologies are used in the preserving of someone’s likeness or the creation of a deceased celebrity,” Digital Domain employee Darren Hendler, whose official job title is “Head of Digital Humans,” told us. “We use a combination of technologies offered by others, and some developed internally at Digital Domain. The creation of a realistic, well-recognized moving human face is one of the hardest challenges in computer graphics today. It requires a wide variety of different technologies to capture all the elements that make up an individual. Our brains process all of this in milliseconds. We focus primarily on the face, as it is the key area of the human body that you take notice of first. If we miss slight details of the body, no one really notices; but change a smile by a few millimeters, and suddenly it no longer looks like the person.”

This is an important point. Having something slightly “off” about a digitally recreated person is, at best, distracting and, at worst, extremely off-putting. This “uncanny valley” effect was first described by Dr. Masahiro Mori in Japan in 1970, initially in relation to robots. Today, it most clearly applies to digital recreations of human faces — and the results of getting it wrong can be disastrous.

For instance, in its review of the 2004 movie The Polar Express, CNN noted that the use of CGI to recreate (the very much alive) Tom Hanks digitally did not entirely work. “This season’s biggest holiday extravaganza…should be subtitled ‘The Night of the Living Dead,’” the review read. “The characters are that frightening. This is especially disheartening since there’s so much about this technologically groundbreaking movie…that’s astounding.”

How to create a digital human

There are three elements involved in capturing and creating a digital human at Digital Domain. The most obvious of these is, of course, appearance. In order for an actor to look like, well, themselves, it’s essential to capture the look and shape of their face, their eyes, and their hair. Digital Domain achieves this by using high-end scanners to capture every detail of a person’s face, down to the pore level.

“We even capture how blood flow changes the coloration of the skin when it goes into different expressions.”

“We even capture how blood flow changes the coloration of the skin when it goes into different expressions,” Hendler explained. “Part of the technology used in this stage allows us to differentiate the way that light interacts with skin, including the look of the skin that absorbs the light and the look of the light that gets reflected off.”
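Hendler is describing what skin shading models call the split between specular and diffuse response: some light reflects straight off the surface, while the rest is absorbed, scattered beneath the skin, and re-emerges carrying the skin’s coloration. The Python sketch below is a toy illustration of that split, not Digital Domain’s actual pipeline; the Schlick approximation and the reflectance value for skin are standard graphics conventions, and the function names are our own.

```python
import numpy as np

def schlick_fresnel(cos_theta, f0=0.028):
    """Fraction of light reflected straight off the surface.

    Schlick's approximation; f0 = 0.028 is a commonly used
    reflectance value for human skin at normal incidence.
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def shade_skin(albedo, normal, light_dir, view_dir):
    """Toy skin shader: split light into a reflected (specular)
    term and an absorbed-then-rescattered (diffuse) term.

    All direction vectors are unit-length NumPy arrays of shape (3,);
    albedo is an RGB array describing the skin's coloration.
    """
    n_dot_l = max(np.dot(normal, light_dir), 0.0)
    half_vec = light_dir + view_dir
    half_vec = half_vec / np.linalg.norm(half_vec)
    reflected = schlick_fresnel(max(np.dot(normal, half_vec), 0.0))
    # Light that is not reflected enters the skin, scatters, and
    # picks up the skin's coloration before re-emerging.
    diffuse = (1.0 - reflected) * albedo * n_dot_l
    specular = reflected * n_dot_l
    return diffuse + specular
```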

After this comes the equally important facial motion stage: capturing how the actor’s face moves and changes expression. This is done using technology from the company Dimensional Imaging that is designed to capture faces in motion, tracking thousands of points on the face as it shifts from one expression to another. Using this data, combined with Digital Domain’s own in-house technology, it’s possible to create a model showcasing the unique way each actor’s skin moves over the underlying muscle structure of their face.
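One widely used way to turn thousands of tracked points into a controllable digital face is blendshape solving: the face is modeled as a neutral mesh plus a weighted sum of expression offsets, and each captured frame is matched by solving a least-squares problem for the weights. The sketch below illustrates that generic technique under an assumed data layout; the article doesn’t specify the solver Digital Domain or Dimensional Imaging actually use.

```python
import numpy as np

def solve_expression_weights(neutral, deltas, tracked):
    """Find blendshape weights that best explain one captured frame.

    neutral: (P, 3) positions of P tracked points in a neutral pose
    deltas:  (K, P, 3) per-point displacement for each of K expressions
    tracked: (P, 3) point positions observed by the capture rig
    """
    K = deltas.shape[0]
    A = deltas.reshape(K, -1).T           # (3P, K) expression basis
    b = (tracked - neutral).ravel()       # (3P,) observed offsets
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Clamp to a plausible activation range for expressions.
    return np.clip(weights, 0.0, 1.0)

def reconstruct_pose(neutral, deltas, weights):
    """Rebuild the full point set from the solved weights."""
    return neutral + np.tensordot(weights, deltas, axes=1)
```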

Finally, these two digital elements are composited onto another actor or stand-in performer, who “wears” the face of the digital thespian to act out the scenes. Just like stand-ins for nude scenes, this means matching the body type of the target actor with a performer who broadly resembles them. The head is then mapped onto their body by way of machine learning technology.
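Once the rendered head and the filmed body plate are aligned and color-matched (the part where the tracking and machine learning do the heavy lifting), the final image combination comes down to the classic compositing “over” operation. A minimal sketch, assuming both images share the same resolution and a soft matte has already been pulled:

```python
import numpy as np

def composite_head(body_plate, rendered_head, head_matte):
    """Place a CG head over a body double's plate with an 'over' blend.

    body_plate:    (H, W, 3) filmed footage of the stand-in performer
    rendered_head: (H, W, 3) digital head rendered to match the plate's
                   camera, lighting, and head position
    head_matte:    (H, W, 1) alpha in [0, 1]; 1 replaces the plate
                   entirely, 0 keeps it untouched
    """
    return head_matte * rendered_head + (1.0 - head_matte) * body_plate
```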

As mentioned, it is, of course, possible to recreate actors who were never scanned in their lifetime — although this is tougher and, perhaps, bound to remain less convincing. “In all cases, the creation of a deceased actor without scans or data is much harder than if there were material for the actor,” Hendler said. “Generally, when creating a deceased actor, we will find the closest lookalike and scan them as a base. We then modify their appearance to match the actor we are creating, which is a slow and very complex procedure. In most cases, we always have some real person as a base and are not creating something from thin air.”
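One simple way to picture “modifying a lookalike’s scan to match”: warp the scanned mesh so that its facial landmarks slide to the target actor’s landmark positions, dragging nearby vertices along smoothly. The distance-weighted warp below is a generic, approximate sketch of that idea; every name and parameter in it is our own, and the real procedure is, as Hendler says, far slower and more complex.

```python
import numpy as np

def landmark_warp(scan_verts, scan_landmarks, target_landmarks, sigma=0.05):
    """Approximately deform a lookalike scan toward a target's landmarks.

    scan_verts:       (V, 3) vertices of the scanned lookalike mesh
    scan_landmarks:   (L, 3) landmark positions on the scan
    target_landmarks: (L, 3) corresponding positions for the target actor
    sigma:            falloff radius (in mesh units) of each landmark's pull
    """
    offsets = target_landmarks - scan_landmarks               # (L, 3)
    # Distance from every vertex to every landmark.
    dist = np.linalg.norm(
        scan_verts[:, None, :] - scan_landmarks[None, :, :], axis=-1
    )                                                         # (V, L)
    weights = np.exp(-((dist / sigma) ** 2))                  # soft influence
    weights /= weights.sum(axis=1, keepdims=True) + 1e-9      # normalize
    return scan_verts + weights @ offsets
```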

The future is bright, albeit unsettling

So what is the future for this brave new technology? Will tomorrow’s movies feature an all-star lineup of greats, algorithmically calculated to bring in the broadest possible age demographic of viewers? It’s certainly possible, although a large part of this will rely on the public’s response. After all, Peter Cushing’s appearance in Rogue One was not met with unanimous praise from fans. Is this because the effect wasn’t convincing, or because people don’t like the idea of jolting a late thespian back to quasi-life to perform on screen one more time? We’ll have to wait and see — while also watching the parallel rise of hologram tours at music venues featuring the likes of the late Roy Orbison and Amy Winehouse.

Either way, it seems this is a technology that both studios and individual actors will want to pursue. After all, imagine the endless source of revenue if, for instance, The Rock were to digitally scan himself so as to continue laying the on-screen smackdown long past the point he can convincingly climb a flight of stairs. These licensing deals could continue far beyond the lifespan of a regular celebrity career. (Although we wonder whether the animators or the actors will receive the first “Best Actor” or “Best Actress” Oscar when this milestone inevitably happens!)

For Digital Domain, things are looking good. “We see this as a huge market,” Hendler said. “The costs are pretty high at the moment to create a digital human that looks indistinguishable from the real person, but those are going down quickly. There is also some hesitation about how audiences will respond. Sometimes the response has been very accepting; at other times, there has been a bit of backlash. As people become more open to seeing deceased celebrities retaking the screen and costs come down, I am sure you will see this technology all over the place.”

Add in the amazing voice synthesis technology that allows computer scientists to accurately recreate any person’s voice, using a tiny amount of training data, and it appears that the future is bright — if a little bit Black Mirror.

Luke Dormehl
Former Digital Trends Contributor