The newest phones use a time-of-flight camera, but what is it? We ask an expert

LG G8 ThinQ
Julian Chokkattu/Digital Trends

The race to improve and differentiate smartphones has been focusing more and more on the camera suite and all that it offers. There’s a good chance that you’ve heard the recent buzz surrounding time-of-flight cameras, as manufacturers are starting to put them into phones. But what exactly is it?

Time of Flight

Time-of-flight (ToF) cameras consist of a sensor paired with a tiny laser that fires out infrared light. This light bounces off anything or anyone in front of the camera and back into the sensor. The sensor measures how long the light takes to bounce back, and that time translates into distance information that can be used to create a depth map.
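The arithmetic behind that measurement is simple: multiply the measured time by the speed of light, then halve it, because the light travels to the object and back. A minimal sketch (the function name is just for illustration):

```python
# Speed of light in meters per second.
C = 299_792_458

def distance_from_round_trip(seconds):
    """Convert a measured round-trip time into a one-way distance in meters.

    The light travels out to the object and back, so the one-way
    distance is half the total path the light covered.
    """
    return C * seconds / 2

# A round-trip delay of about 6.67 nanoseconds corresponds to
# an object roughly 1 meter away.
d = distance_from_round_trip(6.67e-9)
```

The tiny numbers involved hint at why the sensor needs such a precise clock: resolving depth to a few millimeters means timing light to a few picoseconds.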

“Time-of-flight cameras are actually measuring the time it takes for light to go from the camera to the user or environment and be reflected back to the sensor,” Dr. Xavier Lafosse, commercial technology director of Corning Precision Glass Solutions, told Digital Trends.

Corning

Until now, most phones have relied on stereo vision, which uses two cameras to calculate rough depth. But this method doesn't work in low light or in the dark, and it's not very accurate.

A better method that also employs infrared is structured light: a dot pattern is projected onto a scene or face, and the sensor measures the distances between the dots and the distortion in the pattern to calculate depth. This technology works well at short range, up to arm's length, for tasks like facial recognition, which is why Apple employed it in the TrueDepth camera for Face ID.

Time of flight works in a similar way, but it doesn't use a pattern of dots. Because both methods rely on infrared light, they work well in low light and even in the dark. The time-of-flight camera illuminates the scene with a homogenous flood of light, and the camera looks at every individual pixel in the image. The sensor synchronizes with an incredibly sensitive clock capable of measuring tiny variations in the light's return time. With depth information assigned to every pixel, you get a rich depth map.
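Conceptually, the output is just a grid of per-pixel return times converted into a grid of distances. A toy sketch of that conversion (names and the sample timings are illustrative, not a real sensor API):

```python
# Speed of light in meters per second.
C = 299_792_458

def depth_map(round_trip_times):
    """Turn a grid of per-pixel round-trip times (seconds) into depths (meters).

    Each pixel's depth is half the distance light covered on its
    round trip, computed independently of every other pixel.
    """
    return [[C * t / 2 for t in row] for row in round_trip_times]

# A tiny 2x2 "frame": the top row returned in ~6.67 ns (about 1 m away),
# the bottom row in ~13.3 ns (about 2 m away).
times = [
    [6.67e-9, 6.67e-9],
    [13.3e-9, 13.3e-9],
]
depths = depth_map(times)
```

A real sensor produces this grid at full resolution and video frame rates, which is what makes the depth map "rich" compared to the sparse dots of structured light.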

“It’s the only method that’s really accurate at measuring distance,” Dr. Lafosse said. “It’s the only one that’s not an interpolation or a calculation, but rather a measure of distance.”

There are many potential applications for accurate depth mapping like this, which is why it's creeping into more phones. You'll find a time-of-flight camera in the LG G8 ThinQ and in the Honor View 20, to give two recent examples, but each implementation is different.

LG has paired the time-of-flight sensor with its 8-megapixel front-facing camera to create what it’s calling the Z Camera — the Z-axis denotes depth for 3D images. This enables facial unlock and something called Hand ID, another secure biometric, which reads the vein patterns in your hand. It’s also used for Air Motion gestures, allowing you to wave your hand over your G8 ThinQ to trigger various actions such as playing and pausing music without touching the device.


In the Honor View 20 the time-of-flight camera is paired with a 48-megapixel sensor as part of the main, rear-facing camera. It lends depth information that enhances portrait mode, creating a really accurate bokeh effect with the subject in sharp relief and the background blurred. But that’s not all it can do as part of a phone’s main camera.
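That portrait-mode enhancement boils down to using the depth map to separate subject from background. A hypothetical sketch, assuming a simple depth threshold (a real pipeline would refine the mask and then blur only the background pixels):

```python
def foreground_mask(depths, threshold_m):
    """Mark pixels nearer than threshold_m as subject (True), rest as background.

    `depths` is a grid of per-pixel distances in meters, like the
    depth map a time-of-flight sensor produces.
    """
    return [[d < threshold_m for d in row] for row in depths]

# A tiny 2x3 depth map: the left two columns are a person ~1 m away,
# the right column is a wall ~3 m behind them.
depths = [
    [0.9, 0.9, 3.2],
    [1.0, 1.1, 3.5],
]
mask = foreground_mask(depths, 2.0)
# mask → [[True, True, False], [True, True, False]]
```

Because the mask comes from measured distances rather than guessed edges, the subject's outline stays sharp even against a cluttered or low-contrast background.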

“The value of time of flight is really about the midrange to long range distances; think about applications like augmented reality,” Dr. Lafosse said. “If the time-of-flight camera is on the back, then you know it’s not about facial recognition, but about sensing the environment and what you have in front of you.”

You may have tried out some augmented reality apps and games in the past, but time-of-flight cameras can dramatically boost the accuracy and fuse your actual environment with gameplay and characters for a whole new level of experience. That may be shooting zombies in your hallway or seeing how a piece of furniture you’re thinking about buying would really look in your living room.

There’s potential for deeper social interactions as well. Instead of FaceTime with Animojis, you might have a more fully realized 3D experience.

“Time-of-flight sensors can map your environment accurately, so your friend’s avatar isn’t floating in the air, but actually sitting on your couch next to you,” Dr. Lafosse said.

The underlying technology isn’t new. There was a time-of-flight camera in Microsoft’s Kinect sensor and the military has been using time-of-flight technology to get depth information for many, many years. But Dr. Lafosse said that improvements in the technology have allowed integration of the required elements into ever smaller form factors, and new applications for it are driving its adoption in phones.

This technology is also vital for augmented reality or mixed reality wearables — like Microsoft’s HoloLens or Magic Leap — to work because these systems need a very accurate picture of your environment.

Another area where time-of-flight cameras could help is indoor navigation. If there’s a 3D map of your building in the cloud, then the sensor could potentially recognize precisely where you are at any given moment.

Why does Corning know all this? The company makes the glass that protects most smartphones, and it's also working on the optical components of time-of-flight cameras — making them smaller and more transparent, and ensuring they perform as well as possible — while manufacturers like Sony continue to improve the sensors, making them smaller and more power-efficient. We are sure to see time-of-flight cameras appearing in more and more smartphones in the near future, and there are strong rumors that Apple will include one in the next iPhone.

Augmented reality was exciting when it first emerged, but upon trying it out it was hard to escape the feeling that the technology failed to live up to the hype. Time-of-flight cameras, coupled with improved processing power and higher speed connectivity, could be about to bring that original vision to life.

Simon Hill
Former Digital Trends Contributor
Simon Hill is an experienced technology journalist and editor who loves all things tech. He is currently the Associate Mobile…