The race to improve and differentiate smartphones has focused more and more on the camera suite and everything it offers. There’s a good chance you’ve heard the recent buzz surrounding time-of-flight cameras, as manufacturers start to put them into phones. But what exactly is a time-of-flight camera?
Time of Flight
Time-of-flight (ToF) cameras pair a sensor with a tiny laser that fires out infrared light. This light bounces off anything or anyone in front of the camera and back into the sensor. The time the light takes to bounce back is measured and translated into distance information, which can be used to create a depth map.
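As a rough illustration of the principle (a minimal sketch in Python, not any manufacturer’s actual code), converting a measured round-trip time into a distance is a single line of arithmetic:

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def distance_from_round_trip(round_trip_seconds):
        # The light travels out and back, so only half the round trip counts toward distance.
        return SPEED_OF_LIGHT * round_trip_seconds / 2.0

    # A round trip of roughly 13.3 nanoseconds corresponds to an object about 2 meters away.
    print(distance_from_round_trip(13.3e-9))  # ~1.99

The numbers hint at why the timing has to be so precise: moving an object by a few centimeters changes the round-trip time by only a fraction of a nanosecond.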
“Time-of-flight cameras are actually measuring the time it takes for light to go from the camera to the user or environment and be reflected back to the sensor,” Dr. Xavier Lafosse, commercial technology director of Corning Precision Glass Solutions, told Digital Trends.
Until now, most phones have relied on stereo vision, which uses two cameras to calculate rough depth, but this method doesn’t work in low light or in the dark, and it’s not very accurate.
A better method that also employs infrared is structured light, in which a dot pattern is projected onto a scene or face and the sensor measures the spacing of the dots and the distortion in the pattern to calculate depth. This technology works well at short range, up to about arm’s length, for things like facial recognition, which is why Apple employs it in the TrueDepth camera for Face ID.
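For comparison, structured light (like stereo vision before it) recovers depth by triangulation rather than timing: the farther a projected dot appears shifted on the sensor, the closer the surface it landed on. Here is a minimal sketch of that relationship, using made-up focal-length and baseline values rather than figures from any real phone:

    def depth_from_dot_shift(shift_pixels, focal_length_pixels=1400.0, baseline_meters=0.04):
        # Hypothetical numbers: the lens focal length expressed in pixels, and the gap
        # between the dot projector and the camera. Depth shrinks as the shift grows.
        return focal_length_pixels * baseline_meters / shift_pixels

    print(depth_from_dot_shift(100.0))  # ~0.56 meters with these example values

Because the measurable shift falls off quickly with distance, this approach is most reliable at close range, which is why it suits face unlock better than room-scale sensing.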
Time of flight works in a similar way, but it doesn’t use a pattern of dots. Because these methods rely on infrared light, they work well in low light and even in the dark. The time-of-flight camera illuminates the scene with a homogeneous flood of light, and the sensor looks at every individual pixel in the image. It’s synchronized with an incredibly sensitive clock that can measure the tiny differences in how long the light takes to bounce back to each pixel. With depth information assigned to every pixel, you get a rich depth map.
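Conceptually, the same distance calculation from earlier runs for every pixel at once. A minimal sketch, assuming you already have a grid of per-pixel round-trip times (real sensors handle this timing in hardware):

    import numpy as np

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def depth_map_from_times(round_trip_times):
        # round_trip_times: 2D array of per-pixel round-trip times, in seconds.
        return SPEED_OF_LIGHT * round_trip_times / 2.0

    # Hypothetical 240x180 sensor with round trips between ~3 ns and ~33 ns,
    # i.e. surfaces roughly 0.45 m to 5 m away.
    times = np.random.uniform(3e-9, 33e-9, size=(180, 240))
    depth_map = depth_map_from_times(times)
    print(depth_map.shape, float(depth_map.min()), float(depth_map.max()))

Each entry in depth_map is the estimated distance, in meters, to whatever that pixel sees.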
“It’s the only method that’s really accurate at measuring distance,” Dr. Lafosse said. “It’s the only one that’s not an interpolation or a calculation, but rather a measure of distance.”
There are various potential applications for accurate depth mapping like this, which is why it’s creeping into more phones. You’ll find a time-of-flight camera in the LG G8 ThinQ and in the Honor View 20, to give two recent examples, but the implementations differ.
LG has paired the time-of-flight sensor with its 8-megapixel front-facing camera to create what it’s calling the Z Camera — the Z-axis denotes depth for 3D images. This enables facial unlock and something called Hand ID, another secure biometric, which reads the vein patterns in your hand. It’s also used for Air Motion gestures, allowing you to wave your hand over your G8 ThinQ to trigger various actions such as playing and pausing music without touching the device.
In the Honor View 20, the time-of-flight camera is paired with a 48-megapixel sensor as part of the main, rear-facing camera. It lends depth information that enhances portrait mode, creating a really accurate bokeh effect with the subject in sharp relief and the background blurred. But that’s not all it can do as part of a phone’s main camera.
“The value of time of flight is really about the midrange to long range distances; think about applications like augmented reality,” Dr. Lafosse said. “If the time-of-flight camera is on the back, then you know it’s not about facial recognition, but about sensing the environment and what you have in front of you.”
You may have tried some augmented reality apps and games in the past, but time-of-flight cameras can dramatically boost their accuracy, fusing your actual environment with gameplay and characters for a whole new level of experience. That may mean shooting zombies in your hallway, or seeing how a piece of furniture you’re thinking about buying would really look in your living room.
There’s potential for deeper social interactions as well. Instead of FaceTime with Animojis, you might have a more fully realized 3D experience.
“Time-of-flight sensors can map your environment accurately, so your friend’s avatar isn’t floating in the air, but actually sitting on your couch next to you,” Dr. Lafosse said.
The underlying technology isn’t new. There was a time-of-flight camera in Microsoft’s Kinect sensor and the military has been using time-of-flight technology to get depth information for many, many years. But Dr. Lafosse said that improvements in the technology have allowed integration of the required elements into ever smaller form factors, and new applications for it are driving its adoption in phones.
This technology is also vital for augmented reality and mixed reality wearables, like Microsoft’s HoloLens or Magic Leap’s headset, because these systems need a very accurate picture of your environment to work.
Another area where time-of-flight cameras could help is indoor navigation. If there’s a 3D map of your building in the cloud, then the sensor could potentially recognize precisely where you are at any given moment.
Why does Corning know all this? The company makes the glass that protects most smartphones, and it’s also working on the optical components of time-of-flight cameras, making them smaller and more transparent while ensuring they perform as well as possible. Manufacturers like Sony, meanwhile, continue to improve the sensors, making them smaller and more power efficient. We are sure to see time-of-flight cameras appearing in more and more smartphones in the near future, and there are strong rumors that Apple will include one in the next iPhone.
Augmented reality was exciting when it first emerged, but once you tried it, it was hard to escape the feeling that the technology failed to live up to the hype. Time-of-flight cameras, coupled with improved processing power and higher-speed connectivity, could be about to bring that original vision to life.