As if you didn’t have enough phone specs to consider when choosing a new phone, here comes another one: Smartphone cameras are now being augmented with what’s called a time-of-flight (TOF) sensor, which sits alongside the standard wide-angle and telephoto lenses you might be familiar with. Even Apple might join in next year. If you want to know whether this is an essential feature or an unnecessary luxury for your next handset, read on.
While a TOF sensor might get lumped in with the other camera lenses in a phone spec listing (“triple-lens” or “quad-lens” for example), it’s not the same as the other lenses. Its primary job is to capture depth, not light.
These sensors ping out beams of light like radar, measuring the time it takes for the light to return to judge how far away objects are, and to map out a scene in three dimensions. In terms of functionality, it’s like the TrueDepth camera on the front of the latest iPhones, only capable of mapping more points over a bigger area.
The resulting 3D map is known as a range image, and thanks to the speed of light—299,792,458 meters (983,571,056 feet) per second, at the last count—the capture still happens in an instant, with typical maximum frame rates of 60 frames per second. Everything happens in the infrared spectrum, which is why you won’t see a sudden burst of light when these sensors are engaged.
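The timing math behind this is straightforward: the light travels out to an object and back, so the distance is half the round trip at the speed of light. Here’s a minimal sketch of that calculation (the timing value in the example is illustrative, not from any real sensor):

```python
# Convert a TOF sensor's measured round-trip time into a distance.
# The pulse travels to the object and back, so we halve the total path.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(seconds):
    """Distance in meters, given the round-trip travel time of a light pulse."""
    return SPEED_OF_LIGHT * seconds / 2

# A pulse that returns after about 6.67 nanoseconds indicates an object
# roughly 1 meter away -- which hints at why the sensor needs such
# precise timing hardware.
print(distance_from_round_trip(6.67e-9))
```

Running that conversion for every point the sensor pings is what builds up the range image.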
Part of the reason TOF sensors are making their way into phones is that they achieve good range, high accuracy, and fast readings from a component that doesn’t have to be very big at all, and that doesn’t draw much power—the perfect mix if you want to build something into a phone.
To top it off, the technology is becoming affordable enough to build into phones without a huge markup on the final cost of the handset, which is why it’s been appearing in a lot of phones across a range of prices in the last year or so.
In terms of the basics, time-of-flight isn’t all that different from the Lidar technology you’ll see built into some self-driving cars, though in the case of autonomous vehicles, lasers are used to scan the nearby environment rather than infrared light.
You’ll find TOF sensors used in robotics, topography studies, and drones, and there was one built into Microsoft’s second-gen Kinect sensor for the Xbox as well, for the same purpose—to measure the position of objects in a scene, and to work out where they are in 3D space in real time.
Knowing how far away objects and people are can improve photo quality in various ways—everything in a scene can be brought into focus more accurately and more quickly, for a start. On top of that, effects like portrait or bokeh blur can be applied with greater effectiveness and flexibility.
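To illustrate how depth data makes effects like bokeh easier, a camera pipeline can simply compare each pixel’s depth reading against the subject’s distance to decide what gets blurred. This is a toy sketch—the depth values and threshold are made up, not taken from any real phone:

```python
# Toy depth map in meters: the subject sits around 1 m away,
# the background around 3 m.
depth_map = [
    [3.1, 3.0, 1.1],
    [2.9, 1.0, 1.0],
    [3.2, 1.1, 0.9],
]

def background_mask(depth, subject_distance, tolerance=0.5):
    """Mark pixels noticeably farther than the subject as background.

    True means "blur this pixel"; False means it belongs to the subject
    and stays sharp.
    """
    return [
        [cell > subject_distance + tolerance for cell in row]
        for row in depth
    ]

mask = background_mask(depth_map, subject_distance=1.0)
# The left column of the scene (all ~3 m away) would receive the
# bokeh blur, while the subject's pixels stay untouched.
```

Without a depth map, the camera has to guess this subject/background split from the image alone, which is why software-only portrait modes sometimes blur stray hairs or the edges of glasses.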
Low light is particularly troublesome for smartphone cameras, and TOF can help here too, because it can measure the position and depth of objects even if they aren’t properly lit up. That can provide some useful extra information to a camera as it’s trying to process a nighttime shot.
Don’t forget augmented reality either (something Apple is particularly fond of). Knowing precisely how far away a coffee table or a plant is means that AR apps are able to create scenes that are more immersive and realistic because all the computer-generated parts of the picture are properly mapped in 3D space.
The TOF sensor isn’t working in isolation, either—onboard image processing techniques can be used to combine the information coming in from this sensor with the images captured by the other lenses that make up the camera. The end result is more detail and sharpness even if you’re not doing any fancy focus or AR effects.
Support for gestures is something else that a TOF sensor can add to a phone, a little like the Soli chip Google is adding to the Pixel 4 (though the underlying technology differs).
If you’ve decided that you absolutely must get yourself a phone with a TOF sensor on the back, you’ve got a few to pick from. The 5G version of the Samsung Galaxy S10 comes with a TOF sensor, as does the LG G8 ThinQ that was unveiled at Mobile World Congress earlier this year (where it’s used to help power the palm-reading technology that lets you unlock the handset, as well as improving camera shots).
When the Huawei Mate X eventually goes on sale, it’s going to come with a time-of-flight sensor on board too, and there’s also one in the camera array of the Huawei P30 Pro: it means “your portrait [shots] will be highlighted and the sharpness will be exuded to perfection” in the words of Huawei.
Whether TOF technology will make it into the next generation of Pixels and iPhones remains to be seen, but it’s worth looking at what manufacturers are actually doing with the extra data in the camera app—like every other spec on your smartphone, the implementation matters as much as the actual inclusion of the component.