Time-Of-Flight Sensors Will Level Up Your Smartphone Camera. What Are They?

Dhir Acharya


In addition to an already plentiful range of features, smartphone cameras will now come with a TOF (time-of-flight) sensor. The technology may also appear in Apple's iPhones next year.

What is it?

Although this sensor adds another bump to your phone next to the existing cameras, it does something different. Essentially, it captures depth, and only depth.

A TOF sensor emits pulses of light, somewhat like radar, and measures how long the light takes to bounce back, which lets it judge its distance from objects and map a scene in 3D. Functionally, it works much like the TrueDepth camera on the front of the iPhone, but it can map more points and cover a larger area.

The 3D map is called a range image. Because light travels at 299,792,458 meters per second and the sensor can run at frame rates like 60fps, capture feels instantaneous. And since the entire process takes place in the infrared spectrum, you won't see any burst of light when the sensor fires.
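The distance math behind each pixel of a range image is simple: light travels to the object and back, so the one-way distance is half the round trip. A minimal sketch (illustrative only; real sensors typically use phase-shift or gated measurements rather than directly timing each pulse):

```python
# Sketch of the per-pixel distance calculation a TOF sensor performs.
# Names and the example value are illustrative assumptions.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to an object given the light pulse's round-trip time.

    The pulse travels out and back, so the one-way distance
    is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A round trip of 10 nanoseconds corresponds to roughly 1.5 meters.
distance_m = tof_distance(10e-9)
```

The tiny times involved are why TOF hardware is the hard part: resolving centimeters requires timing light to within tens of picoseconds.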

A time-of-flight sensor

TOF technology is coming to smartphones because these sensors deliver good range, high accuracy, and fast readings from a component that doesn't need to be big or power-hungry, making it a perfect fit for phones.

The technology has also become cheap enough to feature in a smartphone without driving up its cost, and it has indeed appeared in phones across several price ranges over the past year.

At its core, TOF is not much different from the lidar technology autonomous cars use to scan their surroundings.

You will also find time-of-flight sensors in robotics, drones, and topographic surveying, and Microsoft used the technology in the second-generation Kinect sensor for the Xbox, where it measured objects' positions in real-time 3D space.

Why is it important?

Photography

Knowing how far people and objects are from the phone can improve photos in several ways. First, the camera can focus more quickly and accurately. Depth data also makes effects like bokeh and portrait mode more flexible and convincing.
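The portrait effect boils down to separating subject from background by depth. A toy sketch of the idea, with all names and threshold values being illustrative assumptions rather than any phone maker's actual pipeline:

```python
import numpy as np

def background_mask(depth_map: np.ndarray, threshold_m: float) -> np.ndarray:
    """Mark pixels farther than threshold_m as background.

    A camera pipeline could then blur only the masked pixels
    to simulate bokeh while keeping the subject sharp.
    """
    return depth_map > threshold_m

# A tiny 2x2 "depth map": subject at ~1 m, background at ~4 m.
depth = np.array([[1.0, 4.0],
                  [1.2, 3.8]])
mask = background_mask(depth, threshold_m=2.0)
# The two far pixels are flagged as background.
```

Without a depth sensor, phones have to estimate this mask from stereo parallax or machine learning, which is where fuzzy portrait-mode edges come from.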

TOF can also help with night shots, since it can measure the depth and position of objects even when they are poorly lit, giving the camera useful extra information while it processes a nighttime photo.

Augmented reality

Time-of-flight sensors can also power AR

By knowing the precise distance to a plant or a coffee table, AR apps can generate more realistic and immersive scenes, since every part of the picture is mapped properly in three-dimensional space.

And TOF sensors don't work in isolation. Current image-processing techniques can combine their depth data with images from the other lenses to produce sharper, more detailed photos, even without AR effects or fancy focus tricks.

Another application of TOF sensors is gesture control, which works a bit like Google's radar-based Soli chip in the Pixel 4.

When can you get a phone with a TOF sensor?

Besides the Galaxy S10 5G, the LG G8 ThinQ also features a TOF sensor

There aren't many phones with a TOF sensor to choose from yet. Current options include the Samsung Galaxy S10 5G and the LG G8 ThinQ.

When the Huawei Mate X goes on sale it will include a TOF sensor too, while the P30 Pro already houses one in its camera array; according to Huawei, this improves portrait highlighting and sharpness.

It’s unknown if TOF technology will come to the next generation of iPhones and Pixel phones, but it’s exciting to see what phone makers can do with more data in the camera app.
