What Is A ToF Camera and What Does It Do?

There is no doubt that smartphone cameras have evolved tremendously since the 2010s. In an effort to take their capabilities to the next level and improve their quality, smartphone manufacturers have started adding what is known as a ToF camera to their phones.

But, what is it exactly?

Also known as a depth sensor, a ToF (time-of-flight) camera is a type of smartphone sensor that analyzes a scene and can determine the distance and depth of the objects in that environment. The phone can then take that data and use it to create a 3D map of the scene.

The 3D information derived from the ToF camera can be used to create photographic effects such as the bokeh effect (blurred background) and to create augmented reality effects in AR applications.

How does it all work? Let’s look at the ToF camera in a little more detail.

What is a ToF camera?

A ToF camera goes by many names. It is also referred to as a ToF sensor, 3D sensor, depth camera, or even ToF 3D camera, depending on how each phone manufacturer chooses to market the technology on its devices.

A ToF camera is not a camera that can take pictures as other cameras on the phone do. Rather, it’s a tool that’s used to measure depth in a scene. ToF stands for time-of-flight, which is a reference to how the technology works.

The camera has a sensor that uses light to measure how far objects are from the camera and uses that information to map out the world in 3D.

In most smartphones, the ToF sensor is a low-resolution camera. For example, the resolution of the ToF camera found on the Samsung Galaxy S10 5G is only 0.3MP. Some ToF cameras go up to as much as 2MP.

Although this may sound very low compared to the high megapixels on some mobile cameras, it’s acceptable for this type of sensor since it doesn’t capture photographs as other cameras do. It doesn’t require a high resolution to create a depth map.

How does a ToF camera work?

Time-of-flight is a measurement of how long it takes a particle, wave, or object to travel a certain distance. In the case of a ToF camera, it uses light to measure distance. It is similar to how a bat uses the sound it makes to locate objects around it.

A smartphone time-of-flight sensor uses a form of LIDAR (Light Detection And Ranging) known as scannerless LIDAR. It’s a remote sensing method that fires an infrared light pulse to measure distance.

Each photosite on the ToF camera’s sensor is responsible for measuring the time it takes the infrared light fired from the camera to reflect off the objects in front of it and return to the camera.

[Illustration: how time-of-flight works]

This all happens very quickly, since light travels at approximately 300,000 km/s. A ToF camera calculates depth using the following formula:

Distance = (speed of light x time-of-flight) / 2
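In code, that formula is a one-liner. Here is a minimal Python sketch; the 10-nanosecond round-trip time is just an illustrative value, not a real sensor reading:

```python
# Speed of light in a vacuum, in metres per second.
SPEED_OF_LIGHT = 299_792_458

def tof_distance(round_trip_seconds):
    """Distance to an object, given the round-trip time of a light pulse.

    The pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after 10 nanoseconds bounced off something
# roughly 1.5 metres away.
print(tof_distance(10e-9))  # ≈ 1.499 metres
```

The division by two is the key detail: the sensor measures the round trip, but we only want the one-way distance.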

Once the infrared light is fired, it bounces off whatever is in front of it, and then returns to the sensor.

Infrared beams that bounce back from nearby objects will return to the sensor quicker than those that bounce back from distant objects. Therefore, the time-of-flight to nearby objects will be shorter than the time-of-flight to objects further away.

[Image: a ToF 3D depth map generated from a normal image. Source: ubergizmo.com]

And because the speed of light is constant, the ToF camera uses this information to calculate the distance and map out a 3D depth map.
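Applied across every photosite on the sensor, that same calculation produces a depth map. Here is a toy sketch in Python with NumPy, using made-up round-trip times for a hypothetical 4x4 sensor:

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458  # metres per second

# Hypothetical round-trip times (in seconds) recorded by a tiny
# 4x4 sensor. Shorter times correspond to closer objects.
round_trip = np.array([
    [20e-9, 20e-9, 13e-9, 13e-9],
    [20e-9, 13e-9, 13e-9, 13e-9],
    [20e-9, 13e-9, 13e-9, 20e-9],
    [20e-9, 20e-9, 20e-9, 20e-9],
])

# The same distance formula, applied per pixel, gives depth in metres.
depth_map = SPEED_OF_LIGHT * round_trip / 2
print(depth_map.round(2))  # near pixels ≈ 1.95 m, distant ones ≈ 3.0 m
```

The result is exactly the kind of per-pixel depth map shown above, just at a far lower resolution than a real 0.3MP sensor.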

Imagine a bird that flew at the same speed every time you released it from its cage. If it flew to a location and immediately back, you could tell how far that location was by timing its flight. That’s basically how a ToF camera uses the speed of light to measure distance.

What's the use of a ToF camera on a phone?

There are a couple of reasons why you’d find a time-of-flight camera on a smartphone, depending on where it is placed.

Because ToF cameras are so good at 3D mapping a scene, they work well for facial recognition. So, if the ToF camera is on the front of the phone, it’s most likely there for that purpose. It also works well for augmented reality selfie filters.

A ToF camera at the rear of the phone can serve a number of purposes, such as AR applications or photography and videography. Since the introduction of ToF cameras, smartphones have been able to produce stunning images with better-looking shallow depth of field.

For example, when capturing a portrait of someone with your phone, data from the ToF camera can be used to separate your subject from the background. What you end up with is a photo that has your subject fully in focus but the background blurred out to emphasize your subject more.
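As a rough sketch of the idea, depth data makes subject/background separation a simple thresholding problem. This toy Python example uses a one-dimensional "image" and made-up depth values; a real pipeline works on 2-D RGB frames and applies a proper blur filter rather than flattening pixels:

```python
import numpy as np

# Toy one-dimensional "image" and its matching depth map in metres.
image = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
depth = np.array([1.2, 1.3, 1.1, 4.0, 5.0, 6.0])

# Pixels farther than the chosen threshold count as background.
background = depth > 2.0

# Crude stand-in for a blur: flatten background pixels to their average.
# A real portrait mode applies a smooth Gaussian-style blur instead.
portrait = image.copy()
portrait[background] = image[background].mean()

print(portrait)  # subject pixels untouched, background flattened
```

The depth threshold is the part the phone gets for free from the ToF camera; without it, the mask would have to be guessed from the image alone.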

In the case of videos, you can blur out the background in real-time. You can even switch the focus between the background and the foreground while you shoot your video. Without the depth data from the ToF camera, that would be harder to achieve.

What else can a ToF camera do?

Besides its role in computational smartphone photography, a ToF camera can do a variety of other things. For one, it can be used for gesture control. Poor attempts at this have been made before, but given how quickly and accurately a ToF camera can track movement, gesture control is only bound to get better.

As mentioned before, a ToF camera is also ideal for augmented reality applications. Apps that use real-time 3D filters such as Snapchat, and AR-based games such as Pokemon Go, can only get better with the implementation of a ToF camera. Virtual reality worlds can also be created much quicker with the help of ToF technology.

Although ToF cameras are limited in function on smartphones, their real-world applications are almost limitless. As they become cheaper and gain popularity, ToF cameras could soon be everywhere.

Advantages of a ToF camera

Compared to other distancing and depth-sensing methods on smartphones, time-of-flight is pretty accurate and fast. It can map out a 3D image of a scene very quickly and in one shot. Also, things like humidity, temperature and air pressure do not interfere with a ToF sensor, so it can be used indoors and outdoors.

Even though they fire lasers at objects in front of them, ToF cameras are not harmful to your eyes. They use low-power infrared light driven by modulated pulses. But, low-powered as the laser may be, it covers a decent distance.

A ToF camera is also cheaper than other types of 3D mapping technology, making it perfect for smartphone makers who don’t want to hike up the cost of their devices due to expensive camera imaging technology.

ToF camera limitations

As great as a time-of-flight camera is, it does have its limitations. Too much ambient light reaching the sensor, such as direct sunlight, can make it struggle to detect the light reflected off the objects in front of it.

Not only that, but reflective surfaces and shiny objects can confuse the sensor, because the light gets scattered and creates many unwanted reflections instead of the single reflection required to measure distance.

Despite these limitations, ToF cameras are a quick and reliable way to gather 3D information.

ToF vs Stereo vision

Before the introduction of time-of-flight cameras, smartphones used dual camera stereo vision to see depth. One camera would be a wide-angle camera, and the other a telephoto camera or ultrawide-angle camera.

The advantage of dual-camera depth sensing is that you get at least two functional RGB cameras that can take photos at different focal lengths. The downside is that it’s a rather power-hungry system that’s not very accurate because of its inherent limitations and imperfections.

An example of the struggles of using stereo vision to create depth can be seen when taking portrait photos. When the blurry background effect is created using stereo vision, it sometimes results in some areas being blurred and others not, or the blur spilling over into areas that should not be blurred.

A ToF camera, on the other hand, is quite precise because it doesn’t rely on data from two cameras with different focal lengths. Instead, it relies on a single, constant reference: the speed of light.

As a result, the blurry background effect in portrait photos can be more accurate and less likely to spill over into areas that shouldn’t be blurred. Not only that, but objects further in the background can be blurred more than those closer to the foreground, adding realism to the effect.

ToF vs LIDAR

ToF and LIDAR are essentially the same technology. Both use infrared light to calculate the depth of a scene. However, a ToF camera sends out a single laser pulse to get a depth reading of an environment, whereas a LIDAR scanner sends out multiple pulses to build a more accurate reading of the scene.

LIDAR is used in driverless cars, where an accurate reading of the environment is of utmost importance. However, this technology has also crossed over to smartphones. Some manufacturers have opted to use LIDAR scanners in their devices for better photo and video performance, as well as impressive augmented reality experiences.


Time-of-flight cameras may be a new addition to smartphones but they show promise of taking mobile photography even further. They’re adding to the computational photography capabilities of smartphone cameras, which only seem to get better and better every day.

But even though having a ToF camera on your phone can add something extra to the look of your images, the onus is on you to learn how to get the most out of your smartphone camera so that you know how to take good photos with your phone. Who knows, you could even end up making some money selling your mobile photos online.
