The depth sensor is responsible for determining the distance between the smartphone's cameras and the person or object being photographed. The technology enables precise capture of images with a blurred background (bokeh effect).

How does the mobile depth sensor work?
The depth sensor helps the phone’s main camera define the approximate distance to the subject (person, animal or object in the frame). The technology follows the principle of human binocular vision, which relies on information captured by both eyes to generate depth perception.
The main camera can be combined with the depth sensor to create a shallow depth of field. Thus, the cell phone manages to leave the subject in the foreground completely clear while the background of the scene is blurred.
This technique is mainly used when creating photos in portrait mode (bokeh effect), which highlights the subject by keeping everything within its outline sharp while the background remains out of focus.
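To make the binocular principle described above concrete, here is a minimal sketch of stereo depth estimation: two cameras separated by a known baseline see the subject shifted by a disparity, and that shift shrinks as the subject gets farther away. The focal length, baseline and disparity values below are illustrative only, not taken from any specific phone.

```python
# Stereo depth estimation, the same idea as binocular vision:
# depth (m) = focal length (px) * baseline (m) / disparity (px)
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 1,500 px focal length, 12 mm between cameras, 9 px of disparity.
print(f"{depth_from_disparity(1500, 0.012, 9):.2f} m")  # ~2.00 m to the subject
```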

Different from the image sensor
While the depth sensor measures the subject’s distance from the camera, the image sensor is the component that converts the light captured by the lens into a digital image.
How to use the depth camera?
Cell phones with a depth sensor can take pictures with a blurred background by activating portrait mode in the camera app. The feature works with both the front and rear cameras. Some devices let you adjust the blur level even after the photo is taken.
Is the depth sensor still necessary?
Depth sensors are used less and less because the background blur effect can be achieved via software with satisfactory results.
Motorola, Samsung and Xiaomi are among the brands that have launched several smartphones with a depth sensor, mainly in the mid-range category. However, all of them have been phasing the sensor out of newer models.
DIGITALTREND tested the Galaxy A54, which has three rear cameras, but none for depth. Background blurring is done via software.
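To illustrate how background blur can be produced entirely in software, here is a minimal sketch that blurs a photo and keeps the subject sharp using a segmentation mask. The file names are placeholders, and the mask is assumed to come from a segmentation model or depth map; this is not the method any particular manufacturer uses.

```python
import cv2
import numpy as np

# Load the photo and a subject mask (255 = subject, 0 = background).
# "photo.jpg" and "subject_mask.png" are hypothetical example files.
image = cv2.imread("photo.jpg").astype(np.float32)
mask = cv2.imread("subject_mask.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# Blur the whole frame, then blend: the subject stays sharp, the background stays blurred.
blurred = cv2.GaussianBlur(image, (51, 51), 0)
mask3 = cv2.merge([mask, mask, mask])  # match the 3 color channels
bokeh = image * mask3 + blurred * (1.0 - mask3)

cv2.imwrite("portrait_mode.jpg", bokeh.astype(np.uint8))
```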

In some cell phones, the role of depth sensor can be performed by another camera, such as the one with an ultrawide lens, or by other sensors.
One example comes from the iPhone's TrueDepth technology, which uses front-facing sensors to assist the front camera with facial recognition and selfies. Another is the Motorola Edge 40, launched with a dual camera whose secondary module handles macro, ultrawide and depth duties.
Despite falling out of favor, the depth sensor can still be useful in entry-level cell phones, whose hardware cannot generate a convincing background blur effect on its own.
On these devices, a simple 2-megapixel depth sensor may be sufficient for portrait mode, although it’s wise not to expect amazing results.

Some high-end cell phones use a Time of Flight (ToF) sensor to measure the distance to the subject. To do so, the technology emits an infrared beam from a laser or LED and calculates the time this light takes to be reflected back to the camera.
This approach allows the ToF sensor to take accurate measurements even in low-light environments. The technology also makes measurements over long distances, can be used in 3D mapping and does not suffer interference from factors such as humidity and temperature.
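The arithmetic behind time-of-flight is straightforward: light travels to the subject and back, so the distance is the speed of light multiplied by the round-trip time, divided by two. A minimal sketch, with a made-up round-trip time for illustration:

```python
# Time-of-flight distance: one-way distance = (speed of light * round-trip time) / 2
SPEED_OF_LIGHT = 299_792_458  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# Example: an infrared pulse that returns after about 13.3 nanoseconds
print(f"{tof_distance(13.3e-9):.2f} m")  # ~1.99 m to the subject
```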
There is a type of ToF sensor called LiDAR that goes a step further by scanning the subject with several laser beams at once. In addition to being more accurate, the LiDAR sensor allows high-resolution mapping, but it is more expensive than conventional ToF sensors.

LiDAR systems are present among the various cameras of cell phones such as the iPhone 14 Pro, where the sensor contributes to autofocus and augmented reality tasks.