Telephoto, periscope, and ultrawide are some of the lens types used in cell phone cameras, while ToF and LiDAR are sensors used to map depth. Below, learn when to use each lens and how the type of sensor influences the result of photos, biometrics, and other smartphone features.
Lenses vs. sensors
Smartphone cameras are popularly classified according to the type of lens or sensor. In general, we can find the following specifications:
- Wide-angle lens: a lens with a wide field of view and a focal length generally between 18 mm and 35 mm, ideal for landscapes, architecture, and other everyday photography. It is usually the cell phone's “main lens” and is paired with the device's highest-quality image sensor;
- Ultrawide lens: has an even wider field of view than the wide angle, capturing even more elements in the photo. It is ideal for panoramic images or shooting in tight spaces where it is difficult to step back from the scene. Its focal length is generally less than 18 mm, and it may show distortion at the edges, typical of fisheye lenses;
- Macro lens: designed to take extremely detailed pictures of very close objects, such as insects, flowers, and water droplets;
- Telephoto lens: captures distant objects, offering optical zoom without significant loss of quality;
- Periscope lens: uses a system of mirrors parallel to the body of the device to reflect light onto the sensor, allowing greater optical zoom than simple telephoto lenses without increasing the thickness of the cell phone;
- Monochrome sensor: records images only in black and white, but tends to capture more light than color sensors due to the absence of a color filter, which improves contrast and sharpness, especially in low-light conditions;
- Depth sensor: calculates the distance between the camera and different objects in the scene, improving the accuracy of the bokeh effect and face detection in portrait-mode photos;
- ToF (Time-of-Flight) sensor: a depth sensor that uses infrared light to create a more accurate depth map of the scene, improving portraits and enabling augmented reality applications;
- LiDAR scanner: more accurate than ToF, it measures the distance between the camera and the subject with pulses of light and can be used in augmented reality and low-light photography applications.
Focal length and depth of field
Three basic concepts are needed to understand how a lens works: focal length, depth of field and field of view.
Telephoto lenses have a longer focal length than wide-angle lenses, which results in a narrower field of view and a shallower depth of field. In other words, a telephoto lens frames a tighter, more distant area and has a more restricted zone of focus, which makes it easier to blur the background of the image.
Therefore, the field of view of the lenses varies as shown in the illustration below:
Thus, telephoto lenses are best suited for scenes where you need to capture details of distant objects or isolate a subject against a blurred background. Wide-angle and ultrawide lenses work best for architecture, large landscapes, or when you want to include many elements in the same photo.
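The relationship between focal length and field of view described above can be sketched with the thin-lens approximation, FOV = 2·atan(w / 2f). The focal lengths and the 36 mm full-frame-equivalent sensor width below are illustrative assumptions, not specs of any particular phone:

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=36.0):
    """Estimate the horizontal field of view (in degrees) for a
    given focal length, using the thin-lens approximation.
    Default width assumes a 35 mm full-frame equivalent."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Illustrative full-frame-equivalent focal lengths:
print(f"ultrawide 13 mm: {horizontal_fov(13):.0f} deg")  # widest view
print(f"wide      24 mm: {horizontal_fov(24):.0f} deg")
print(f"telephoto 70 mm: {horizontal_fov(70):.0f} deg")  # narrowest view
```

The output confirms the ordering in the text: the shorter the focal length, the wider the field of view.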
The magnification ratio determines the ability to capture fine details with a macro lens. For example, a 1:1 (1x) magnification means that the size of the object projected by the lens onto the image sensor is the same as its size in real life. A 2:1 (2x) macro lens reproduces objects twice their true size.
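The magnification arithmetic above is simple enough to express directly; the subject sizes here are hypothetical examples:

```python
def projected_size_mm(subject_size_mm, magnification):
    """Size of the subject's image on the sensor for a given
    magnification ratio (1.0 means 1:1, 2.0 means 2:1)."""
    return subject_size_mm * magnification

# A hypothetical 5 mm insect: at 1:1 it projects to 5 mm on the
# sensor (life size); at 2:1 it projects to 10 mm, twice life size.
print(projected_size_mm(5, 1.0))  # 5.0
print(projected_size_mm(5, 2.0))  # 10.0
```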
Macro lenses can be normal, wide-angle, ultrawide, or telephoto. What defines a lens as “macro” is its ability to focus very close to the subject, resulting in an image with tiny details that would not be clear in a normal lens.
More advanced cell phones can take macro photos, sometimes with magnification greater than 1:1, even without a dedicated macro lens. This is done through computational photography and the use of other lenses. The iPhone 14 Pro, for example, takes macro photos with the ultrawide, while the Galaxy S23 Ultra uses the telephoto to achieve the effect.
In general, basic cell phones with macro cameras have lower-resolution sensors, that is, with fewer megapixels than the main camera. So even if the macro lens can focus closer to the object, the level of detail may be lower due to the limitations of the camera hardware.
ToF and LiDAR are examples of depth sensors. What sets them apart from ordinary sensors is their ability to map the depth of a 3D scene, sending beams of light and measuring the time it takes for that light to return to the sensor.
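The round-trip measurement described above reduces to distance = speed of light × time / 2. A minimal sketch, with an illustrative pulse timing value:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s):
    """Distance from sensor to subject, given the measured
    round-trip time of an emitted light pulse."""
    return C * round_trip_s / 2

# A pulse that returns after roughly 6.67 nanoseconds has bounced
# off a subject about 1 meter away.
print(f"{tof_distance_m(6.67e-9):.2f} m")  # 1.00 m
```

The tiny time scales involved (nanoseconds per meter) are why ToF and LiDAR hardware needs very precise timing circuitry.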
Thus, the most accurate type of sensor is LiDAR. It is even used in self-driving cars and, on cell phones, it is recommended for augmented reality applications, night photography, and portrait mode. However, its higher cost means it is not found in many smartphones.
The ToF sensor is more commonly used for features such as facial recognition or focus detection. While not as accurate as LiDAR, ToF is a more cost-effective option and still provides sufficient depth mapping for common applications.
Simple depth sensors became popular in cheaper smartphones but have been phased out by manufacturers like Samsung and Apple, who have opted for more advanced and accurate technologies, such as ToF and LiDAR.
What is the advantage of the quad camera?
A phone with more cameras gives you more flexibility when taking pictures. For example, a quad camera might have a wide-angle lens for general use, a telephoto lens for distant subjects, an ultrawide lens for panoramas, and a ToF or LiDAR sensor. The exact combination of lenses and sensors will depend on the manufacturer.
How to use cell phone cameras at the same time?
To use the front and rear camera at the same time, you can use Instagram’s Dual mode, install an app like Doubletake on the iPhone, or activate a specific mode on the Android camera, like Director’s View on Samsung or Dual Capture on Motorola.