The new generation of the iPhone is already a reality. The photographic improvements in this generation go beyond renewed sensors: for the first time in a mobile phone, Apple has included a LiDAR scanner in two of its models, the iPhone 12 Pro and the iPhone 12 Pro Max. The technology had already debuted in the iPad Pro the company launched a few months ago. But what does this addition really mean? Will it improve the images captured with the rear cameras? Let’s take a look.
What is LiDAR Scanner and How Does it Work?
In purely linguistic terms, LiDAR is an acronym for ‘Light Detection and Ranging’ (sometimes expanded as ‘Laser Imaging, Detection, and Ranging’). Although Apple has marketed the technology as innovative, the truth is that LiDAR systems have been on the market for years; it is their application in mobile devices that had not materialized until now.
Specifically, this type of system is used especially in tasks where the shape of a surface must be measured, whether to map the relief of a geographical area or to obtain a three-dimensional scan of an object to be analyzed or modified in CAD design programs. It is also the technology behind LiDAR speed guns, which calculate a vehicle’s speed at a particular point.
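To make the speed-gun use case concrete, here is a hypothetical sketch (the function name and figures are illustrative, not from any real device): two successive range measurements taken a known interval apart give the vehicle’s approach speed.

```python
# Hypothetical sketch: a LiDAR speed gun infers a vehicle's speed from
# two successive range readings taken a known time interval apart.

def speed_from_ranges(range_1_m: float, range_2_m: float,
                      interval_s: float) -> float:
    """Approach speed in m/s: how much closer the vehicle got per second."""
    return (range_1_m - range_2_m) / interval_s

# A car 50 m away that reads 48 m away 0.1 s later is closing
# at 20 m/s (72 km/h).
print(speed_from_ranges(50.0, 48.0, 0.1))  # 20.0
```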
As for how LiDAR systems operate, the principle differs from that of a conventional camera: instead of passively capturing light, the sensor emits laser pulses and measures how long each reflection takes to return. From these measurements it can capture the volume of objects and the environment in general, generating three-dimensional maps in real time without depending on other complementary sensors.
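The underlying time-of-flight arithmetic can be sketched in a few lines. This is an illustrative simplification, not Apple’s implementation; the function name is made up for the example.

```python
# Illustrative sketch: a LiDAR sensor measures distance by time of
# flight -- it emits a laser pulse and times the round trip of the
# reflection. The pulse travels out and back, so the one-way distance
# is half the round trip multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance to the reflecting surface, in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after ~20 nanoseconds came from a surface
# roughly 3 metres away.
print(round(distance_from_round_trip(20e-9), 2))  # 3.0
```

Repeating this measurement across a grid of points is what lets the sensor build a per-pixel depth map of the scene in real time.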
What Application Does it Have in Mobile Photography?
During the launch of the new generation of iPad Pro earlier this year, Apple emphasized the possibilities of the LiDAR sensor in the field of augmented reality. In practice, what the sensor offers on Apple’s tablet depends on third-party applications, and there are already apps that can analyze the environment and convert it into a three-dimensional map. Unfortunately, the number of such tools on the App Store is still quite low.
If we talk about the possibilities of the sensor on an iPhone, the picture does not differ much from the iPad. The difference is one of emphasis: on the iPhone, Apple has put the focus on mobile photography, and more specifically on computational photography.
One of the great improvements the new generation brings to the cameras of the iPhone 12 Pro and 12 Pro Max is an optimized Portrait mode in night scenes. This improvement is directly tied to the LiDAR scanner acting as a supporting sensor in low-light photography: it measures the distance and volume of people in dark environments, helping to separate the subject from the rest of the scene and improve the final image.
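The idea of separating a subject by distance can be sketched as follows. This is an illustrative toy, not Apple’s pipeline; the function, threshold, and sample depths are all assumptions for the example.

```python
# Illustrative sketch (not Apple's pipeline): given a per-pixel depth
# map from a LiDAR sensor, a portrait effect can separate the subject
# from the background by thresholding on distance -- which works even
# when the scene is too dark for image-based segmentation alone.

def subject_mask(depth_map, max_subject_depth_m=2.0):
    """Mark pixels closer than the threshold as part of the subject."""
    return [[depth < max_subject_depth_m for depth in row]
            for row in depth_map]

# A tiny 2x3 depth map: the centre column is a person ~1.2 m away,
# the rest is background ~4 m away.
depths = [[4.0, 1.2, 4.0],
          [4.0, 1.3, 4.0]]
print(subject_mask(depths))
# [[False, True, False], [False, True, False]]
```

In a real pipeline the mask would then drive the background blur, with the depth data refined by the image-based segmentation the camera app already performs.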
Apple has also confirmed that the new iPhones can focus up to six times faster in low light than previous models, so the improvements extend to autofocus at night. This is where Apple’s computational photography technologies (Deep Fusion and the Neural Engine) come into play.
The last new feature Apple points out concerns Night mode, which in addition to producing better results is now available on the ultra-wide camera. Although the company does not confirm it, everything suggests the LiDAR sensor could play a part in both of these scenarios.
This post may contain affiliate links, which means that I may receive a commission if you make a purchase using these links. As an Amazon Associate, I earn from qualifying purchases.