Not long ago, Apple announced the new generation of the iPad Pro. The tablet arrived full of new features, but the addition that most surprised the public was the LiDAR scanner — an acronym for Light Detection and Ranging. The technology uses a system similar to a Time of Flight (ToF) sensor to scan environments and create 3D experiences with depth of field in augmented reality (AR).
How it works
LiDAR works on a simple principle: pulses of infrared light are emitted into the environment to calculate the distance to objects and understand the depth of the scene. By pointing the camera at the desired object, the sensor receives the reflected light pulses at different time intervals and, from those delays, calculates the distance between the device and each element in the image. With these calculations, the scanner can build a 3D map of the environment and apply it to augmented reality apps. The range limit is five meters.
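The time-of-flight calculation described above can be sketched in a few lines. This is an illustrative example of the principle only — the function name and sample timing are assumptions for demonstration, not Apple's actual implementation:

```python
# Time-of-flight distance estimation: a light pulse travels to an object
# and back, so the one-way distance is half the round trip.
C = 299_792_458.0  # speed of light in m/s


def tof_distance(round_trip_seconds: float) -> float:
    """Estimate distance to an object from a light pulse's round-trip time."""
    return C * round_trip_seconds / 2.0


# A pulse returning after ~33.36 nanoseconds corresponds to an object
# about 5 m away — the scanner's stated range limit.
print(round(tof_distance(33.36e-9), 2))  # → 5.0
```

Repeating this measurement across a grid of points is what lets the sensor assemble a depth map of the whole scene.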
The technology is not exclusive to Apple and is already in use in, for example, self-driving cars. The sensor is considered crucial for the advancement of that industry: it gives "vision" to cars, recording everything around them, such as cyclists, pedestrians, and other vehicles. Apple itself was already testing LiDAR on autonomous cars back in 2015. Of course, the version used in the iPad Pro is much smaller than the units found in cars.
What is LiDAR for?
The new sensor is intended to strengthen Apple's position in the augmented reality market. The company has been building for the industry since 2017, when it announced the ARKit platform and began developing apps with AR.
LiDAR is likely to be used more extensively in new apps released alongside the iPad Pro. The game Hot Lava, for example, is one of the titles announced with the tablet. The game is currently available on Steam, but the mobile version will be exclusive to the Apple Arcade service. In this version of Hot Lava, the player scans an environment, such as a living room, and the game generates a scenario flooded with hot lava. The main mission is to flee the lava by climbing onto objects that are out of danger.
The sensor will also enhance the Measure application, making it easier to calculate dimensions. It will even measure a person's height automatically and display an on-screen ruler. Another interesting use will be the Shapr3D app, which scans environments into 3D models and lets users add new objects to the scan.
The Complete Anatomy application will also be updated, letting users study anatomy and take measurements on a real human being — for example, using the movement of an arm to visualize and understand the muscles of that region. Apple has also revealed a commercial use of LiDAR through the IKEA store chain's application, where consumers will be able to preview home furnishings in augmented reality.
Like a ToF sensor, Apple's new scanner should also enhance photos on the iPad Pro, since it can calculate depth of field. The Cupertino giant is expected to invest heavily in this area in the coming years, and the technology is also expected to appear in the rumored Apple Glasses, slated for 2023.