The Silicon Trend Tech Bulletin


LIDAR: The Coolest Technology on iPhone and iPad Pros Now

Published Tue, Mar 01 2022 06:44 am
by The Silicon Trend






What is LIDAR?

LIDAR stands for Light Detection and Ranging. It determines distances by targeting an object with a laser and measuring the time the reflected light takes to return to the receiver.


How Does It Work?

A pulsed laser records, at nanosecond speeds, the time it takes for the signal to return to the source, enabling the device to build a 3D model with far greater precision than a simple camera could achieve.
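The round-trip timing described above reduces to a simple formula: the pulse travels out and back, so distance is half the round-trip time multiplied by the speed of light. Here's a minimal sketch of that calculation (illustrative only; the function name is my own, not Apple's):

```python
# Minimal sketch of the time-of-flight principle behind LIDAR.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a surface from a pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half the round trip multiplied by the speed of light.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A return after ~33 nanoseconds corresponds to roughly 5 meters,
# about the quoted range of the LIDAR scanner on the Pro models.
print(round(distance_from_round_trip(33e-9), 2))  # ~4.95 m
```

This also makes clear why nanosecond timing matters: light covers about 30 centimeters per nanosecond, so even tiny timing errors shift the measured distance noticeably.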

LIDAR is a type of time-of-flight camera. Some smartphones estimate depth with a single light pulse, whereas LIDAR-equipped smartphones send out waves of light pulses as a spray of infrared dots and measure each one with their sensor. These sensors create a field of points that maps out distances and can mesh the dimensions of a space and the objects in it. Can you see the light pulses with the naked eye? No, but you can see them with a night-vision camera.
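The dot-field idea above can be sketched in a few lines: each infrared dot yields its own round-trip time, and converting every dot gives a coarse depth map of the scene. This is a hypothetical toy model, not how Apple's sensor pipeline actually works:

```python
# Toy sketch: turning a grid of per-dot round-trip times (in
# nanoseconds) into a depth map, one depth per infrared dot.
C = 299_792_458  # speed of light, m/s

def depth_map(round_trip_ns):
    """Convert a 2D grid of round-trip times (ns) to depths in meters."""
    return [[C * (t * 1e-9) / 2 for t in row] for row in round_trip_ns]

# A 2x2 field of dots: nearer surfaces return their pulse sooner.
times_ns = [[10.0, 12.0],
            [30.0, 33.0]]
for row in depth_map(times_ns):
    print([round(d, 2) for d in row])
```

A real sensor fires thousands of dots, but the principle is the same: the grid of depths is what lets the device "mesh" the room rather than see it as a flat image.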


Is LIDAR for Face Identification?

The LIDAR sensor isn't there to advance Face ID; it's used in AR applications. That said, the LIDAR scanner on the iPhone and iPad Pro models does improve the precision of distance measurements. Is LIDAR similar to Apple's TrueDepth camera, which enables Face ID? Yes, but LIDAR has a much longer range.

The TrueDepth camera shoots out an array of IR light, but only to a distance of a few feet. The LIDAR sensors on the Pro models work at a range of approximately 5 meters.


Other Tech Using LIDAR

LIDAR tech is emerging everywhere. It is used in autonomous vehicles, drones, and robotics. AR headsets such as the HoloLens 2 use similar technology to map out room spaces before layering 3D virtual objects into them.

Microsoft's old depth-sensing Xbox accessory, the Kinect, also used IR depth scanning. In 2013, Apple acquired PrimeSense, the company that helped develop the Kinect's technology.


Also read: iPhone 13 on the Market, Carriers Set to Capture New Customers


How do Apple Apps Benefit?

Apps live or die by the accuracy of their data. For instance, the Measure app can now gauge a person's height with precision, something that wasn't possible before. A recent update also adds a new Ruler View, and developers can make the most of LIDAR data in their own apps by leveraging the new Scene Geometry API.


Also read: iPhone's Tap to Pay Feature Enables Crypto Payments Across Businesses 


Pros Work Better with LIDAR

Time-of-flight (ToF) cameras are common on smartphones, and the Pro series is no exception. Apple promises better low-light focus, with up to 6x faster focusing in low-light conditions. LIDAR depth sensing is also used to enhance night portrait mode effects. Better focus is a plus in itself, and the iPhone 12 Pro can also add more 3D depth data to its images.

LIDAR also lets the Pro series launch AR apps quickly and build a fast map of a room before adding more detail. Much of Apple's core AR technology uses LIDAR to hide virtual objects behind real ones and to place virtual objects within more complex room maps.