What kind of LiDAR camera does Apple use?

Hi,

I am looking for technical details on the LiDAR camera that the iPhone 12 uses. I looked online and found nothing.

One approach to gain some information was to print the ARFrame's current properties, e.g. arframe.camera, arframe.capturedImage, etc.
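For reference, a minimal sketch of that kind of per-frame dump, assuming an active ARSession with scene depth enabled on a LiDAR-equipped device (the function name `dumpFrameInfo` is just an illustration):

```swift
import ARKit

// Inspect what ARKit exposes per frame when scene depth is enabled.
// Requires a LiDAR-equipped device; call from an ARSessionDelegate's
// session(_:didUpdate:) with the latest ARFrame.
func dumpFrameInfo(_ frame: ARFrame) {
    print(frame.camera.intrinsics)      // 3x3 camera intrinsics matrix
    print(frame.capturedImage)          // CVPixelBuffer with the RGB image
    if let depth = frame.sceneDepth {
        let map = depth.depthMap        // Float32 CVPixelBuffer (256x192)
        print(CVPixelBufferGetWidth(map), CVPixelBufferGetHeight(map))
        print(depth.confidenceMap as Any)  // per-pixel confidence levels
    }
}
```

Note that `frame.sceneDepth` is nil unless you opt in before starting the session, e.g. `configuration.frameSemantics = .sceneDepth` on an `ARWorldTrackingConfiguration`.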

Is there anything else I can do? Thank you (:

I don't think they publish detailed technical information.

I found this article (in French, but you can ask Safari to translate it):

https://www.cnetfrance.fr/produits/iphone-12-tout-savoir-sur-le-lidar-une-technologie-prometteuse-39911969.htm

Lidar is a type of time-of-flight sensor. Other smartphones measure depth with a single light pulse, while a smartphone equipped with lidar sends light pulses in the form of an infrared point cloud and can measure each of them. The result is a field of light points that trace distances and can “mesh” the dimensions of a space and the objects in it with great precision. The light pulses are invisible to the human eye, but you could see them with a night vision camera.

What kind of information are you looking for?

The ToF (time-of-flight) 3D camera of the iPad Pro 2020/2021, iPhone 12/13 Pro, and iPhone 12/13 Pro Max physically has 64 VCSELs (vertical-cavity surface-emitting lasers), arranged as 16 rods of 4 cells each. A DOE (diffractive optical element) multiplies the 64 laser pulses by 3x3 into 576. The 576 pulses rebounding from object surfaces are detected, and each individual time of flight is measured, by a SPAD (single-photon avalanche diode) image sensor. The 576 depth points are then interpolated, together with the RGB image, into the 256x192 depthMap delivered at 60 Hz. Apple has released API access to the 256x192 depthMap but not to the 576 raw depth points.
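The arithmetic above can be sketched in a few lines (a toy illustration of the counts described, not an Apple API):

```swift
// Emitter and depth-map arithmetic for the LiDAR module described above.
let vcsels = 16 * 4                 // 64 physical VCSEL emitters
let doeFactor = 3 * 3               // 3x3 diffractive optical element
let measuredPoints = vcsels * doeFactor    // 576 raw depth points
let depthMapPoints = 256 * 192             // 49,152 values in the depthMap
let upsampling = depthMapPoints / measuredPoints  // ~85x interpolation
print(vcsels, measuredPoints, depthMapPoints, upsampling)
```

So each raw depth measurement is spread over roughly 85 depthMap pixels, which is why the depthMap looks smooth but carries far less independent depth information than its resolution suggests.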

Please search for the article "Apple LIDAR Demystified: SPAD, VCSEL, and Fusion…"

Patent Application Publication US 2020/0158831 A1

The patent describes in detail the signal-processing logic behind the SPAD image sensor.
