I updated my iPad Pro 12.9-inch (4th generation) (Model Number: MXAU2J/A) to iPadOS 15.4.1, then built and ran the following sample project:
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/capturing_depth_using_the_lidar_camera
However, a "lidarDeviceUnavailable" error is thrown at runtime and execution fails. (The code below is where the exception is thrown.) As a side note, depth measurement using ARKit does work on this device.
// Look up the LiDAR camera.
guard let device = AVCaptureDevice.default(.builtInLiDARDepthCamera, for: .video, position: .back) else {
throw ConfigurationError.lidarDeviceUnavailable
}
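For reference, here is a quick diagnostic sketch (my own addition, not part of Apple's sample project) that lists the back-facing capture device types the OS actually reports, so you can see whether .builtInLiDARDepthCamera appears on a given device:

import AVFoundation

// List every back-facing device type the OS exposes on this hardware.
// If .builtInLiDARDepthCamera is missing from the output, the default(_:for:position:)
// lookup above returns nil and the sample throws lidarDeviceUnavailable.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInLiDARDepthCamera, .builtInDualWideCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .back
)
for device in discovery.devices {
    print("Found:", device.localizedName, device.deviceType.rawValue)
}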
I also own an iPhone 13 Pro (iOS 15.4.1), and the sample code runs there without any problems.
Is this caused by a hardware constraint, or is it a bug?
The project I am working on targets the iPad Pro 11-inch (2nd generation).
Can the sample code run on the iPad Pro 11-inch (2nd generation)?
I have been waiting for LiDAR-based depth measurement through AVFoundation.
The reason is that while I can measure depth with ARKit, ARKit does not let me change the camera exposure.
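For comparison, this is roughly how I read depth with ARKit today (a minimal sketch under the assumption that these methods live in a class acting as the ARSessionDelegate; the names are my own, not from the sample project):

import ARKit

// Enable LiDAR scene depth and run the session.
func startSession(_ session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }
    session.run(configuration)
}

// ARSessionDelegate callback: read the per-frame depth map (Float32 values in meters).
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    if let depthMap = frame.sceneDepth?.depthMap {
        print("Depth map:", CVPixelBufferGetWidth(depthMap), "x", CVPixelBufferGetHeight(depthMap))
    }
}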
Thanks.
When I implement and test this, individual pixels are not visible; the image is smoothed when displayed.
Is there a way to view the individual pixels?
Is there a property that turns off smoothing for a UIImage?
Here is a verification video:
https://streamable.com/qy5bzs
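In case it is useful, my guess (an assumption on my side, not something I found documented for this sample) is that the smoothing comes from the layer scaling the image rather than from UIImage itself, so switching the image view's magnification filter to nearest-neighbor should make the individual pixels visible:

import UIKit

// depthImageView is assumed to be the UIImageView showing the (small) depth image.
// .nearest replaces the default linear interpolation with nearest-neighbor sampling,
// so each source pixel is drawn as a hard-edged block when the image is scaled up.
depthImageView.layer.magnificationFilter = .nearest
depthImageView.layer.minificationFilter = .nearest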