I used my iPhone 11, which has only a dual rear camera and no LiDAR, to test this app, and the quality of the results varies. My output folder contains depth maps, and I assume those were generated from the two cameras (normal and wide). The output quality really depends on a lot of things: light, movement, lens, overlap percentage, object texture, object size, surface, background, etc. I think LiDAR would definitely improve the output quality, but I'm not sure whether it actually works together with the camera when taking pictures. If you're using an iPhone 12 Pro, how do you know whether the dual camera or the LiDAR provides the depth data?
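If it helps, one way to see which cameras on a given device can deliver depth at all is to query AVFoundation directly. This is just a rough sketch, not something from this app: it lists the depth-capable capture devices, and it assumes iOS 15.4+ for `.builtInLiDARDepthCamera` (drop that entry on older versions).

```swift
import AVFoundation

// Sketch: list which capture devices on this phone can deliver depth data.
// .builtInLiDARDepthCamera requires iOS 15.4+; the others are older.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInDualCamera,
                  .builtInDualWideCamera,
                  .builtInLiDARDepthCamera,
                  .builtInTrueDepthCamera],
    mediaType: .video,
    position: .unspecified)

for device in discovery.devices {
    // A device can produce depth if any of its formats lists depth formats.
    let hasDepth = device.formats.contains { !$0.supportedDepthDataFormats.isEmpty }
    print("\(device.localizedName): depth supported = \(hasDepth)")
}
```

On an iPhone 12 Pro you'd expect both the dual/dual-wide camera and the LiDAR depth camera to show up here; which one the app actually uses depends on which device type it requests when it builds its capture session.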
I used my iPhone 11, which has a dual camera, and it works pretty well. I also tried my M1 iPad Pro, and no depth was captured. So basically, I think the LiDAR does not work with this app at all.
I have a similar issue. If a session is recorded with location disabled, it works great on iPhone or iPad. With location enabled, the error message pops up and the screen turns black on both iPhone and iPad.