Hi,
I've just been experimenting with Apple's demo app for TrueDepth images on different devices and noticed significant differences in the quality of the depth data it provides.
Data captured on iPhones released before the iPhone 13 lineup shows quite smooth surfaces, as you may know from the many 3D scanner apps that display data from the front-facing TrueDepth camera.
Data captured on, for example, an iPhone 13 Pro now shows some kind of wavy overlaid structure and, visually, much lower accuracy compared to older iPhones.
iPhone 12 Pro: data as point cloud, object about 25 cm from the phone:
iPhone 13 Pro: data as point cloud, same setup:
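(For context: the point clouds above are just the depth map back-projected through the camera intrinsics. A minimal sketch of that step, assuming an AVDepthData with calibration data attached; pointCloud(from:) is my own illustrative helper, not code from the demo app:)

```swift
import AVFoundation
import simd

// Illustrative helper (my own, not from the demo app): back-project a
// DepthFloat32 map into camera-space points using the pinhole model and
// the intrinsics attached to the AVDepthData.
func pointCloud(from depthData: AVDepthData) -> [simd_float3] {
    let depth = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    guard let calib = depth.cameraCalibrationData else { return [] }

    let map = depth.depthDataMap
    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    let width  = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let stride = CVPixelBufferGetBytesPerRow(map)
    let base   = CVPixelBufferGetBaseAddress(map)!

    // Intrinsics are specified at a reference resolution; rescale them
    // to the (smaller) depth map resolution.
    let K = calib.intrinsicMatrix
    let scale = Float(width) / Float(calib.intrinsicMatrixReferenceDimensions.width)
    let fx = K[0][0] * scale, fy = K[1][1] * scale
    let cx = K[2][0] * scale, cy = K[2][1] * scale

    var points: [simd_float3] = []
    for v in 0..<height {
        let row = (base + v * stride).assumingMemoryBound(to: Float32.self)
        for u in 0..<width where row[u].isFinite && row[u] > 0 {
            let z = row[u]  // depth in meters
            // Pinhole back-projection: X = (u - cx)·z/fx, Y = (v - cy)·z/fy
            points.append(simd_float3((Float(u) - cx) * z / fx,
                                      (Float(v) - cy) * z / fy,
                                      z))
        }
    }
    return points
}
```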
I tried this on several different iPhone 13 devices, all running the latest iOS, with the same result; the images were captured with the same code. Capturing with some of the standard 3D scanner apps for the TrueDepth camera yields similarly lower-quality images and point clouds.
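For reference, the capture setup looks roughly like this (a minimal sketch of the standard AVFoundation depth pipeline; the class name, queue label, and omitted error handling are mine, not the demo app's exact code):

```swift
import AVFoundation

// Minimal sketch of a TrueDepth depth capture session. Filtering is
// disabled so the raw per-frame sensor quality can be compared across
// devices; enabling it smooths away (and hides) the noise.
final class DepthCapture: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()
    private let queue = DispatchQueue(label: "depth.queue")

    func start() throws {
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .depthData,
                                                   position: .front) else {
            return // no TrueDepth camera on this device
        }
        session.beginConfiguration()
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(depthOutput)
        depthOutput.isFilteringEnabled = false   // raw, unsmoothed depth
        depthOutput.setDelegate(self, callbackQueue: queue)
        session.commitConfiguration()
        session.startRunning()
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Normalize to 32-bit float depth (meters) before inspecting/exporting.
        let depth = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
        _ = depth.depthDataMap // ... hand off to point-cloud conversion, etc.
    }
}
```

The wavy artifacts appear regardless of whether filtering is enabled or disabled.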
Is this due to downgraded hardware (the smaller TrueDepth module) in the new iPhone lineup, or a software issue within iOS, e.g. in the driver for the TrueDepth camera?
Has Apple already announced any improvements or solutions?
Best,
Holger