Photogrammetry - Model Orientation

Hi,

I've been playing around with Object Capture API and am impressed by the results.

However, one thing that I've been struggling with is the orientation of the output model - I often find that the model is the wrong way around, and requires manual rotation to correct.

I've been capturing in portrait and the images have Orientation EXIF metadata. I've tried converting the images to PNG, which essentially removes the orientation metadata and stores the pixels the "right way around". Sometimes this produces better results, but not always.
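For anyone wanting to try the same experiment: one way to bake the Orientation tag into the pixel data before export is with ImageIO and Core Image. This is a rough sketch, not the approach Apple recommends; the function name and paths are placeholders.

```swift
import CoreImage
import ImageIO

// Bakes the EXIF Orientation tag into the pixel data and writes a PNG,
// so the output image needs no orientation metadata.
func writeUprightPNG(from sourceURL: URL, to destinationURL: URL) throws {
    guard let source = CGImageSourceCreateWithURL(sourceURL as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
            as? [CFString: Any],
          var image = CIImage(contentsOf: sourceURL)
    else { throw CocoaError(.fileReadCorruptFile) }

    // Default to "up" (1) when no Orientation tag is present.
    let exifOrientation = properties[kCGImagePropertyOrientation] as? Int32 ?? 1

    // Physically rotate/flip the pixels to match the tag.
    image = image.oriented(forExifOrientation: exifOrientation)

    let context = CIContext()
    try context.writePNGRepresentation(
        of: image,
        to: destinationURL,
        format: .RGBA8,
        colorSpace: image.colorSpace ?? CGColorSpace(name: CGColorSpace.sRGB)!
    )
}
```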

I've also tried feeding in gravity vectors via PhotogrammetrySample. However, this seems to have no effect.
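For reference, this is roughly how I'm attaching the gravity vectors. `pixelBuffers` and `gravities` are placeholders for my own captured data; the gravity vectors are unit vectors in the camera's coordinate space, captured at shutter time.

```swift
import CoreVideo
import RealityKit
import simd

// Build PhotogrammetrySamples with an explicit per-image gravity vector.
func makeSamples(pixelBuffers: [CVPixelBuffer],
                 gravities: [simd_float3]) -> [PhotogrammetrySample] {
    zip(pixelBuffers, gravities).enumerated().map { index, pair in
        var sample = PhotogrammetrySample(id: index, image: pair.0)
        sample.gravity = pair.1   // unit vector pointing toward gravity
        return sample
    }
}

// The sample sequence then goes straight into the session:
// let session = try PhotogrammetrySession(
//     input: makeSamples(pixelBuffers: buffers, gravities: gravities),
//     configuration: PhotogrammetrySession.Configuration())
```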

Does the Object Capture API use gravity vectors to determine the output model orientation? Likewise, does it use Orientation EXIF metadata?

Thanks

(Note: I'm currently using macOS 12.5)

Replies

I have been battling with the same issues. I've tried feeding a folder to the PhotogrammetrySession and also feeding it a sequence of PhotogrammetrySamples. Either way, the resulting 3D model is proportioned correctly but oriented at a skewed angle, or sometimes even upside down.

Supplying gravity vectors seems to have zero effect on the result, so I'm not sure whether they're ignored entirely or just not used to orient the model. Apple's sample app seems to embed the gravity vector in the HEIC file's metadata as a side effect of how the image is recorded with AVFoundation. But I'm recording the images inside an ARKit session, and the HEIC snapshots provided by ARKit don't include that EXIF gravity vector.
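Since the ARKit snapshots lack the EXIF gravity tag, one workaround I've been considering (untested as a fix for the orientation problem, so treat it as an assumption) is to derive a per-frame gravity vector from the ARFrame's camera transform and supply it on the PhotogrammetrySample instead:

```swift
import ARKit
import simd

// Assumption: with ARConfiguration.worldAlignment == .gravity (the default
// for world tracking), the world -Y axis points along gravity. Rotating
// world (0, -1, 0) into the camera's frame gives a per-frame gravity vector
// to store on a PhotogrammetrySample. Whether the reconstruction actually
// honors it is exactly the open question in this thread.
func gravityInCameraSpace(for frame: ARFrame) -> simd_float3 {
    // camera.transform is camera-to-world; its rotation transposed maps
    // world-space directions into camera coordinates.
    let rotation = simd_float3x3(
        simd_make_float3(frame.camera.transform.columns.0),
        simd_make_float3(frame.camera.transform.columns.1),
        simd_make_float3(frame.camera.transform.columns.2)
    )
    let worldGravity = simd_float3(0, -1, 0)
    return normalize(rotation.transpose * worldGravity)
}
```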

Did you find any solutions to this problem?