Can RealityKit 2's Object Capture API be used to create a model of a building's interior?
I've only found examples of creating models from pictures taken while walking around an object, not from inside one 🤔 I know photogrammetry in general can handle such cases, but I'm not sure the new RealityKit API supports it.
I'd be grateful if someone who has tried it could share their results (I can't right now), and if someone at Apple could confirm whether this is supported by design.
Thank you for your time 🙇♂️
You definitely can. Whole city blocks and cathedrals are captured and reconstructed in 3D with photogrammetry software; that's how entire real cities end up in Ubisoft games, for example. Archaeologists and historians use it too. But you need a lot of images: 100 is very optimistic, plan for 400 or more. You could capture video of the building from a drone (or a satellite, if you have one), convert the video into an image sequence, and feed it into the photogrammetry pipeline. There are a few very expensive software packages for this, but now we have a native photogrammetry engine with the full power of Apple's ML and Metal behind it.
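If you want to try the native engine, here is a minimal sketch of driving PhotogrammetrySession on macOS 12+. The folder paths, function name, and detail level are placeholders you'd swap for your own; whether it copes with an interior capture is exactly what you'd be testing.

```swift
import Foundation
import RealityKit

// Minimal sketch: build a USDZ from a folder of HEIC/JPEG captures (macOS 12+,
// supported GPU required). Paths and settings below are placeholders.
func reconstructInterior() throws {
    var config = PhotogrammetrySession.Configuration()
    config.featureSensitivity = .high      // may help with low-texture interior walls
    config.sampleOrdering = .sequential    // frames extracted from a walkthrough video are ordered

    let session = try PhotogrammetrySession(
        input: URL(fileURLWithPath: "/path/to/interior-images", isDirectory: true),
        configuration: config
    )

    // Consume progress updates and the finished model on a background task.
    Task {
        do {
            for try await output in session.outputs {
                switch output {
                case .requestProgress(_, let fraction):
                    print("Progress: \(Int(fraction * 100))%")
                case .requestComplete(_, .modelFile(let url)):
                    print("Model written to \(url.path)")
                case .requestError(let request, let error):
                    print("Request \(request) failed: \(error)")
                case .processingComplete:
                    print("All requests finished")
                default:
                    break
                }
            }
        } catch {
            print("Output stream failed: \(error)")
        }
    }

    try session.process(requests: [
        .modelFile(url: URL(fileURLWithPath: "/path/to/interior.usdz"), detail: .full)
    ])
}
```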
It seems more logical to capture building interiors with LiDAR in ARKit, just like the "3D Scanner" and "Capture" apps do. But I doubt you're brave enough to attach your $1000 iPhone Pro Max to a drone, and I suspect it's not possible to handle such a huge amount of data in a single AR session. That's why scientists and CG artists tend to use LiDAR to scan interiors and photogrammetry to reconstruct exterior landscapes.
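For the LiDAR route, a minimal ARKit sketch looks like this (assuming a LiDAR-equipped iPhone/iPad; the helper name is just illustrative):

```swift
import ARKit
import RealityKit

// Sketch: enable LiDAR-based scene reconstruction in an ARKit session.
func makeScanningConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh      // or .meshWithClassification
    }
    config.environmentTexturing = .automatic    // also capture environment textures
    return config
}

// Usage from an ARView (RealityKit) or ARSCNView:
// arView.session.run(makeScanningConfiguration())
// The reconstructed geometry then arrives as ARMeshAnchor instances via the session delegate.
```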