Mapping Real Life Object

Hello everyone!

I'm an architecture student posting here for the first time, so I apologize in advance for any rookie mistakes.

For a semester project, I designed an empty cube which is supposed to serve as a vessel for AR experiences. The idea is that users would be able to discover all sorts of exhibitions and works of art (I partnered up with a local museum).

I managed to get my 3D CAD models converted to USDZ and build demos using Reality Composer. Now my professors have asked me to build the actual thing, and I am faced with a challenge: the cube is 4x4x4 meters, and I would like viewers to be able to step inside the cube, walk around, and experience whatever they're looking at as if it were actually standing in the cube, without the phone losing track of the AR object. I have figured out how to map objects to a single image or some other type of marker, but what about multiple markers? Is it possible, for instance, to color each corner of the cube with a different color so the software understands this real-life object and scales and rotates the model accordingly?
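
For context, this is roughly how I imagine a multi-marker setup might look, extending what I already have working for a single image. It's only a minimal sketch; the asset group name "CubeMarkers", the class name, and the number of tracked images are placeholders I made up:

```swift
import UIKit
import SceneKit
import ARKit

// Sketch: detecting several printed markers at once via ARKit image detection.
// "CubeMarkers" is a hypothetical AR Resource Group in the asset catalog.
class MarkerViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Each reference image in the group has its own physical size,
        // which is what lets ARKit estimate scale from the marker.
        if let markers = ARReferenceImage.referenceImages(inGroupNamed: "CubeMarkers",
                                                          bundle: nil) {
            configuration.detectionImages = markers
            configuration.maximumNumberOfTrackedImages = 4
        }
        sceneView.session.run(configuration)
    }

    // Every detected marker produces its own ARImageAnchor, so content can be
    // placed relative to whichever marker is currently in view.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        print("Detected marker: \(imageAnchor.referenceImage.name ?? "unnamed")")
    }
}
```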

Many thanks in advance! I've included a render from my pitch; as they say, a picture says more than a thousand words, haha…

Is your cube virtual, or real (like this one, at Boughton House)?

Have you looked at "Tracking Geographic Locations in AR"?
https://developer.apple.com/documentation/arkit/content_anchors/tracking_geographic_locations_in_ar

Have you made any progress on this, @tuskuno?

Oh time flies!

Yes, I actually have: you can scan objects with the developer-only ARKit Scanner app and then set the scanned objects as anchors for any given scene.
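
In case it helps anyone else, this is roughly what the detection side ends up looking like. A minimal sketch only, assuming the scanned cube was exported as an .arobject and added to an AR Resource Group I'm calling "ScannedCube" (that name, the class name, and the commented placement line are placeholders):

```swift
import UIKit
import SceneKit
import ARKit

// Sketch: recognizing a scanned physical object and anchoring content to it.
class ObjectAnchorViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Reference objects produced by the scanning app live in the asset
        // catalog and are loaded as a set, just like reference images.
        if let scannedObjects = ARReferenceObject.referenceObjects(inGroupNamed: "ScannedCube",
                                                                   bundle: nil) {
            configuration.detectionObjects = scannedObjects
        }
        sceneView.session.run(configuration)
    }

    // When ARKit recognizes the physical cube it adds an ARObjectAnchor; the
    // virtual exhibit can be attached to that anchor's node so it stays
    // registered with the real object as the viewer walks around.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }
        print("Recognized object: \(objectAnchor.referenceObject.name ?? "unnamed")")
        // node.addChildNode(exhibitNode) // place content relative to the anchor here
    }
}
```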

The cube is real; it is a physical object.
