How to use scene with object anchor in Xcode?

With the latest few releases of Reality Composer beta (on iOS), the ability to create a scene using a 3D object as an "anchor" is now a possibility. I have created a scene using the object anchor choice, scanned my object in 3D within Reality Composer, and can successfully test this experience by viewing my Reality Composer project in AR, then choosing "Play," and seeing my 3D item appear when the object is detected.


For posterity, my scanned anchor is a bottle, and my 3D item is a metallic sphere.
When attempting to bring this experience into Xcode, I am unsure how to use the object as the anchor. I am loading the scene like so:

let bottle = try! Bottle.loadScene()
arView.scene.anchors.append(bottle)

In this case, Bottle is my .rcproject and my scene is named "scene." When I build and run the project, my 3D item (the metallic sphere) appears immediately on screen, rather than remaining hidden until the "object anchor" (the bottle) is detected.
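One way to confirm whether the scene's object anchor is being tracked at all is to subscribe to RealityKit's anchoring events. A minimal sketch, assuming the same `arView` and `bottle` scene from the snippet above (the event handler and property names here are illustrative, not from the original project):

```swift
import RealityKit
import Combine

// Keep a strong reference to the subscription, e.g. as a view controller property.
var anchorSubscription: Cancellable?

// Fires when an entity's anchoring state changes; if this never reports
// isAnchored == true, the object anchor is not being detected.
anchorSubscription = arView.scene.subscribe(to: SceneEvents.AnchoredStateChanged.self) { event in
    if event.isAnchored {
        print("Object anchor detected — sphere should now appear on the bottle.")
    }
}
```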
Using the Scanning and Detecting 3D Objects documentation as a guide, do I need to manually set up the ARWorldTrackingConfiguration reference objects, like so:

let configuration = ARWorldTrackingConfiguration()
guard let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "gallery", bundle: nil) else {
  fatalError("Missing expected asset catalog resources.")
}
configuration.detectionObjects = referenceObjects
sceneView.session.run(configuration)
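Note that the documentation snippet above targets an ARKit `sceneView` (SceneKit), whereas the project here uses a RealityKit `ARView`. If manual configuration turns out to be needed, the equivalent sketch would run the configuration on the `ARView`'s own session. This assumes the scanned `.arobject` has been added to an AR Resource Group named "gallery" in the asset catalog (that group name is just an example):

```swift
import ARKit
import RealityKit

// Sketch: configure object detection on the RealityKit view's session,
// assuming `arView` is the same ARView the Bottle scene was appended to.
let configuration = ARWorldTrackingConfiguration()
guard let referenceObjects = ARReferenceObject.referenceObjects(
    inGroupNamed: "gallery", bundle: nil) else {
    fatalError("Missing expected asset catalog resources.")
}
configuration.detectionObjects = referenceObjects
arView.session.run(configuration)
```

Running the configuration on `arView.session` (rather than a separate `ARSession`) keeps RealityKit's own anchoring of the loaded scene intact.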


And if so, how do I get access to the .arobject scanned from Reality Composer?
Thanks!

Replies

In Reality Composer, you simply select the Object Anchor and then specify the .arobject, right?

Did you ever figure this out?