ARKit 3 & Person Segmentation

Has anyone been able to get this to work? I've been running Apple's own example code and it seems to fail as well. We're using an iPad mini (5th generation). Here is the code we're working with:



    override func viewDidLoad() {
        super.viewDidLoad()

        // Make sure the session is running a world-tracking configuration and
        // that this device supports person segmentation before opting in.
        guard let configuration = self.arView.session.configuration as? ARWorldTrackingConfiguration,
              ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation) else {
            print("Failed to create config")
            return
        }

        configuration.frameSemantics.insert(.personSegmentation)
        self.arView.session.run(configuration)
    }
   
   
    @IBAction func userDidTapToPlaceGlobe(_ sender: Any) {

        // Anchor the globe to the first horizontal plane at least 15 cm square.
        let anchor = AnchorEntity(plane: .horizontal, minimumBounds: [0.15, 0.15])
        arView.scene.anchors.append(anchor)

        guard let url = Bundle.main.url(forResource: "earth", withExtension: "usdz") else { return }

        // Load the model asynchronously and attach it to the anchor.
        // Note: the AnyCancellable returned by sink is discarded here, so the
        // load can be cancelled before it finishes; storing it in a property avoids that.
        _ = ModelEntity.loadModelAsync(contentsOf: url)
            .sink(receiveCompletion: { _ in }, receiveValue: { entity in

                anchor.children.append(entity)
                entity.scale = [1, 1, 1] * 0.05

                entity.generateCollisionShapes(recursive: true)
                self.arView.installGestures([.translation, .scale, .rotation], for: entity)
            })
    }
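
For reference, the setup I'd expect to be the minimum looks roughly like this (a sketch based on the ARKit docs, not our exact project code): create a fresh ARWorldTrackingConfiguration instead of casting arView.session.configuration, which is optional and may still be nil in viewDidLoad, and only opt in when the device supports the semantic.

    override func viewDidLoad() {
        super.viewDidLoad()

        // Create a new configuration instead of reading the session's current
        // one, which may not exist before run(_:) has been called.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]

        // Person segmentation needs an A12 chip or newer, so gate the opt-in.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation) {
            configuration.frameSemantics.insert(.personSegmentation)
        } else {
            print("Person segmentation not supported on this device")
        }

        arView.session.run(configuration)
    }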

Replies

UPDATE:


If I avoid RealityKit, I'm able to execute person segmentation without issue. A simple ARSCNView with an object sitting on a planar surface makes the feature work as expected.
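
Roughly what that working ARSCNView setup looks like, simplified (sceneView here is just an ARSCNView outlet and the sphere is a stand-in test object):

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]

        // Same frame-semantics gate as in the RealityKit version.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation) {
            configuration.frameSemantics.insert(.personSegmentation)
        }

        sceneView.session.run(configuration)

        // A simple object to test against: with .personSegmentation enabled,
        // people in the camera feed are composited in front of virtual content.
        let sphere = SCNNode(geometry: SCNSphere(radius: 0.1))
        sphere.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(sphere)
    }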