Scene understanding missing from visionOS simulator?

SceneReconstructionProvider.isSupported and PlaneDetectionProvider.isSupported both return false when running in the simulator (Xcode 15b2).
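
For reference, a minimal check reproduces it (just a sketch; the function name is mine):

    import ARKit

    // Both of these print false when run in the visionOS simulator.
    func checkSceneUnderstandingSupport() {
        print("SceneReconstructionProvider.isSupported:", SceneReconstructionProvider.isSupported)
        print("PlaneDetectionProvider.isSupported:", PlaneDetectionProvider.isSupported)
    }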

There is no mention of this in the release notes. It seems this makes any AR app that depends on scene understanding impossible to run in the sim.

For example, the code described in this article cannot be run in the simulator: https://developer.apple.com/documentation/visionos/incorporating-surroundings-in-an-immersive-experience

Am I missing something or is this really the current state of the sim?

Does this mean that if we want to build mixed-immersion apps, we have to wait for access to Vision Pro hardware?

Replies

Filed a bug report as FB12457682.

  • Hi, did they reply?


If that is not supported in the simulator, it will be very inconvenient to develop immersive AR apps before the Vision Pro is available. Hope Apple fixes this ASAP...

Still missing in Xcode 15b3 (15A5195k).

Still missing in Beta 5. This has completely blocked development for our team.

  • Same. What is everyone doing for a workaround? I am thinking we could make some careful progress with iOS ARKit until we have a device, and then port. (A rough sketch of what I mean follows this list.)

  • The only workaround I could think of was requesting a dev kit. iOS and visionOS have enough differences in ARKit that I'm not sure it's worth dealing with for a hobby project.

    If I don't get a dev kit, or the simulator doesn't get updated, I'm probably SOL for a few years.
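
Something like this sketch is what's in mind above, assuming iOS ARKit (the type and handler names are made up): keep the plane-handling logic isolated so the iOS-specific parts can be swapped for PlaneDetectionProvider.anchorUpdates once hardware is available.

    import ARKit

    // Prototype plane-dependent logic against iOS ARKit for now. Only this
    // class touches iOS-specific types, which should keep the later port cheap.
    final class PlanePrototype: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            let config = ARWorldTrackingConfiguration()
            config.planeDetection = [.horizontal, .vertical]
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for plane in anchors.compactMap({ $0 as? ARPlaneAnchor }) {
                // Route detections through shared logic that doesn't know
                // which platform's ARKit produced them.
                handleDetectedPlane(transform: plane.transform)
            }
        }

        func handleDetectedPlane(transform: simd_float4x4) {
            // Platform-neutral handling lives here.
        }
    }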


Also filed this. I've just about run out of things I can develop and feel I need to begin coding interactions with planes and the persistence of world anchors. https://feedbackassistant.apple.com/feedback/12639395

Hope the simulator gets this and I don't have to wait several years to continue.

Still missing in Beta 6.

  • ...and Beta 8 (15A5229m), which ships with visionOS 1.0 (21N5233f).


Also found this and reported it as an FB.

As of Xcode 15 Beta 8, all of the existing ARKitSession data providers report as unsupported in the visionOS simulator. Filed this as FB13125675.


It is logical that scene understanding is missing from the visionOS simulator: scene understanding requires sensor data about the physical environment, and a simulator without physical sensors has no such data.

Additionally, it should be noted that the AVP includes cameras and LiDAR, but the raw sensor data is not shared with developers for privacy reasons. What developers get is horizontal and vertical plane information, plus mesh information (MeshAnchor) that ARKit generates by processing the sensor data internally.
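
For context, here is roughly how that processed data is consumed on device (a sketch assuming a Full Space app with world-sensing permission; in the simulator this yields nothing because the provider is unsupported):

    import ARKit

    func observeSceneMesh() async throws {
        let session = ARKitSession()
        let sceneReconstruction = SceneReconstructionProvider()
        try await session.run([sceneReconstruction])

        for await update in sceneReconstruction.anchorUpdates {
            // Each MeshAnchor carries geometry that ARKit built internally
            // from the sensor data; raw camera/LiDAR frames are never exposed.
            print("Mesh anchor \(update.event): \(update.anchor.id)")
        }
    }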

  • The simulator provides an occlusion mesh visualization when that debug option is enabled, so the data appears to be there; it just isn't plumbed through to the data providers.

  • That's a big gap in their AR implementation. There are tons of use cases where you need more than just anchors on planes or meshes.


Although what @JoonAhn says is correct, what the visionOS simulator needs to do is provide "simulated" plane information matching the environments that Apple ships with it. This should not be too difficult, and it would remove the need for devs to invest in expensive hardware just to create apps for the platform.

  • Yes. Let's not make excuses for a multi-trillion-dollar company.


It's Xcode Version 15.1 beta (15C5028h) and visionOS Simulator Version 15.1 (1018), and it's still the same. No data provider other than WorldTrackingProvider works in the simulator. It's kind of funny that the mesh data and camera position are already known to the simulator and could populate ARKit's data providers very easily (with even more accuracy than the real world), but who knows, it might be too hard /s.
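
For now I just gate everything on isSupported so the same code runs in the simulator and (eventually) on device. A sketch, with a helper name of my own:

    import ARKit

    // Build the provider list from whatever the current destination supports.
    // In the simulator this currently yields only WorldTrackingProvider.
    func makeSupportedProviders() -> [any DataProvider] {
        var providers: [any DataProvider] = []
        if WorldTrackingProvider.isSupported {
            providers.append(WorldTrackingProvider())
        }
        if PlaneDetectionProvider.isSupported {
            providers.append(PlaneDetectionProvider(alignments: [.horizontal, .vertical]))
        }
        if SceneReconstructionProvider.isSupported {
            providers.append(SceneReconstructionProvider())
        }
        return providers
    }

    // Usage: try await ARKitSession().run(makeSupportedProviders())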

  • I checked Xcode Version 15.2 (15C500b) with the release version of visionOS, and it's still the same.
