As far as I know, there is no public information about Unreal Engine support on visionOS.
Given the bad blood between the companies that may be true for a long time.
My Solution
You can get the location of the tap gesture on an entity with:
let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
You can grab the orientation of the tapped plane with:
let rotation = value.entity.orientation(relativeTo: nil)
I want to track this location so it can be persisted with a world anchor via the WorldTrackingProvider (whenever that starts working). I do so by making a world anchor with:
let pose = Pose3D(position: worldPosition, rotation: rotation)
let worldAnchor = WorldAnchor(originFromAnchorTransform: simd_float4x4(pose))
When you get the anchor from your world tracking listener, you can apply its transform (which contains position and orientation) to an entity like so:
entityForAnchor.transform = Transform(matrix: anchor.originFromAnchorTransform)
If you've got another way, or a way that has a more intuitive use of the API topology please share. This was cobbled together with a ton of trial and error.
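Putting the snippets above together, here is a minimal sketch. The view name, the `worldTracking` provider, and the call to `addAnchor` for persistence are my assumptions; the provider is assumed to already be running in an ARKitSession.

```swift
import SwiftUI
import RealityKit
import ARKit
import Spatial

struct TapToAnchorView: View {
    // Assumed: a WorldTrackingProvider already running in an ARKitSession.
    let worldTracking: WorldTrackingProvider

    var body: some View {
        RealityView { content in
            // … add tappable entities (with CollisionComponent and
            // InputTargetComponent) to `content` here …
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Tap location converted into world (scene) space.
                    let worldPosition: SIMD3<Float> =
                        value.convert(value.location3D, from: .local, to: .scene)
                    // Orientation of the tapped entity in world space.
                    let rotation = value.entity.orientation(relativeTo: nil)
                    // Build a pose and wrap it in a WorldAnchor.
                    let pose = Pose3D(position: worldPosition, rotation: rotation)
                    let worldAnchor = WorldAnchor(
                        originFromAnchorTransform: simd_float4x4(pose))
                    // Persist it so the world tracking listener sees it later.
                    Task { try? await worldTracking.addAnchor(worldAnchor) }
                }
        )
    }
}
```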
Search of what? Are you talking about Spotlight support via voice?
This works in beta 3
Intel or Apple silicon?
Speculation:
Vision Pro allows a stationary user to interact with their application windows via keyboard + mouse/trackpad.
I suspect this setting lets the cursor function as a mouse/trackpad pointer instead of the user's eyes in the visionOS simulator.
Control Center also offers a way to exit immersive scenes.
My feedback says no recent similar reports. FB12639395.
Still not working in beta 4. Still haven't heard back on devkit application.
At the moment there is no workaround other than receiving a dev kit or purchasing a device when it releases. I am in the same boat as you.
I don’t see code in your recent post that invokes the open window command.
However, without an entity to receive the tap, it may not be possible to do what you want.
An empty Immersive View doesn’t have anything to receive the tap. You may need to use the scene reconstruction provider to generate meshes of the world so those meshes can receive the taps. This is not available in the simulator at the moment.
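For reference, a sketch of that approach once the provider works on device. The `session` setup and the idea of giving each mesh collision and input components so it can receive taps follow Apple's scene-reconstruction pattern; the function name and simplified update handling (ignoring removals) are my assumptions.

```swift
import ARKit
import RealityKit

let session = ARKitSession()
let sceneReconstruction = SceneReconstructionProvider()

// Hedged sketch: turn reconstructed world meshes into tappable entities.
func runSceneReconstruction(root: Entity) async throws {
    try await session.run([sceneReconstruction])
    for await update in sceneReconstruction.anchorUpdates {
        let meshAnchor = update.anchor
        // Build a static collision shape from the reconstructed mesh.
        guard let shape = try? await ShapeResource
            .generateStaticMesh(from: meshAnchor) else { continue }
        let entity = Entity()
        entity.transform = Transform(matrix: meshAnchor.originFromAnchorTransform)
        entity.components.set(CollisionComponent(shapes: [shape], isStatic: true))
        // InputTargetComponent is what lets the mesh receive tap gestures.
        entity.components.set(InputTargetComponent())
        root.addChild(entity)
        // (A full implementation would also handle .updated and .removed events.)
    }
}
```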
You can construct an AnchorEntity targeting an image in the app's bundle:
https://developer.apple.com/documentation/realitykit/anchorentity/init(_:trackingmode:)
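A minimal sketch of that initializer, assuming image anchoring is supported on your target. "AR Resources" and "poster" are placeholder names for an AR resource group and a reference image in the app's asset catalog, and `contentEntity` stands in for whatever entity you want tracked.

```swift
import RealityKit

// Anchor content to a reference image bundled with the app.
let imageAnchor = AnchorEntity(.image(group: "AR Resources", name: "poster"),
                               trackingMode: .predicted)
imageAnchor.addChild(contentEntity)  // content follows the tracked image
```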
Correct, in betas 1-3 of visionOS, ARKit data providers like plane detection and scene reconstruction do not work.
You can make an AnchorEntity of type plane and it will find a surface, though.
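A minimal sketch of that plane-targeted AnchorEntity; the `.table` classification and the 20 cm minimum bounds are illustrative choices, and `makeTableAnchor` is a name I made up.

```swift
import RealityKit

// Anchor content to the first detected horizontal surface
// at least 20 cm x 20 cm, classified as a table.
func makeTableAnchor(for model: Entity) -> AnchorEntity {
    let anchor = AnchorEntity(.plane(.horizontal,
                                     classification: .table,
                                     minimumBounds: [0.2, 0.2]))
    anchor.addChild(model)
    return anchor
}
```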
ImmersiveSpaces are not possible per Apple reply here: Link
Not possible per Apple reply here: Link