I am trying to implement Hand Tracking in a visionOS application I am developing. Initially, I encountered errors such as "Cannot find type 'ARAnchor' in scope." At first I assumed the error was caused by a missing "import ARKit", but that import was already in the code.
Digging a bit deeper, I found that ARKit is not supported in the simulator, or at least that is my understanding: ARKit requires the hardware of an actual device to operate, since it relies on features like the camera and motion sensors.
Having reached this point, I noticed that the "HappyBeam" sample documentation for visionOS uses the conditional compilation directive "#if !targetEnvironment(simulator)". From what I have read, this conditional is used so that the application can compile both in the simulator (without the ARKit functionality) and on a real device.
After applying "#if !targetEnvironment(simulator)" the project compiles, but now when I run the application in the simulator, the main window shows the message: "AR not supported in Simulator."
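For reference, this is roughly how I'm guarding the ARKit-dependent code after reading the HappyBeam sample (simplified; the HandTrackingModel type and its property names are my own, not from the sample):

```swift
#if !targetEnvironment(simulator)
import ARKit
#endif

/// A minimal sketch of how I wrap hand tracking behind the simulator check.
@MainActor
final class HandTrackingModel {
    #if !targetEnvironment(simulator)
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    #endif

    func startTracking() async {
        #if !targetEnvironment(simulator)
        do {
            // Runs only on a physical device; the simulator never reaches this branch.
            try await session.run([handTracking])
            for await update in handTracking.anchorUpdates {
                // Each update carries a HandAnchor with the latest hand data.
                _ = update.anchor
            }
        } catch {
            print("ARKit session error: \(error)")
        }
        #else
        // This is effectively what I end up with in the simulator.
        print("AR not supported in Simulator.")
        #endif
    }
}
```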
My current question is: Can I simulate Hand Tracking in any way without using ARKit, or is it essential to run the application on a physical device?
And if Hand Tracking can be simulated in "Simulator", how is it done? I haven't been able to find a way to do it.
I am trying to load two immersive spaces and I am getting errors because I am trying to have both open at the same time. Is the only option in this case to create multiple volumetric windows to display the 3D models in the scene? Or is there a way to keep two immersive spaces open at once? As far as I know, there are only three ways to display 3D models: windows, volumetric windows, and immersive spaces.
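For context, this is roughly what I'm attempting (simplified; the app name, scene identifiers, and placeholder views are mine, not my real code):

```swift
import SwiftUI
import RealityKit

@main
struct TwoSpacesApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        // One immersive space per 3D model.
        ImmersiveSpace(id: "SpaceA") {
            RealityView { content in
                // Load the first model into `content` here.
            }
        }
        ImmersiveSpace(id: "SpaceB") {
            RealityView { content in
                // Load the second model into `content` here.
            }
        }
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Show both models") {
            Task {
                let first = await openImmersiveSpace(id: "SpaceA")
                // The second call is where the errors appear, since only one
                // immersive space can be open at a time.
                let second = await openImmersiveSpace(id: "SpaceB")
                print(first, second)
            }
        }
    }
}
```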