Image tracking in Apple Vision Pro

ISSUE: In our code we use ImageTrackingProvider and ARKit much like the sample code in Apple's documentation: https://developer.apple.com/documentation/visionos/tracking-images-in-3d-space However, when the application runs and we move the image in real space, the ImageTrackingProvider sends updates at a very low rate (about one update per second!) on the real Vision Pro device (please see the attached video).

According to the documentation, image anchors are updated automatically by the system as soon as they become available, independently of camera frame rates. So why is this happening?
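For context, our setup is essentially the following minimal sketch, modeled on Apple's sample ("ReferenceImages" stands in for the name of our AR Resource Group; error handling is omitted):

```swift
import ARKit

// Minimal sketch of our setup, modeled on Apple's sample code.
// "ReferenceImages" stands in for the AR Resource Group in our asset catalog.
let session = ARKitSession()
let imageTracking = ImageTrackingProvider(
    referenceImages: ReferenceImage.loadReferenceImages(inGroupNamed: "ReferenceImages")
)

func runImageTracking() async throws {
    guard ImageTrackingProvider.isSupported else { return }
    try await session.run([imageTracking])
    // On the real device, updates arrive here only about once per second.
    for await update in imageTracking.anchorUpdates {
        print("\(update.event): isTracked = \(update.anchor.isTracked)")
    }
}
```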

  • We also tried to create an ImageAnchor with Reality Composer Pro in order to build a scene with it and check whether we could get better tracking speed and updates. However, we found that Reality Composer Pro does not support image anchors, unlike its predecessor Reality Composer! (Our programmatic fallback is sketched after this list.)

  • We also created the ImageAnchor in a Reality Composer project and tried to import the reality project/scene into our visionOS app. However, when the app builds we get an incompatibility error: “RealityKitContent - Tool terminated by signal 'Bus error: 10'”. Other Reality Composer projects that do not contain image anchors are imported without any problems!

  • We also looked for a frame rate setting on the real Vision Pro device (e.g., a battery-saving mode), but we couldn't find any.

  • Finally, we tried changing asynchronous Tasks to synchronous ones in our code, but this did not solve the problem.

  • Since image detection and tracking in our code run perfectly on iOS devices, and we want to build our apps as pure immersive-space visionOS projects, what else can we do to get the same efficiency and performance as on iOS?
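For reference, the programmatic fallback mentioned in the first bullet looks roughly like this: one entity per detected image, with the anchor transform copied on every update (a sketch; `makeMarkerEntity()` stands in for our real scene content):

```swift
import ARKit
import RealityKit

// Sketch of the programmatic fallback to a Reality Composer Pro image anchor:
// keep one entity per detected image and copy the anchor transform on updates.
var entitiesByAnchorID: [UUID: Entity] = [:]

func makeMarkerEntity() -> Entity {
    // Placeholder content; the real app attaches its own model here.
    ModelEntity(mesh: .generateSphere(radius: 0.01))
}

func handleAnchorUpdates(from provider: ImageTrackingProvider, under root: Entity) async {
    for await update in provider.anchorUpdates {
        let anchor = update.anchor
        switch update.event {
        case .added:
            let entity = makeMarkerEntity()
            entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
            root.addChild(entity)
            entitiesByAnchorID[anchor.id] = entity
        case .updated:
            entitiesByAnchorID[anchor.id]?.transform =
                Transform(matrix: anchor.originFromAnchorTransform)
            entitiesByAnchorID[anchor.id]?.isEnabled = anchor.isTracked
        case .removed:
            entitiesByAnchorID[anchor.id]?.removeFromParent()
            entitiesByAnchorID.removeValue(forKey: anchor.id)
        }
    }
}
```

This places content correctly, but in our tests the entities still jump at the same low update rate; it only removes the dependency on Reality Composer (Pro) anchors.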

It seems that for now image tracking is unusable on visionOS, except for anchoring an object somewhere once and for all. You cannot follow the image in real time, and it gets even worse when you try to track multiple images. My guess is that the Apple engineers were very conservative about overall performance, and maybe the NPU is busy analysing the scene. However, we already handle roughly the same workload on iOS with "smaller" chips, so I am hoping for good announcements at WWDC 24.
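The only pattern that behaves acceptably for me is "place once": take the first update for each image and ignore the rest. A rough sketch, with placeholder content:

```swift
import ARKit
import RealityKit

// Rough sketch of the "place once" pattern: use the first update per image
// to position content, then ignore the slow follow-up updates.
func placeOnce(using provider: ImageTrackingProvider, under root: Entity) async {
    var placed: Set<UUID> = []
    for await update in provider.anchorUpdates {
        guard case .added = update.event, !placed.contains(update.anchor.id) else { continue }
        placed.insert(update.anchor.id)
        let entity = ModelEntity(mesh: .generateBox(size: 0.05)) // placeholder content
        entity.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        root.addChild(entity)
    }
}
```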

Yes, I encountered the same issue. May I ask if the official team or your team has found a solution? I would greatly appreciate it.
