Image tracking in Apple Vision Pro
ISSUE: In our code we use the ImageTrackingProvider and ARKit in the same way as the sample code in Apple's documentation: https://developer.apple.com/documentation/visionos/tracking-images-in-3d-space However, when the application runs and we move the image in real space, the ImageTrackingProvider sends updates at a very low rate (about one update per second!) on a real Vision Pro device (please see the attached video). According to WWDC23 (https://developer.apple.com/videos/play/wwdc2023/10091), image anchors are updated automatically by the system as soon as they are available and are not dependent on camera frame rates. Why, then, is this happening?

We also tried to create an ImageAnchor with Reality Composer Pro, in order to build a scene around it and check whether we could get a better tracking speed and update rate. However, we found that Reality Composer Pro does not support image anchors, unlike its predecessor Reality Composer. We then created the ImageAnchor in a Reality Composer project and tried to import that project/scene into our visionOS app, but the build fails with an incompatibility message: “RealityKitContent - Tool terminated by signal 'Bus error: 10’”. Other Reality Composer projects that do not contain image anchors import without any problems.

We also looked for a frame rate setting on the Vision Pro device itself (perhaps a battery-saving option), but couldn't find one. Finally, we tried changing the asynchronous Tasks in our code to synchronous ones, but that didn't solve the problem either.

Since the image detection and tracking code runs perfectly on iOS devices, and we want to build our apps as pure immersive-space visionOS projects, what else can we do to get the same efficiency and performance as on iOS?