I have an immersive space with a RealityKit view that runs an ARKitSession to access the main camera frames.
Each frame is processed with custom computer vision algorithms (and deep learning models).
There is a 3D entity in the RealityKit view that I'm trying to place in the world, but I also want to debug my (2D) algorithms in an "attached" view, i.e., display images in regular windows.
How do I send/share data or variables between the views (and spaces)?
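A minimal sketch of one common approach, assuming a standard SwiftUI app structure: a single @Observable model instance is injected into both the 2D window and the immersive space via .environment, so the camera/CV pipeline can write results that either view reads. The names AppModel, DebugView, and ImmersiveView are hypothetical placeholders.

```swift
import SwiftUI
import RealityKit
import Observation

// Hypothetical shared model: one instance holds whatever the CV pipeline
// produces (e.g. a debug image) plus any state the 3D content needs.
@Observable
final class AppModel {
    var debugImage: CGImage?                      // latest 2D result for the window
    var entityTransform: Transform = .identity    // example shared 3D state
}

@main
struct MyApp: App {
    @State private var appModel = AppModel()

    var body: some Scene {
        // 2D window for debugging the vision algorithms.
        WindowGroup(id: "debug") {
            DebugView()
                .environment(appModel)
        }

        // Immersive space with the RealityKit content.
        ImmersiveSpace(id: "immersive") {
            ImmersiveView()
                .environment(appModel)
        }
    }
}

struct DebugView: View {
    @Environment(AppModel.self) private var appModel

    var body: some View {
        if let image = appModel.debugImage {
            Image(decorative: image, scale: 1.0)
        } else {
            Text("Waiting for frames…")
        }
    }
}

struct ImmersiveView: View {
    @Environment(AppModel.self) private var appModel

    var body: some View {
        RealityView { content in
            // Add/update entities here; write results back to appModel
            // so the debug window can display them.
        }
    }
}
```

The same model object can be handed to whatever task runs the ARKitSession, so frame-processing results flow to both the window and the immersive space without any view-to-view messaging.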
How to find main (left) camera transform from world anchor? (Enterprise API)
From CameraFrameProvider() I can get a frame sample that has an "extrinsics" parameter. How is it defined, i.e., relative to what point/anchor?
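For reference, a hedged sketch of how one might compose the extrinsics with the device pose to get a world-space camera transform. It assumes (and this is exactly the point to verify in the documentation) that extrinsics is the device-from-camera transform, i.e., the camera's pose relative to the device anchor; if it is actually camera-from-device, invert it before composing. queryDeviceAnchor(atTimestamp:) and originFromAnchorTransform come from WorldTrackingProvider/DeviceAnchor.

```swift
import ARKit
import QuartzCore
import simd

// ASSUMPTION: `extrinsics` is device-from-camera (camera pose relative to the
// device anchor). If it turns out to be camera-from-device, use
// `extrinsics.inverse` instead.
func worldFromCamera(
    extrinsics: simd_float4x4,              // from the camera frame sample's parameters
    worldTracking: WorldTrackingProvider    // a running WorldTrackingProvider
) -> simd_float4x4? {
    // Pose of the device in the ARKit world (origin) coordinate space.
    guard let deviceAnchor = worldTracking.queryDeviceAnchor(
        atTimestamp: CACurrentMediaTime()
    ) else { return nil }

    let worldFromDevice = deviceAnchor.originFromAnchorTransform

    // world-from-camera = world-from-device * device-from-camera (assumed)
    return worldFromDevice * extrinsics
}
```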