Hi, I have a Metal-based app that runs on visionOS.
It has a huge sphere mesh and renders video output (from AVPlayer) onto it.
What I want to do is render the left half of each video frame to the left eye and the right half to the right eye.
In my fragment shader, I think I need to know whether the current invocation is for the left eye or the right eye. (I'm not using MV-HEVC encoded video, just plain HEVC.)
So what I currently do is assume that 'amplification_id' identifies which eye is being rendered, but I'm not sure this is correct.
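Here's a stripped-down sketch of what I mean, assuming amplification_id 0/1 corresponds to the left/right order of the drawable's views (something I'd want to verify against cp_drawable_get_view) and assuming a layered render target; the buffer index, attribute layout, and uniform struct are placeholders:

```metal
#include <metal_stdlib>
using namespace metal;

struct VertexIn {
    float3 position [[attribute(0)]];
    float2 texCoord [[attribute(1)]];
};

struct VertexOut {
    float4 position  [[position]];
    float2 texCoord;
    uint   viewIndex [[flat]];                      // 0 = left eye, 1 = right eye (assumed)
    uint   layer     [[render_target_array_index]]; // route each amplified vertex to its eye's slice
};

// Per-view matrices supplied by the app, indexed by amplification_id.
struct Uniforms {
    float4x4 mvp[2];
};

vertex VertexOut sphereVertex(VertexIn in                 [[stage_in]],
                              constant Uniforms &uniforms [[buffer(1)]],
                              uint ampID                  [[amplification_id]])
{
    VertexOut out;
    out.position  = uniforms.mvp[ampID] * float4(in.position, 1.0);
    out.texCoord  = in.texCoord;
    out.viewIndex = ampID;
    out.layer     = ampID;
    return out;
}

fragment float4 sphereFragment(VertexOut in           [[stage_in]],
                               texture2d<float> video [[texture(0)]],
                               sampler s              [[sampler(0)]])
{
    // Side-by-side frame: left eye samples x in [0, 0.5], right eye samples x in [0.5, 1].
    float2 uv = float2(in.texCoord.x * 0.5 + 0.5 * float(in.viewIndex), in.texCoord.y);
    return video.sample(s, uv);
}
```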
Discover Metal for immersive apps
Posts under the wwdc2023-10089 tag, discussing the WWDC23 session "Discover Metal for immersive apps".
How do I create & open an immersive space window scene from a UIKit view or view controller? I need one so that I can use Compositor Services to draw a 3D object with Metal, but this particular GUI is drawn and laid out with UIKit, and it isn't feasible for me to rewrite it in SwiftUI.
I already tried [UIApplication.sharedApplication activateSceneSessionForRequest:[UISceneSessionActivationRequest requestWithRole:UISceneSessionRoleImmersiveSpaceApplication] errorHandler:...], but all that happened was that it opened a new window for the main application scene (UIWindowSceneSessionRoleApplication) instead of the immersive space scene I expected.
Yes, I did create a scene manifest in my app's Info.plist, with a UIWindowSceneSessionRoleApplication scene and a CPSceneSessionRoleImmersiveSpaceApplication scene. Surely there has to be a way to do this without resorting to SwiftUI...
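For reference, here's the Swift version of what I'm attempting, as a minimal sketch. My current assumption is that the activation request should use the Compositor Services role string (the same CPSceneSessionRoleImmersiveSpaceApplication role declared in my Info.plist) rather than the window-scene role, but I haven't confirmed that this is the intended approach:

```swift
import UIKit

// Sketch: request activation of the immersive space scene declared in
// Info.plist. Assumption: the request's role must match the Compositor
// Services role (CPSceneSessionRoleImmersiveSpaceApplication), not the
// main UIWindowSceneSessionRoleApplication role.
func openImmersiveSpaceScene() {
    let immersiveRole = UISceneSession.Role(rawValue: "CPSceneSessionRoleImmersiveSpaceApplication")
    let request = UISceneSessionActivationRequest(role: immersiveRole)
    UIApplication.shared.activateSceneSession(for: request) { error in
        print("Failed to activate immersive space scene: \(error)")
    }
}
```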
The documentation is very sparse
Is there an example app?
Am writing a visual music app. It uses custom SwiftUI menus over a UIKit (UIHostingController) canvas. The Metal pipeline is custom.
A fully functional example app would be super helpful. Nothing fancy; an immersive hello triangle would be a great start — something like the sketch below.
Later, I will need to port UITouches for drawing and CMMotionManager for tracking the POV within a cubemap. Meanwhile, baby steps.
Thanks!
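In the meantime, here's my rough guess at the scaffolding such a hello-triangle app needs, pieced together from the session. It's a sketch only: the app and function names are placeholders, the actual triangle encoding is elided, and the LayerRenderer calls should be double-checked against the current SDK.

```swift
import SwiftUI
import CompositorServices
import Metal

@main
struct HelloTriangleApp: App {
    var body: some Scene {
        ImmersiveSpace {
            CompositorLayer { layerRenderer in
                // Compositor Services hands us a LayerRenderer; drive the
                // frame loop from a dedicated render thread.
                let renderThread = Thread { runRenderLoop(layerRenderer) }
                renderThread.name = "Render Thread"
                renderThread.start()
            }
        }
    }
}

func runRenderLoop(_ layerRenderer: LayerRenderer) {
    let device = MTLCreateSystemDefaultDevice()!
    let commandQueue = device.makeCommandQueue()!

    // Paused-state handling is elided for brevity.
    while layerRenderer.state != .invalidated {
        guard let frame = layerRenderer.queryNextFrame() else { continue }

        frame.startUpdate()
        // ... update app state (animation, input) here ...
        frame.endUpdate()

        guard let timing = frame.predictTiming() else { continue }
        LayerRenderer.Clock().wait(until: timing.optimalInputTime)

        frame.startSubmission()
        guard let drawable = frame.queryDrawable() else { continue }

        let commandBuffer = commandQueue.makeCommandBuffer()!
        // ... encode a render pass targeting drawable.colorTextures[0],
        //     drawing one triangle per view ...
        drawable.encodePresent(commandBuffer: commandBuffer)
        commandBuffer.commit()
        frame.endSubmission()
    }
}
```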
This is a spinoff from the forum post xrOS+Metal example code here
Am porting Warren Moore's example here
Have most of the conversion to Swift complete but am stuck in a couple of places:
ar_data_providers_create_with_data_providers(...)
cp_time_to_cf_time_interval(...)
Any hints on how to track these down? Thanks!
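My current guess at the Swift equivalents is below — assuming ar_data_providers_create_with_data_providers(...) maps to running the providers on an ARKitSession, and cp_time_to_cf_time_interval(...) maps to converting a LayerRenderer.Clock.Instant into seconds — but I haven't verified these names against the headers:

```swift
import ARKit
import CompositorServices
import Foundation

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

// Guessed replacement for ar_data_providers_create_with_data_providers(...)
// plus running the session: the Swift ARKitSession takes providers directly.
func startTracking() async throws {
    try await session.run([worldTracking])
}

// Guessed replacement for cp_time_to_cf_time_interval(...): convert the
// drawable's presentation time into the seconds value that
// queryDeviceAnchor(atTimestamp:) expects.
func deviceAnchor(for drawable: LayerRenderer.Drawable) -> DeviceAnchor? {
    let presentationTime = drawable.frameTiming.presentationTime
    let seconds = LayerRenderer.Clock.Instant.epoch
        .duration(to: presentationTime)
        .timeInterval
    return worldTracking.queryDeviceAnchor(atTimestamp: seconds)
}

extension LayerRenderer.Clock.Instant.Duration {
    // Duration -> seconds (attoseconds are 10^-18 s).
    var timeInterval: TimeInterval {
        TimeInterval(components.seconds) + TimeInterval(components.attoseconds) * 1e-18
    }
}
```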