Discuss Spatial Computing on Apple Platforms.

Post · Replies · Boosts · Views · Activity

How to display a RealityKit Perspective Camera View in a visionOS SwiftUI 2D window?
I am developing an immersive visionOS app based on RealityKit and SwiftUI. The app has ModelEntities that each have a PerspectiveCamera entity as a child, and I want to display that camera's view in a 2D window in visionOS. I create the camera and add it to the entity like this:

let cameraEntity = PerspectiveCamera()
cameraEntity.camera.far = 10000
cameraEntity.camera.fieldOfViewInDegrees = 60
cameraEntity.camera.near = 0.01
entity.addChild(cameraEntity)

My app is not AR; the immersive view is generated programmatically. On iOS I could use an ARView in non-AR camera mode, but ARView is not available in visionOS. How can I show the camera view in a SwiftUI 2D window while the immersive space is open?
Replies: 5 · Boosts: 0 · Views: 441 · Aug ’24
AVCaptureMovieFileOutput DOES NOT support spatial video capture on iPhone 15 Pro running iOS 18.1?
Hi everyone, I am having trouble implementing spatial video recording to files by following the WWDC24 session "Build compelling spatial photo and video experiences". Specifically, the isSpatialVideoCaptureSupported flag of AVCaptureMovieFileOutput returns FALSE when the code is run on both my physical iPhone 15 Pro (iOS 18.1) and the simulator (iOS 18.0). This is the code I am running:

let movieFileOutput = AVCaptureMovieFileOutput()
print("movieCapture output isSpatialVideoCaptureSupported: \(movieFileOutput.isSpatialVideoCaptureSupported)")

However, one of the formats of the AVCaptureDevice does report TRUE for isSpatialVideoCaptureSupported:

for format in currentDevice.formats {
    if format.isSpatialVideoCaptureSupported {
        print("isSpatialVideoCaptureSupported is true")
        break
    }
}

I am totally confused now: why DOES the camera device support spatial mode while the movie file output DOES NOT? Can someone please help? Really appreciate it!

Testing environment:
iPhone 15 Pro, iOS 18.1 (US version)
Xcode 16.0 beta (16A5171c)
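For context, a pattern shown in the WWDC24 session is to check the output's flag only after the movie file output has been added to a session whose device and active format support spatial video. The sketch below illustrates that ordering; the device type, format selection, and configuration order are assumptions, not a confirmed fix for this report.

import AVFoundation

// Hypothetical setup sketch: the output's isSpatialVideoCaptureSupported flag
// reflects the session/device it is attached to, so configure those first.
func makeSpatialCaptureSession() throws -> (AVCaptureSession, AVCaptureMovieFileOutput)? {
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                               for: .video,
                                               position: .back) else { return nil }

    let session = AVCaptureSession()
    session.beginConfiguration()

    let input = try AVCaptureDeviceInput(device: device)
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    // Pick a device format that advertises spatial video support (assumption:
    // such a format exists on this device).
    if let spatialFormat = device.formats.first(where: { $0.isSpatialVideoCaptureSupported }) {
        try device.lockForConfiguration()
        device.activeFormat = spatialFormat
        device.unlockForConfiguration()
    }

    let output = AVCaptureMovieFileOutput()
    guard session.canAddOutput(output) else { return nil }
    session.addOutput(output)

    // Only meaningful once the output is connected to a spatial-capable device/format.
    if output.isSpatialVideoCaptureSupported {
        output.isSpatialVideoCaptureEnabled = true
    }

    session.commitConfiguration()
    return (session, output)
}

Checking the flag on a freshly created, unattached AVCaptureMovieFileOutput (as in the snippet above) is expected to return false, since it has no connected device to be spatial-capable with.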
Replies: 1 · Boosts: 0 · Views: 343 · Aug ’24
How to hide a window on Vision Pro?
When I use openWindow to show a volume window, I want to hide the old window but keep it where it is, so that when I close the volume window the old window appears again. The logic is simple, but when I tried using opacity to hide the old window, the window bar was still visible, which is very annoying. How can I solve this?
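One possible workaround sketch (the window IDs are hypothetical, and a dismissed window's position is not preserved, which may not match the "keep it where it is" requirement): dismiss the 2D window when the volume opens and reopen it when the volume disappears.

import SwiftUI

struct VolumeLauncherView: View {
    @Environment(\.openWindow) private var openWindow
    @Environment(\.dismissWindow) private var dismissWindow

    var body: some View {
        Button("Show volume") {
            openWindow(id: "volume")   // hypothetical volume window ID
            dismissWindow(id: "main")  // hide the 2D window while the volume is up
        }
    }
}

struct VolumeContentView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Text("Volume content")
            .onDisappear {
                // Bring the 2D window back when the volume closes.
                openWindow(id: "main")
            }
    }
}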
Replies: 2 · Boosts: 1 · Views: 335 · Aug ’24
API Call from inside Vision Pro
I'm trying to hit an API URL, but I am getting these errors:

(501) Invalidation handler invoked, clearing connection
(501) personaAttributesForPersonaType for type:0 failed with error Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mobile.usermanagerd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mobile.usermanagerd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
Received port for identifier response: <(null)> with error:Error Domain=RBSServiceErrorDomain Code=1 "Client not entitled" UserInfo={RBSEntitlement=com.apple.runningboard.process-state, NSLocalizedFailureReason=Client not entitled, RBSPermanent=false}
elapsedCPUTimeForFrontBoard couldn't generate a task port

This is what my info.plist looks like -

This is the code I'm using to hit the URL and get a response:

func sendMessage() {
    guard let url = URL(string: "https://API_URL") else { return }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body = ["query": message] // creates a dictionary with a key:value pair
    request.httpBody = try? JSONSerialization.data(withJSONObject: body) // converts the dictionary to JSON data and sets it as the request body

    isLoading = true
    response = ""

    URLSession.shared.dataTask(with: request) { data, _, error in // starts an async task that sends the request
        DispatchQueue.main.async { // updates the UI on the main thread
            isLoading = false
        }
        if let data = data, error == nil, // checks that data was received and there is no error
           let json = try? JSONSerialization.jsonObject(with: data) as? [String: String] { // parses the JSON response
            DispatchQueue.main.async {
                response = json["response"] ?? "No response"
            }
        }
    }.resume() // starts the network request
}

Can anyone help me understand what these errors are and why I'm not able to get the response back?
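For reference, the usermanagerd/RunningBoard messages quoted above often appear as unrelated system log noise; surfacing the actual URLSession error is usually more informative. A minimal async/await sketch of the same request (the API_URL placeholder, the message parameter, and the [String: String] response shape are carried over from the snippet above as assumptions):

import Foundation

// Hypothetical async/await variant of sendMessage(); not a diagnosis of the log output.
func sendMessage(_ message: String) async -> String {
    guard let url = URL(string: "https://API_URL") else { return "Bad URL" }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["query": message])

    do {
        let (data, _) = try await URLSession.shared.data(for: request)
        let json = try? JSONSerialization.jsonObject(with: data) as? [String: String]
        return json?["response"] ?? "No response"
    } catch {
        // The transport error here is usually more useful than the console noise.
        return "Request failed: \(error.localizedDescription)"
    }
}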
Replies: 1 · Boosts: 0 · Views: 274 · Aug ’24
Incorrect playback of video with Alpha in AVPlayerViewController
We appear to be experiencing a bug with the latest visionOS beta: we are attempting to play back a video with a transparent background in the app. In the previous beta, playback worked as expected and the transparent parts of the video were transparent. In the latest beta the background appears black. The view we are using is a SwiftUI-wrapped version of AVPlayerViewController. We have narrowed the bug down to occurring only when playback is presented in the embedded experience mode; if playback is done in the expanded experience, it works as expected. The issue has only been visible on an actual device; we have been unable to reproduce the behaviour in the simulator using the latest Xcode 16.0 beta (beta 5, 16A5221g). This is a sample project that shows off the bug
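For context, a minimal sketch of the kind of SwiftUI wrapper the report describes (the type name, player setup, and URL are assumptions, not the reporter's actual project):

import SwiftUI
import AVKit

// Minimal UIViewControllerRepresentable wrapper around AVPlayerViewController.
struct PlayerView: UIViewControllerRepresentable {
    let url: URL

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = AVPlayer(url: url)
        controller.player?.play()
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
        // No dynamic updates needed for this sketch.
    }
}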
Replies: 1 · Boosts: 0 · Views: 425 · Aug ’24
Close Home button
In visionOS 2, there is a feature that lets users raise their hand to display the home button. However, this functionality conflicts with the hand interaction required in the mixed display space used within my application. I am therefore looking for a way to disable it.
Replies: 1 · Boosts: 0 · Views: 328 · Aug ’24
ImmersiveSpaceContent has no onAppear/onDisappear callbacks
AFAIK there's no way to programmatically detect when an ImmersiveSpaceContent is dismissed by the user (i.e. by pressing the home button). By comparison, a view-based ImmersiveSpace has .onAppear() and .onDisappear():

ImmersiveSpace(id: appModel.immersiveSpaceID) {
    ImmersiveView()
        .environment(appModel)
        .onAppear {
            appModel.immersiveSpaceState = .open
        }
        .onDisappear {
            appModel.immersiveSpaceState = .closed
        }
}

In comparison:

// No similar callbacks here:
struct MyImmersiveSpace: ImmersiveSpaceContent {
    var body: CompositorLayer { /* ... */ }
}
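One workaround sketch, assuming a CompositorServices render loop (the thread setup and commented-out calls are hypothetical): the LayerRenderer's state moves to .invalidated when the space is dismissed, which can stand in for an onDisappear callback.

import SwiftUI
import Foundation
import CompositorServices

struct MyImmersiveSpaceSketch: ImmersiveSpaceContent {
    var body: CompositorLayer {
        CompositorLayer { layerRenderer in
            let renderThread = Thread {
                while true {
                    switch layerRenderer.state {
                    case .paused:
                        layerRenderer.waitUntilRunning()
                    case .invalidated:
                        // The space was dismissed -- roughly equivalent to onDisappear.
                        // appModel.immersiveSpaceState = .closed
                        return
                    default:
                        // .running: draw a frame here (app-specific).
                        break
                    }
                }
            }
            renderThread.name = "Render Thread"
            renderThread.start()
        }
    }
}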
Replies: 1 · Boosts: 0 · Views: 350 · Aug ’24
RealityView in macOS, Skybox, and lighting issue
I am testing RealityView on a Mac, and I am having trouble controlling the lighting. I initially add a red cube, and everything is fine (see Figure 1). I then activate a skybox with a star field; the star field appears, and the red cube is now lit only by the star field. Then I deactivate the skybox, expecting the original lighting to return, but the cube continues to be lit by the skybox. The background no longer shows the skybox, but the cube is never lit the way it originally was. Is there a way to return the model's lighting to what it was before I added the skybox? I seem to recall that ARView's environment property had both a lighting.resource and a background, but I don't see both of those properties in RealityViewCameraContent's environment.

Sample code for macOS 15.1 Beta (24B5024e), Xcode 16.0 beta (16A5171c):

struct MyRealityView: View {
    @Binding var isSwitchOn: Bool
    @State private var blueNebulaSkyboxResource: EnvironmentResource?

    var body: some View {
        RealityView { content in
            // Create a red cube 10 cm on a side
            let mesh = MeshResource.generateBox(size: 0.1)
            let simpleMaterial = SimpleMaterial(color: .red, isMetallic: false)
            let model = ModelComponent(
                mesh: mesh,
                materials: [simpleMaterial]
            )
            let redBoxEntity = Entity()
            redBoxEntity.components.set(model)
            content.add(redBoxEntity)

            // Load skybox
            let blueNeb2Name = "BlueNeb2"
            blueNebulaSkyboxResource = try? await EnvironmentResource(named: blueNeb2Name)
        } update: { content in
            if (blueNebulaSkyboxResource != nil) && (isSwitchOn == true) {
                content.environment = .skybox(blueNebulaSkyboxResource!)
            } else {
                content.environment = .default
            }
        }
        .realityViewCameraControls(CameraControls.orbit)
    }
}

Figure 1 (default lighting before adding the skybox):
Figure 2 (after activating the skybox with the star field; the cube is lit by / reflects the skybox):
Figure 3 (after removing the skybox by setting content.environment to .default, the cube still reflects the skybox; it is hard to see):
Replies: 1 · Boosts: 0 · Views: 383 · Aug ’24
Assistance Needed with Spatial Playback of 180° 3D Immersive Videos
We’re looking to extend the capabilities of our Apple Vision Pro app to properly support the spatial playback of 180° 3D immersive videos. Currently, when these videos are played back, they are projected onto the entire 360° sphere, which results in a distorted and less-than-optimal experience for the user. Our goal is to ensure that the 180° video content is correctly displayed within the horizontal hemisphere only, rather than across the full sphere. We’re unsure of the best approach to achieve this and would greatly appreciate your guidance. Would it be possible for your team to review our code and provide us with the necessary steps or adjustments needed to achieve the desired playback results? Case-ID: 8729125 Thank you for your assistance.
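One common approach, sketched below as an assumption rather than a review of the code referenced in the case, is to build an inward-facing half-sphere mesh and map the video onto it with a VideoMaterial, so the content occupies only the front 180° rather than the full sphere. The segment counts, radius, and stereo handling are placeholders.

import RealityKit
import AVFoundation

// Hypothetical 180° dome: an inward-facing hemisphere with equirectangular-style UVs.
// Side-by-side or MV-HEVC stereo handling is out of scope for this sketch.
func makeHalfSphereVideoEntity(radius: Float = 10,
                               slices: Int = 60,
                               stacks: Int = 30,
                               player: AVPlayer) throws -> ModelEntity {
    var positions: [SIMD3<Float>] = []
    var normals: [SIMD3<Float>] = []
    var uvs: [SIMD2<Float>] = []
    var indices: [UInt32] = []

    // Longitude spans -90°...+90° (front hemisphere); latitude spans -90°...+90°.
    for stack in 0...stacks {
        let v = Float(stack) / Float(stacks)
        let latitude = (v - 0.5) * .pi
        for slice in 0...slices {
            let u = Float(slice) / Float(slices)
            let longitude = (u - 0.5) * .pi
            let x = cos(latitude) * sin(longitude)
            let y = sin(latitude)
            let z = -cos(latitude) * cos(longitude) // dome opens toward -Z, in front of the viewer
            positions.append(SIMD3(x, y, z) * radius)
            normals.append(-SIMD3(x, y, z))         // normals point inward, toward the camera
            uvs.append(SIMD2(u, v))
        }
    }

    // Two triangles per quad, wound so the inside faces are visible from the origin.
    let columns = UInt32(slices + 1)
    for stack in 0..<UInt32(stacks) {
        for slice in 0..<UInt32(slices) {
            let a = stack * columns + slice
            let b = a + 1
            let c = a + columns
            let d = c + 1
            indices += [a, b, c, b, d, c]
        }
    }

    var descriptor = MeshDescriptor(name: "videoDome")
    descriptor.positions = MeshBuffer(positions)
    descriptor.normals = MeshBuffer(normals)
    descriptor.textureCoordinates = MeshBuffer(uvs)
    descriptor.primitives = .triangles(indices)

    let mesh = try MeshResource.generate(from: [descriptor])
    let material = VideoMaterial(avPlayer: player)
    return ModelEntity(mesh: mesh, materials: [material])
}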
Replies: 2 · Boosts: 0 · Views: 322 · Aug ’24
Is there a way to make an .objcap file from a .USDZ file
I have designed a 3D object and exported it as a USDZ. I also 3D printed the object. I want to use the object as a 3D trigger for an AR experience I am building. My question is: is there a process that would let me take the 3D .usdz file and convert it to an .arobject or an .objcap medium/low-density point cloud to use as an AR trigger? Because I do have the 3D print of the object, I did use the "scan" option when setting up my scene, but the resolution/fidelity seems really low and the results I get are just mediocre. I would love to take the 3D USDZ I already have and use it to generate a file that can be used as a 3D trigger. Is this possible, or is there a process to do this?

I am able to take the 3D scan I make in Reality Composer (which is exported as an .objcap file), send it to Reality Converter on my Mac, and make a USDZ from it. I am looking for a way to go the other way: .usdz > .objcap or .arobject.

I am trying to make an experience that mimics projection mapping, but in AR. I have a 3D object I built and textured in Substance Painter, and I also printed this object in a base gray color. I want to use the 3D print of the object as an AR trigger that starts a scene placing/overlaying/projection-mapping the textured 3D model over the gray 3D-printed model. Ideally the mapped 3D model would be spatially attached to the 3D print and move with it when the object is handled.
Replies: 1 · Boosts: 0 · Views: 499 · Aug ’24
Immersive experience from sample
Hi, we are currently building an app for immersive experiences of our custom content, displayed from a video on custom geometry in the immersive space on the Vision Pro. I have enabled the AVPlayerViewController system controls that detach when entering the immersive space, as in this sample: https://developer.apple.com/documentation/visionos/building-an-immersive-media-viewing-experience For our case, we do not need the 2D screen showing after entering the immersive space, only the environment. So my question is: how can we remove the screen with the video while keeping the controls, like the Apple TV app does for immersive experiences? Thanks in advance
Replies: 2 · Boosts: 0 · Views: 452 · Aug ’24
Collision Detection after Object Tracking
So I am tracking 2 objects in my scene and spawning a tiny arrow on each of them (this part is working as intended). Inside my scene I have added Collision Components and Physics Body Components to each of the arrows. I want to detect when a collision occurs between the 2 arrow entities. I have made the collision boxes big enough that they should definitely be overlapping, yet I am not able to detect when a collision occurs. This is the code I use for the scene:

import SwiftUI
import RealityKit
import RealityKitContent

struct DualObjectTrackingTest: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            if let immersiveContentEntity = try? await Entity(named: "SceneFind.usda", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                print("Collision check started")
            }
        } update: { content in
            if let arrow = content.entities.first?.findEntity(named: "WhiteArrow") as? ModelEntity {
                let subscription = content.subscribe(to: CollisionEvents.Began.self, on: arrow) { collisionEvent in
                    print("Collision has occured")
                }
            }
        }
    }
}

All I see in my console logs is "Collision check started", and even when I move the 2 objects close enough that the collision boxes overlap, I don't see any updates in the logs. Can anyone give me some further guidance/resources on this? Thanks again!
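One likely issue (an assumption, not a confirmed diagnosis): the subscription above is created inside the update closure and stored only in a local constant, so the EventSubscription is never retained, and update may also run before the arrow exists. A sketch that subscribes once, after the scene loads, and keeps the subscription alive in the @State property:

import SwiftUI
import RealityKit
import RealityKitContent

struct DualObjectTrackingTest: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            if let immersiveContentEntity = try? await Entity(named: "SceneFind.usda", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)

                // Subscribe once the entity is in the scene and retain the token;
                // otherwise the subscription is deallocated immediately.
                if let arrow = immersiveContentEntity.findEntity(named: "WhiteArrow") {
                    subscription = content.subscribe(to: CollisionEvents.Began.self, on: arrow) { event in
                        print("Collision has occurred between \(event.entityA.name) and \(event.entityB.name)")
                    }
                }
                print("Collision check started")
            }
        }
    }
}

Collision events also depend on the entities' collision filters and physics body modes; for entities moved programmatically by object tracking, a kinematic physics body mode is typically what produces CollisionEvents.Began.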
Replies: 5 · Boosts: 0 · Views: 547 · Aug ’24