I am developing an immersive visionOS app based on RealityKit and SwiftUI. The app has ModelEntities that have a PerspectiveCamera entity as a child. I want to display the camera's view in a 2D window in visionOS.
I create the camera and add it to the entity with:
let cameraEntity = PerspectiveCamera()
cameraEntity.camera.far = 10000
cameraEntity.camera.fieldOfViewInDegrees = 60
cameraEntity.camera.near = 0.01
entity.addChild(cameraEntity)
My app is not AR; the immersive view is generated programmatically. On iOS, I could use an ARView with the .nonAR camera mode. However, ARView is not available in visionOS.
How can I show the camera view in a SwiftUI 2D window in the immersive space?
Hey @Reinhard_Maenner,
"""
- I cannot imagine that one can define a camera without a way to get access to its output. But if this is currently really the case, please confirm it, and I will write a bug report.
"""
That is in fact the case; please file an enhancement request using Feedback Assistant to request the functionality you are looking for.
"""
- What is an existing Metal workflow (I don't have experience in Metal programming).
"""
RealityRenderer renders its scene to a MTLTexture. You can use that MTLTexture in an "existing Metal workflow", whether that be to use as a source for a DrawableQueue or a LowLevelTexture in RealityKit, or as the presentation source for a CAMetalLayer.
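As one deliberately simple illustration of such a workflow (a sketch of mine, not from the framework documentation): you can read the texture back to the CPU as a CGImage, which a SwiftUI Image can display. This assumes an rgba8Unorm texture whose storage mode permits CPU reads (i.e. not .private), and that the render that wrote it has already completed:

import Metal
import CoreGraphics

// Sketch only: read an rgba8Unorm MTLTexture back to the CPU as a CGImage.
func makeCGImage(from texture: MTLTexture) -> CGImage? {
    let width = texture.width
    let height = texture.height
    let bytesPerRow = width * 4
    var pixels = [UInt8](repeating: 0, count: bytesPerRow * height)
    texture.getBytes(&pixels,
                     bytesPerRow: bytesPerRow,
                     from: MTLRegionMake2D(0, 0, width, height),
                     mipmapLevel: 0)
    return pixels.withUnsafeMutableBytes { buffer in
        CGContext(data: buffer.baseAddress,
                  width: width,
                  height: height,
                  bitsPerComponent: 8,
                  bytesPerRow: bytesPerRow,
                  space: CGColorSpaceCreateDeviceRGB(),
                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)?.makeImage()
    }
}

A SwiftUI view can then show the result with Image(decorative: cgImage, scale: 1). Per-frame CPU readback is slow, so treat this as a proof of concept; the GPU paths above (DrawableQueue, LowLevelTexture, CAMetalLayer) avoid the round trip.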
"""
- How do I set up the RealityRenderer
"""
Here is a minimal code example I had on hand:
import Metal
import Observation
import RealityKit

@Observable
@MainActor
final class OffscreenRenderModel {
    private let renderer: RealityRenderer
    private let colorTexture: MTLTexture

    init(scene: Entity) throws {
        renderer = try RealityRenderer()
        renderer.entities.append(scene)

        // Create a camera for the renderer and add it to the renderer's entities.
        let camera = PerspectiveCamera()
        renderer.activeCamera = camera
        renderer.entities.append(camera)

        // Offscreen color texture the renderer will draw into.
        let textureDesc = MTLTextureDescriptor()
        textureDesc.pixelFormat = .rgba8Unorm
        textureDesc.width = 512
        textureDesc.height = 512
        textureDesc.usage = [.renderTarget, .shaderRead]

        let device = MTLCreateSystemDefaultDevice()!
        colorTexture = device.makeTexture(descriptor: textureDesc)!
    }

    func render() throws {
        let cameraOutputDesc = RealityRenderer.CameraOutput.Descriptor.singleProjection(colorTexture: colorTexture)
        let cameraOutput = try RealityRenderer.CameraOutput(cameraOutputDesc)
        try renderer.updateAndRender(deltaTime: 0.1, cameraOutput: cameraOutput, onComplete: { renderer in
            guard let colorTexture = cameraOutput.colorTextures.first else { fatalError() }
            // The colorTexture holds the rendered scene.
        })
    }
}
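For completeness, a hypothetical usage sketch; sceneRoot stands in for the root Entity of your programmatically generated scene:

let model = try OffscreenRenderModel(scene: sceneRoot)  // sceneRoot: assumed scene root
try model.render()
// The rendered frame is delivered in the onComplete closure above; hand the
// texture from there to whatever presentation path you choose.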
""" how do I use its output to display it in a 2D SwiftUI window in a immersive space? """ You can use the colorTexture in conjunction with a CAMetalLayer.
Lastly, given that you stated you are new to Metal, I strongly recommend reading over some of the Metal documentation before you dive into this implementation, specifically the Essentials, GPU Devices, Resources, and Presentation sections.