I want to see the Vision Pro camera view in my application window. I wrote some code based on Apple's samples, but I'm stuck at the CVPixelBuffer. How do I convert a pixel buffer into a video frame?
Button("Camera Feed") {
    Task {
        if #available(visionOS 2.0, *) {
            // Formats the main camera supports for the left eye.
            let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions: [.left])
            let cameraFrameProvider = CameraFrameProvider()
            let arKitSession = ARKitSession()
            await arKitSession.queryAuthorization(for: [.cameraAccess])
            do {
                try await arKitSession.run([cameraFrameProvider])
            } catch {
                return
            }
            guard let cameraFrameUpdates =
                cameraFrameProvider.cameraFrameUpdates(for: formats[0]) else {
                return
            }
            for await cameraFrame in cameraFrameUpdates {
                guard let mainCameraSample = cameraFrame.sample(for: .left) else {
                    continue
                }
                print("=========================")
                print(mainCameraSample.pixelBuffer)
                print("=========================")
                // self.pixelBuffer = mainCameraSample.pixelBuffer
            }
        } else {
            // Fallback on earlier versions
        }
    }
}
I want to convert "mainCameraSample.pixelBuffer" into video. Could you please guide me?
There are a few ways to get this to work. The easiest is to convert each pixel buffer to a CIImage with init(cvPixelBuffer:), display that in your app, and refresh it whenever a new frame arrives.
Since the code above is triggered only on a button press, that approach fits your setup well.
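As a rough sketch of that first approach (the `CameraFeedModel` and `CameraFeedView` names are mine, not Apple API; only `CIImage(cvPixelBuffer:)` and `CIContext.createCGImage(_:from:)` are real framework calls), you could publish each converted frame from an observable model and let SwiftUI redraw:

```swift
import SwiftUI
import CoreImage

// Illustrative model: converts each incoming CVPixelBuffer to a CGImage
// and publishes it so SwiftUI re-renders when a new frame arrives.
@MainActor
final class CameraFeedModel: ObservableObject {
    @Published var latestFrame: CGImage?
    private let ciContext = CIContext()

    func update(with pixelBuffer: CVPixelBuffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        // Render the CIImage into a CGImage that SwiftUI's Image can show.
        if let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) {
            latestFrame = cgImage
        }
    }
}

struct CameraFeedView: View {
    @ObservedObject var model: CameraFeedModel

    var body: some View {
        if let frame = model.latestFrame {
            Image(decorative: frame, scale: 1.0)
                .resizable()
                .aspectRatio(contentMode: .fit)
        } else {
            Text("Waiting for camera…")
        }
    }
}
```

Inside your `for await cameraFrame in cameraFrameUpdates` loop you would then call `await model.update(with: mainCameraSample.pixelBuffer)` instead of printing the buffer. Reusing one `CIContext` matters; creating a context per frame is expensive.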
You can also feed the pixel buffers into an AVAssetWriter and play the resulting video with an AVPlayer.
This is a fair bit more complex, but once set up it will smoothly show your Vision Pro camera feed.