I am trying to record a video from the camera while an ARView is running, and documentation on doing this with an ARView is sparse to nonexistent. I only need the actual camera feed, not the content that my ARView is rendering on top of it. I know I can get a CVPixelBuffer for each frame, but I don't quite understand how to package those buffers up into a video and save it to the app's Documents directory.
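From what I can gather, the raw camera image comes from the session's current ARFrame rather than from the ARView itself, so I assume this is where each buffer would be grabbed (the helper below is just my own placeholder, not an existing API):

import ARKit
import RealityKit

/// Grabs the raw camera image for the current frame, if one is available.
/// ("currentCameraBuffer" is just a placeholder name of mine.)
func currentCameraBuffer(from arView: ARView) -> (buffer: CVPixelBuffer, timestamp: TimeInterval)? {
    guard let frame = arView.session.currentFrame else { return nil }
    // capturedImage is the camera feed only, without the rendered AR content.
    return (frame.capturedImage, frame.timestamp)
}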
I have this code to handle per-frame updates from my ARView in my SwiftUI view:
arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
    if recording {
        // do stuff if the record button is pushed
    }
}.store(in: &subs)
What do I need to do to package all of those CVPixelBuffers up into a video once the stop-recording button is pushed? I'm totally new to this area, so any help is greatly appreciated!
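For context, the direction I've been looking at is an AVAssetWriter fed through an AVAssetWriterInputPixelBufferAdaptor. The sketch below is just my own guess (the VideoRecorder type and its output settings are mine, not from any tutorial), so I'm not sure whether it's right or how best to drive it from the update handler:

import AVFoundation
import CoreVideo

/// My rough sketch of the recording side — "VideoRecorder" is a name I made up,
/// and the output settings are guesses.
final class VideoRecorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var startTime: TimeInterval?

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ]
        input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true

        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: nil
        )

        writer.add(input)
        guard writer.startWriting() else {
            throw writer.error ?? NSError(domain: "VideoRecorder", code: -1)
        }
    }

    /// Append one camera pixel buffer with its ARFrame timestamp (in seconds).
    func append(_ pixelBuffer: CVPixelBuffer, timestamp: TimeInterval) {
        if startTime == nil {
            startTime = timestamp
            writer.startSession(atSourceTime: .zero)
        }
        guard let start = startTime, input.isReadyForMoreMediaData else { return }
        let time = CMTime(seconds: timestamp - start, preferredTimescale: 600)
        if !adaptor.append(pixelBuffer, withPresentationTime: time) {
            print("Failed to append frame: \(String(describing: writer.error))")
        }
    }

    /// Call when recording stops; the completion fires once the file is fully written.
    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}

In the update handler I would presumably call append(_:timestamp:) with each captured buffer while recording is true, and call finish(completion:) when the stop button is pressed, with outputURL pointing somewhere under FileManager.default.urls(for: .documentDirectory, in: .userDomainMask) — but I don't know if that is the intended way to do it.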