Spatial streaming from iPhone
Hi, I am trying to stream spatial video in real time from my iPhone 16. I am able to record spatial video to a file using:

```swift
let videoDeviceOutput = AVCaptureMovieFileOutput()
```

However, when I try to grab the raw sample buffers instead, they don't include any spatial information:

```swift
// when initializing the camera
let captureOutput = AVCaptureVideoDataOutput()
session.addOutput(captureOutput)
captureOutput.setSampleBufferDelegate(self, queue: sessionQueue)

// finally, the delegate callback
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // use the sample buffer (but no spatial data is available here)
}
```

Is this how it's supposed to work, or am I missing something? This video gives a clue about setting up spatial streaming: https://developer.apple.com/videos/play/wwdc2023/10071. I've got the backend ready for 3D HLS streaming; now I am only stuck on how to send the video stream to my server.
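For reference, this is roughly how the working file-output path is set up — a minimal sketch assuming the iOS 18 spatial-capture properties (the dual-wide camera choice here is just an illustration):

```swift
import AVFoundation

// Sketch: spatial capture works on the file-output path via
// isSpatialVideoCaptureEnabled (iOS 18). There is no equivalent switch on
// AVCaptureVideoDataOutput, which is exactly the problem described above.
func configureSpatialFileOutput(session: AVCaptureSession) throws {
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                               for: .video, position: .back),
          device.activeFormat.isSpatialVideoCaptureSupported else {
        return // spatial capture unsupported on this device/format
    }
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }

    let movieOutput = AVCaptureMovieFileOutput()
    if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }
    movieOutput.isSpatialVideoCaptureEnabled = true // records MV-HEVC spatial video
}
```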
Replies: 1 · Boosts: 0 · Views: 321 · Oct ’24
VideoMaterial to display SBS stereoscopic 3D video? [visionOS]
Hi, I love the VideoMaterial API and the power it gives to play video on any mesh. But I am trying to play a side-by-side 3D video using VideoMaterial:

```swift
RealityView { content in
    // generate mesh
    let mesh = MeshResource.generatePlane(width: 300.0, height: 300.0, cornerRadius: 0)

    // VideoMaterial
    let vidMaterial = VideoMaterial(avPlayer: AVPlayer(url: URL(string: "https://someurl/test/master.m3u8")!))
    vidMaterial.controller.preferredViewingMode = .stereo // <-- no idea why it doesn't work for SBS video in the simulator
    vidMaterial.avPlayer?.play()

    // new entity with a ModelComponent using the video material
    let planeEntity = Entity()
    planeEntity.components.set(ModelComponent(mesh: mesh, materials: [vidMaterial]))
    content.add(planeEntity)
}
```

This code works well for plain 2D video playback, but how do I display a side-by-side or top-bottom 3D video? I found the GeometrySwitchCameraIndex node for a custom ShaderGraphMaterial, but if I use an image texture as the input node, how do I pass the video frame into my custom shader as a texture to achieve the 3D effect? Or maybe there is an even better way to deal with this? There is also a .preferredViewingMode API on the VideoMaterial's controller that can be set to .stereo, but it doesn't give any stereo effect. Perhaps it's only for MV-HEVC media playback?
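One direction I'm experimenting with — a sketch, not a confirmed solution: keep the camera-index-switch graph in Reality Composer Pro and feed it video frames through a TextureResource.DrawableQueue. The material and parameter names ("SBSMaterial", "InputTexture", "Immersive.usda", "placeholder") are my own placeholders:

```swift
import RealityKit
import RealityKitContent
import Metal

// Sketch: bind a dynamically updated texture to a custom shader-graph
// material whose graph uses a camera-index switch to sample the left/right
// half of the texture per eye.
func makeStereoPlane() async throws -> Entity {
    var material = try await ShaderGraphMaterial(named: "/Root/SBSMaterial",
                                                 from: "Immersive.usda",
                                                 in: realityKitContentBundle)

    // A drawable queue lets us push video frames (e.g. copied from an
    // AVPlayerItemVideoOutput via Metal each frame) into the texture.
    let descriptor = TextureResource.DrawableQueue.Descriptor(
        pixelFormat: .bgra8Unorm, width: 3840, height: 1080,
        usage: [.renderTarget, .shaderRead], mipmapsMode: .none)
    let drawableQueue = try TextureResource.DrawableQueue(descriptor)

    // "placeholder" is any bundled image; its contents get replaced by the queue.
    let texture = try await TextureResource(named: "placeholder")
    texture.replace(withDrawables: drawableQueue)
    try material.setParameter(name: "InputTexture", value: .textureResource(texture))

    let entity = Entity()
    entity.components.set(ModelComponent(
        mesh: .generatePlane(width: 1.0, height: 0.5),
        materials: [material]))
    return entity
}
```

Filling the queue each frame (nextDrawable(), blit the pixel buffer, present()) is the part I haven't validated yet.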
Replies: 1 · Boosts: 0 · Views: 649 · Jul ’24
TransferRepresentation slow transfer for large video files.
Hi, I notice a very slow transfer rate when I try to import a file picked via .photosPicker. This happens especially when I import a 4K/60fps video.

SwiftUI:

```swift
VStack {
    Button("Pick a video") {
        model.isPhotoPickerView = true
    }
    .photosPicker(isPresented: $model.isPhotoPickerView,
                  selection: $model.selectedImageList,
                  maxSelectionCount: 1,
                  matching: .videos)
    .onChange(of: model.selectedImageList) { old, new in
        model.handlePhotoPicker()
    }
}
```

View model to handle the photo picker action:

```swift
private class PageModel: ObservableObject {
    // other methods

    @MainActor
    public func handlePhotoPicker() {
        if selectedImageList.isEmpty { return }
        guard let item = selectedImageList.first else { return }
        Task {
            do {
                if let video = try await item.loadTransferable(type: VideoTransferable.self) {
                    let file = video.url // video URL arrived
                }
            } catch {
                // handle error
            }
        }
    }
}
```

Now the VideoTransferable:

```swift
struct VideoTransferable: Transferable {
    let url: URL

    static var transferRepresentation: some TransferRepresentation {
        FileRepresentation(contentType: .movie) { video in
            SentTransferredFile(video.url)
        } importing: { received in
            // takes too much time to import a large 4K video recorded on the iPhone's main camera
            let copy = FileManager.documentsDirectory
                .appendingPathComponent(FolderType.temp.rawValue)
                .appendingPathComponent("video_\(Int64(Date().timeIntervalSince1970 * 1000)).MOV")
            if FileManager.default.fileExists(atPath: copy.path) {
                try FileManager.default.removeItem(at: copy)
            }
            try FileManager.default.copyItem(at: received.file, to: copy)
            return Self.init(url: copy)
        }
    }
}
```

To my surprise, this issue doesn't happen when I wrap UIImagePickerController() in a custom UIViewControllerRepresentable and set the picker's videoExportPreset property to AVAssetExportPresetPassthrough. Can someone point out where I am going wrong?
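For what it's worth, the slowdown smells like the picker transcoding the recording on export. A minimal sketch of what I plan to try next: asking Photos for the original encoding via preferredItemEncoding, which should behave like AVAssetExportPresetPassthrough:

```swift
// Sketch: .current asks Photos to hand over the original encoding
// instead of transcoding it, analogous to AVAssetExportPresetPassthrough.
.photosPicker(isPresented: $model.isPhotoPickerView,
              selection: $model.selectedImageList,
              maxSelectionCount: 1,
              matching: .videos,
              preferredItemEncoding: .current)
```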
Replies: 1 · Boosts: 0 · Views: 433 · Jul ’24
UIGraphicsImageRenderer memory issue (iOS 17.5.1)
Testing on an iPhone 12 mini, I have encountered a weird situation. I am trying to take a snapshot of my view, which works fine, but the memory is never released after the snapshot is taken.

```swift
func screenshot(view: UIView, scale: Double) -> URL? {
    guard let containerView = view.superview,
          let containerSuperview = containerView.superview else { return nil }

    let rendererFormat = UIGraphicsImageRendererFormat()
    rendererFormat.scale = scale
    let renderer = UIGraphicsImageRenderer(bounds: containerView.frame, format: rendererFormat)

    let image = autoreleasepool {
        return renderer.image { context in
            // memory hog starts from here
            containerSuperview.drawHierarchy(in: containerSuperview.layer.frame, afterScreenUpdates: true)
        }
    }
    guard let data = image.heicData() else { return nil }
    // more code to save data to a file URL and return it
}
```

Initially it appears to work normally, but as soon as I change the scale:

```swift
rendererFormat.scale = 10
```

I can see a spike in memory, and the problem is that the memory is never released, even after the image is saved. Initially the app uses 35 MB of memory; while processing, usage jumps to an expected 250-300 MB for the large image; after processing, it drops to around 90-120 MB, but it never returns to its original 35 MB state. Is this a bug, or is this expected? If this is expected behaviour, is there any low-level API to free the memory after the job is done?
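One thing worth trying — a sketch, assuming the residue is the bitmap and HEIC buffers outliving the call rather than UIKit's internal caching: scope the renderer, the UIImage, and the encoding all inside a single autoreleasepool so nothing large survives the return:

```swift
import UIKit

// Sketch: keep the renderer, UIImage, and HEIC encoding inside one
// autoreleasepool so the large bitmap can be reclaimed before returning.
func screenshotData(view: UIView, scale: Double) -> Data? {
    autoreleasepool {
        let format = UIGraphicsImageRendererFormat()
        format.scale = scale
        let renderer = UIGraphicsImageRenderer(bounds: view.bounds, format: format)
        let image = renderer.image { _ in
            view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
        }
        return image.heicData()
    }
}
```

If the footprint still doesn't return to baseline, the remainder may be UIGraphicsImageRenderer's internal caching, in which case creating a fresh renderer per snapshot (as above) and profiling the allocations in Instruments would be the next step.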
Replies: 1 · Boosts: 1 · Views: 533 · Jun ’24