Hi all, I am a graduate student looking into making MV-HEVC videos streamable. May I ask whether it is possible to stream MV-HEVC videos over HLS (HTTP Live Streaming)?
I've been trying to use Apple's HLS tools (mediafilesegmenter) to segment a spatial video recorded by VP:
mediafilesegmenter -iso-fragmented -t 4 -f sp_video-1-vp spatial-video-by-vp.MOV
But the output HLS playlist doesn't look like the format that Apple proposes in the WWDC video. For example, the EXT-X-VERSION attribute is 7 instead of 12, and there is no REQ-VIDEO-LAYOUT=CH-STEREO attribute, which should be the key indicator of spatial video.
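For reference, this is roughly the kind of multivariant playlist entry I was expecting based on the WWDC video (the bandwidth, resolution, and codec values below are just placeholders I made up, not from my actual output):

#EXTM3U
#EXT-X-VERSION:12
#EXT-X-STREAM-INF:BANDWIDTH=20000000,RESOLUTION=1920x1080,CODECS="hvc1.2.4.L153.B0",REQ-VIDEO-LAYOUT="CH-STEREO"
video/prog_index.m3u8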
From what the WWDC video showcased, I assume Apple's HLS tools support this, so maybe my usage is just incorrect. Just curious what you all think. Thank you!
Hi everyone, I am having trouble implementing spatial video recording to files by following the WWDC24 video "Build compelling spatial photo and video experiences". Specifically, the isSpatialVideoCaptureSupported flag of AVCaptureMovieFileOutput returns FALSE when the code is tested on both my physical iPhone 15 Pro (iOS 18.1) and the simulator (iOS 18.0).
This is the code that I am running:
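// A freshly created output, not yet added to any AVCaptureSession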
let movieFileOutput = AVCaptureMovieFileOutput()
print("movieCapture output isSpatialVideoCaptureSupported: \(movieFileOutput.isSpatialVideoCaptureSupported)")
However, one of the AVCaptureDevice formats returns TRUE for isSpatialVideoCaptureSupported:
for format in currentDevice.formats {
    if format.isSpatialVideoCaptureSupported {
        print("isSpatialVideoCaptureSupported is true")
        break
    }
}
I am totally confused now: why does the camera device support spatial mode while the movieFileOutput does not? Can someone please help? Really appreciate it!
Here is my testing environment:
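For more context, here is a minimal sketch of how I think the session is supposed to be configured around that output, based on my reading of the WWDC24 session. I am not sure the dual wide camera requirement, the format selection, or the isSpatialVideoCaptureEnabled opt-in are exactly what the sample uses; those parts are my assumptions:

import AVFoundation

let session = AVCaptureSession()
let movieFileOutput = AVCaptureMovieFileOutput()

// Assumption: spatial video capture needs the dual wide camera as the input device.
guard let device = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back),
      let input = try? AVCaptureDeviceInput(device: device) else {
    fatalError("Dual wide camera not available")
}

session.beginConfiguration()
if session.canAddInput(input) { session.addInput(input) }
if session.canAddOutput(movieFileOutput) { session.addOutput(movieFileOutput) }

// Make a spatial-capable format the active one before checking the output's flag.
if let spatialFormat = device.formats.first(where: { $0.isSpatialVideoCaptureSupported }) {
    do {
        try device.lockForConfiguration()
        device.activeFormat = spatialFormat
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
session.commitConfiguration()

// My assumption is that the output-level flag is only meaningful once the output
// is attached to a session whose input device/format supports spatial capture.
print("isSpatialVideoCaptureSupported: \(movieFileOutput.isSpatialVideoCaptureSupported)")
if movieFileOutput.isSpatialVideoCaptureSupported {
    movieFileOutput.isSpatialVideoCaptureEnabled = true   // assumption: opt-in flag shown in the WWDC24 session
}

If that assumption is wrong, I would love to know what actually drives the output-level flag.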
iPhone 15 Pro iOS 18.1 (US version)
Xcode 16.0 beta 16A5171c