Thanks for the response. Just to give some background on the issue: I am implementing an editing timeline for mixing videos and photos that can be zoomed in and out. At the highest zoom level you can scrub through individual frames of the AVComposition. So I was thinking of mapping individual points in the timeline to times in the composition, where time is measured in number of frames, not seconds or milliseconds. I am not sure whether this model of representing time is reliable enough with variable-frame-rate videos, even if I set the AVVideoComposition frameDuration property to CMTime(1, 30) or CMTime(1, 25).
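To make the frame-based model concrete, here is a minimal sketch of the mapping I have in mind, assuming the composition is rendered at a fixed output rate (30 fps here, mirroring a frameDuration of CMTime(value: 1, timescale: 30)). The FrameTime struct and function names are my own stand-ins for CMTime so the arithmetic is visible; this is not an AVFoundation API.

```swift
// Sketch of a frame-based time model under a fixed output frame rate.
// FrameTime mimics CMTime's value/timescale pair.
struct FrameTime: Equatable {
    var value: Int64      // numerator, like CMTime.value
    var timescale: Int32  // denominator, like CMTime.timescale
}

let framesPerSecond: Int64 = 30  // assumed fixed output rate

// Timeline point (frame index) -> composition time.
func time(forFrame index: Int64) -> FrameTime {
    FrameTime(value: index, timescale: Int32(framesPerSecond))
}

// Composition time -> nearest frame index (rounds half up).
func frame(forTime t: FrameTime) -> Int64 {
    (t.value * framesPerSecond + Int64(t.timescale) / 2) / Int64(t.timescale)
}
```

With a fixed frameDuration this mapping is exact in both directions; with variable-frame-rate sources the presentation timestamps of the original frames may not land on this grid, which is exactly the uncertainty I'm asking about.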
I too have no idea how this works. I had to translate the image after cropping.
Did you find out the answer?
@bford and other AVFoundation Engineers, can you please respond to this? How is Clips app by Apple recording in 10 bit Dolby Vision HDR format?
I see 4 new types of AVCaptureDeviceFormat with codecType 'x420'. Does 'x420' stand for extended 10-bit buffers?
--Format <AVCaptureDeviceFormat: 0x283b9c520 'vide'/'x420' 3840x2160, { 1- 30 fps}, HRSI:4224x2376, fov:70.291, supports vis, max zoom:129.12 (upscales @1.00), AF System:2, ISO:33.0-3168.0, SS:0.000014-1.000000, supports wide color, supports multicam>, AVMediaType(_rawValue: vide), <CMVideoFormatDescription 0x28375a190 [0x1f5b24660]> {
    mediaType: 'vide'
    mediaSubType: 'x420'
    mediaSpecific: {
        codecType: 'x420'  dimensions: 3840 x 2160
    }
    extensions: {(null)}
}
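For what it's worth, 'x420' is the four-character code of kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange, i.e. 10-bit bi-planar 4:2:0 in video range, which fits the "extended 10-bit" guess. A small pure-Swift helper for packing and unpacking such codes when reading format-description dumps like the one above (the function names are my own):

```swift
// Pack a 4-character ASCII string into a FourCharCode-style UInt32.
func fourCC(_ s: String) -> UInt32 {
    precondition(s.count == 4, "a FourCC is exactly 4 characters")
    return s.unicodeScalars.reduce(0) { ($0 << 8) | UInt32($1.value) }
}

// Unpack a FourCharCode back into a string for logging.
func fourCCString(_ code: UInt32) -> String {
    let chars = (0..<4).reversed().map {
        Character(UnicodeScalar((code >> ($0 * 8)) & 0xFF)!)
    }
    return String(chars)
}
```

For example, fourCC("x420") gives 0x78343230, the raw value of kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange.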
"The video composition processor interrogates the video composition instructions to see if each containsTweening, to figure out whether rendering is necessary. If not, and if the frame duration of the composition is lower (= higher frame rate) than the source tracks, it may skip rendering, which you could think of as dropping frames that would be duplicates anyhow." So in other words, if I understand you correctly, AVFoundation guarantees constant-frame-rate conversion for variable-frame-rate videos, both in playback and rendering. Correct?
It's not supported yet; however, there was an update to the Clips app by Apple yesterday in which they support recording HDR videos. Not sure how.
@ricob Thanks for the response. To describe my requirements: I was evaluating whether it is possible to use UICollectionView to implement a video editing timeline like the one in Final Cut Pro (https://help.apple.com/assets/5D6EB1CB094622CB34375CC7/5D6EB1D4094622CB34375CDC/en_US/ada7bd21a661acc9b512ccec5b9730d0.png ). I am becoming convinced that UIScrollView is a better fit than UICollectionView. If you think I need to file a TSI, I will do so; otherwise, if you can guide me onto the right path, that would be nice too.
I browsed through most of the APIs and it's not clear whether AVCaptureVideoDataOutput will support delivery of 10-bit sample buffers in the BT.2020 color space. It's therefore not clear whether third-party apps can record videos in HDR format with what is available so far.
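The check I would want to make is whether the 10-bit pixel format ever shows up among a capture output's available formats. A minimal sketch of that check, with the constant spelled out as a raw value so it stands alone; on an Apple platform you would feed it AVCaptureVideoDataOutput.availableVideoPixelFormatTypes, and whether 'x420' actually appears there is exactly what I don't know:

```swift
// Raw value of kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange ('x420'),
// the 10-bit bi-planar 4:2:0 format seen in the device format dump.
let tenBit420VideoRange: UInt32 = 0x7834_3230

// Given the pixel format types a capture output reports as available,
// check whether 10-bit delivery is offered at all.
func offersTenBitDelivery(_ available: [UInt32]) -> Bool {
    available.contains(tenBit420VideoRange)
}
```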
I am getting this too. No idea what I am doing wrong.
Noticed again on iOS 14.2 beta 2 on iPhone 11 Pro: apps fail to open, deleting or offloading apps fails, and reported storage remains stuck at 63.6 GB of 64 GB.
Am I the only one facing these issues, or is there perhaps a workaround? Apple engineers, with these bugs it is impossible to use PHPickerViewController in multi-select mode.
Here is a question on SO with a GIF demonstrating the problem:
PHPickerViewController Buggy UI in MultiSelect Mode - https://stackoverflow.com/questions/64232158/phpickerviewcontroller-buggy-ui-flow-in-multi-select-mode
Were you able to find a solution to this? I am stuck with the same issue, and on top of that, even when it works it is awfully slow to load videos.
Were you able to find a solution?
Are we supposed to copy these files locally? They don't load in AVPlayer otherwise.
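If copying locally is indeed the expected workflow, my understanding is that the file URL PHPicker hands to the loadFileRepresentation completion handler is only valid until that handler returns, so the file has to be copied somewhere the app owns before handing it to AVPlayer. A sketch of just the copy step (calling this from the picker's completion handler is my assumption, not documented behavior I've confirmed):

```swift
import Foundation

// Copy a short-lived picked file into the app's temp directory so it
// outlives the item provider's completion handler. The destination name
// is randomized to avoid collisions across multiple picks.
func copyToLocalTemp(_ src: URL) throws -> URL {
    let dest = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension(src.pathExtension)
    try FileManager.default.copyItem(at: src, to: dest)
    return dest
}
```

The returned URL can then be used to build an AVPlayerItem after the handler has returned.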