Posts

4 Replies
2.6k Views
I am trying to render audio using AVSampleBufferAudioRenderer, but there is no sound coming from my speakers and there is a repeated log message:

```
[AQ] 405: SSP::Render: CopySlice returned 1
```

I am creating a CMSampleBuffer from an AudioBufferList. This is the relevant code:

```swift
var sampleBuffer: CMSampleBuffer!

try runDarwin(CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                                   dataBuffer: nil,
                                   dataReady: false,
                                   makeDataReadyCallback: nil,
                                   refcon: nil,
                                   formatDescription: formatDescription,
                                   sampleCount: sampleCount,
                                   sampleTimingEntryCount: 1,
                                   sampleTimingArray: &timingInfo,
                                   sampleSizeEntryCount: sampleSizeEntryCount,
                                   sampleSizeArray: sampleSizeArray,
                                   sampleBufferOut: &sampleBuffer))

try runDarwin(CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer,
                                                             blockBufferAllocator: kCFAllocatorDefault,
                                                             blockBufferMemoryAllocator: kCFAllocatorDefault,
                                                             flags: 0,
                                                             bufferList: audioBufferList.unsafePointer))

try runDarwin(CMSampleBufferSetDataReady(sampleBuffer))
```

I am pretty confident that my audio format description is correct, because CMSampleBufferSetDataBufferFromAudioBufferList, which performs a laundry list of validations, returns no error.

I tried to reverse-engineer the CopySlice function, but I'm lost without the parameter names:

```cpp
int ScheduledSlicePlayer::CopySlice(
    long long,
    ScheduledSlicePlayer::XScheduledAudioSlice*,
    int,
    AudioBufferList&,
    int,
    int,
    bool
)
```

Does anyone have any ideas on what's wrong? For the Apple engineers reading this: can you tell me the parameter names of the CopySlice function, so that I can more easily reverse-engineer it to see what the problem is?
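For completeness, the renderer side of my setup looks roughly like the following. This is a simplified sketch rather than my exact code, and `nextSampleBuffer()` is just a placeholder for wherever the sample buffers built above actually come from:

```swift
import AVFoundation
import CoreMedia

// Placeholder for the real source of the CMSampleBuffers created above.
func nextSampleBuffer() -> CMSampleBuffer? {
    nil
}

let audioRenderer = AVSampleBufferAudioRenderer()
let synchronizer = AVSampleBufferRenderSynchronizer()
synchronizer.addRenderer(audioRenderer)

let enqueueQueue = DispatchQueue(label: "audio-enqueue")

// Feed the renderer whenever it can accept more data.
audioRenderer.requestMediaDataWhenReady(on: enqueueQueue) {
    while audioRenderer.isReadyForMoreMediaData {
        guard let sampleBuffer = nextSampleBuffer() else {
            audioRenderer.stopRequestingMediaData()
            return
        }
        audioRenderer.enqueue(sampleBuffer)
    }
}

// Start the synchronizer's timebase so the renderer actually plays.
synchronizer.rate = 1.0
```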
1 Reply
2.6k Views
(For Apple folks: rdar://47577096.)

# Background

The Core Video function `CVImageBufferCreateColorSpaceFromAttachments` creates custom color profiles with simplified transfer functions instead of using the standard system color profiles. Let's take ITU-R 709 as an example.

The macOS `Rec. ITU-R BT.709-5` system color profile specifies the transfer function as

```
f(x) = (0.91x + 0.09)^2.222   where x >= 0.081
f(x) = 0.222x                 where x <  0.081
```

The Apple-custom `HDTV` color profile created by the above Core Video function specifies the transfer function as

```
f(x) = x^1.961
```

My understanding is that `x^1.961` is the closest approximation of the more complex ITU-R 709 transfer function.

# Questions

1. Why use a custom color profile with a simplified transfer function rather than the official specification?
   - Was it done for performance?
   - Was it done for compatibility with non-QuickTime-based applications?
   - etc.
2. Speaking of compatibility, there is a problem when an encoding application uses the official transfer function and the decoding application uses the approximated transfer function. I tested this using two images. One image uses the `Rec. ITU-R BT.709-5` color profile. The other image is derived from the former by assigning the Apple-custom `HDTV` color profile. The latter image loses the details in the darker areas of the image. Why go to the trouble of approximating the transfer function when the approximation isn't that great?
3. Are the Apple-custom color profiles also used for encoding? Or are they only for decoding?
4. Another thing that concerns me is that the Apple-custom `HDR (PQ)` and `HDR (HLG)` color profiles use the same simplified transfer function of `f(x) = x^1.801`. Isn't the whole point of the PQ and HLG standards to define more sophisticated transfer functions? Doesn't simplifying those two transfer functions defeat their purpose?
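For what it's worth, here is a quick numeric comparison of the two curves quoted above. This is just a throwaway sanity check that hard-codes the formulas as written here; it is not code from Core Video or ColorSync:

```swift
import Foundation

// Rec. 709 decoding transfer function, as quoted above.
func rec709(_ x: Double) -> Double {
    x >= 0.081 ? pow(0.91 * x + 0.09, 2.222) : 0.222 * x
}

// Simplified power curve used by the Apple-custom "HDTV" profile.
func hdtvApprox(_ x: Double) -> Double {
    pow(x, 1.961)
}

for x in [0.02, 0.05, 0.081, 0.25, 0.5, 0.9] {
    print(String(format: "x = %.3f   Rec. 709 = %.4f   x^1.961 = %.4f",
                 x, rec709(x), hdtvApprox(x)))
}
```

The two curves agree fairly closely in the mid-tones and highlights, but below the 0.081 breakpoint the pure power curve comes out noticeably darker (at x = 0.05 it gives roughly 0.003 versus 0.011), which matches the loss of shadow detail described in question 2.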
0 Replies
532 Views
Hi, I created FB6064639 (aka rdar://50235272) a while ago. I was notified that the issue was fixed in 10.15 beta 5.

I can confirm that it was partially fixed. I left some additional comments in the bug report but have not received a response yet, so I am posting on Apple Developer Forums in the hopes that someone at Apple might see this.

Here is the summary from the original bug report:

"Pure black should show as (0, 0, 0). It does so when playing a 10-bit video in QuickTime Player. However, when playing an 8-bit video, it shows as (1, 0, 1) instead. This is especially noticeable on an OLED TV."

10.15 beta 5 fixes the problem when using `AVPlayer` (which is what QuickTime Player uses). However, `AVSampleBufferDisplayLayer` is still broken. I debugged a little and found that both use `CAImageQueue` and that the color spaces they specify in the pixel buffers are exactly the same. I am at a loss as to what could cause the difference in behavior between these two components.
0 Replies
849 Views
Let me start by saying that HDR is really confusing to me. Please forgive me if I am misunderstanding anything.

I have an app that sends decoded video frames to `AVSampleBufferDisplayLayer`. I use the Core Video function `CVImageBufferCreateColorSpaceFromAttachments` to create the `CGColorSpace`, which I then attach to the image buffer. This works great for non-HDR content.

However, when I specify an HDR transfer function (PQ or HLG), the Core Video function returns the Generic RGB color space on macOS 10.14. On macOS 10.15, specifying HLG produces a color space with parametric tone response curves configured for Rec. 709 and a “device to PCS conversion table”/“PCS to device conversion table” that has matrix curves corresponding to HLG. Specifying PQ on macOS 10.15 still produces the Generic RGB color space.

Questions:

1. Is the color space returned for HLG on macOS 10.15 correct? Why are there two different curves in the ICC profile?
2. Why does the Core Video function not return a correct color space for PQ, even on macOS 10.15?
3. There is also some metadata for PQ content (mastering display color volume and content light level info). Where do I specify that?
4. Does HLG need metadata, or is specifying the color space enough?
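For reference, here is roughly the code path described above, shown for the HLG case. This is a simplified sketch of my setup (the decoded frame stands in as `pixelBuffer`), not an exact copy of my code:

```swift
import CoreVideo

// Build a CGColorSpace from standard Core Video attachments (BT.2020 primaries
// and matrix, HLG transfer function) and attach it to a decoded pixel buffer.
func attachHLGColorSpace(to pixelBuffer: CVPixelBuffer) {
    let attachments: [CFString: Any] = [
        kCVImageBufferColorPrimariesKey: kCVImageBufferColorPrimaries_ITU_R_2020,
        kCVImageBufferTransferFunctionKey: kCVImageBufferTransferFunction_ITU_R_2100_HLG,
        kCVImageBufferYCbCrMatrixKey: kCVImageBufferYCbCrMatrix_ITU_R_2020,
    ]

    guard let colorSpace = CVImageBufferCreateColorSpaceFromAttachments(attachments as CFDictionary)?
        .takeRetainedValue() else {
        return
    }

    CVBufferSetAttachment(pixelBuffer,
                          kCVImageBufferCGColorSpaceKey,
                          colorSpace,
                          .shouldPropagate)
}
```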