Audio renderer fails to render CMSampleBuffer
I am trying to render audio using `AVSampleBufferAudioRenderer`, but there is no sound coming from my speakers and there is a repeated log message:

```
[AQ] 405: SSP::Render: CopySlice returned 1
```

I am creating a `CMSampleBuffer` from an `AudioBufferList`. This is the relevant code:

```swift
var sampleBuffer: CMSampleBuffer!

try runDarwin(CMSampleBufferCreate(
    allocator: kCFAllocatorDefault,
    dataBuffer: nil,
    dataReady: false,
    makeDataReadyCallback: nil,
    refcon: nil,
    formatDescription: formatDescription,
    sampleCount: sampleCount,
    sampleTimingEntryCount: 1,
    sampleTimingArray: &timingInfo,
    sampleSizeEntryCount: sampleSizeEntryCount,
    sampleSizeArray: sampleSizeArray,
    sampleBufferOut: &sampleBuffer))

try runDarwin(CMSampleBufferSetDataBufferFromAudioBufferList(
    sampleBuffer,
    blockBufferAllocator: kCFAllocatorDefault,
    blockBufferMemoryAllocator: kCFAllocatorDefault,
    flags: 0,
    bufferList: audioBufferList.unsafePointer))

try runDarwin(CMSampleBufferSetDataReady(sampleBuffer))
```

I am pretty confident that my audio format description is correct, because `CMSampleBufferSetDataBufferFromAudioBufferList`, which performs a laundry list of validations, returns no error.

I tried to reverse-engineer the `CopySlice` function, but I’m lost without the parameter names:

```cpp
int ScheduledSlicePlayer::CopySlice(
    long long,
    ScheduledSlicePlayer::XScheduledAudioSlice*,
    int,
    AudioBufferList&,
    int,
    int,
    bool)
```

Does anyone have any ideas on what’s wrong? For the Apple engineers reading this: can you tell me the parameter names of the `CopySlice` function, so that I can more easily reverse-engineer it to see what the problem is?
4 replies · 1 boost · 2.9k views · Sep ’18
Why does Apple use simplified 1.961 gamma instead of precise ITU-R 709 transfer function?
(For Apple folks: rdar://47577096.)

# Background

The Core Video function `CVImageBufferCreateColorSpaceFromAttachments` creates custom color profiles with simplified transfer functions instead of using the standard system color profiles. Let’s take ITU-R 709 as an example.

The macOS `Rec. ITU-R BT.709-5` system color profile specifies the transfer function as

```
f(x) = (0.91x + 0.09)^2.222   where x >= 0.081
f(x) = 0.222x                 where x < 0.081
```

The Apple-custom `HDTV` color profile created by the above Core Video function specifies the transfer function as

```
f(x) = x^1.961
```

My understanding is that `x^1.961` is the closest pure-power approximation of the more complex ITU-R 709 transfer function.

# Questions

1. Why use a custom color profile with a simplified transfer function rather than the official specification?
   - Was it done for performance?
   - Was it done for compatibility with non-QuickTime-based applications?
   - etc.
2. Speaking of compatibility, there is a problem when an encoding application uses the official transfer function and the decoding application uses the approximated transfer function. I tested this using two images. One image uses the `Rec. ITU-R BT.709-5` color profile. The other image is derived from the former by assigning the Apple-custom `HDTV` color profile. The latter image loses the details in the darker areas of the image. Why go to the trouble of approximating the transfer function when the approximation isn’t that great?
3. Are the Apple-custom color profiles also used for encoding, or only for decoding?
4. Another thing that concerns me is that the Apple-custom `HDR (PQ)` and `HDR (HLG)` color profiles use the same simplified transfer function of `f(x) = x^1.801`. Isn’t the whole point of the PQ and HLG standards to define more sophisticated transfer functions? Doesn’t simplifying those two transfer functions defeat their purpose?
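To make the shadow-crushing in question 2 concrete, here is a small comparison (my own sketch, not Apple's code) of the two decoding functions quoted above. The piecewise curve is linear below the 0.081 knee, so it keeps dark values well above what the pure power curve produces:

```python
def bt709_decode(v: float) -> float:
    """Piecewise transfer function from the Rec. ITU-R BT.709-5 profile."""
    if v >= 0.081:
        return (0.91 * v + 0.09) ** 2.222
    return 0.222 * v

def gamma_decode(v: float) -> float:
    """Simplified pure-power transfer function from the Apple-custom HDTV profile."""
    return v ** 1.961

# In the shadows the approximation diverges badly: at v = 0.04 the piecewise
# curve yields ~0.0089 while the pure gamma yields ~0.0018, roughly a factor
# of five darker. Near mid-gray and above, the two curves nearly agree.
for v in (0.01, 0.04, 0.081, 0.5, 1.0):
    print(f"v={v:5.3f}  bt709={bt709_decode(v):.5f}  gamma={gamma_decode(v):.5f}")
```

This is consistent with the observation above: an image encoded against the piecewise curve but decoded with `x^1.961` has its darkest tones pushed toward black, losing shadow detail.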
1 reply · 1 boost · 3.1k views · Feb ’19