I am trying to develop a prototype Broadcast Upload Extension. I start the screen recording from Control Center with a long press. So far so good: I am able to get video frames, compress them and write them to a file, and I am also able to get mic audio. Alas, I am not able to get ANY app sound. This is what the processSampleBuffer: method of my RPBroadcastSampleHandler subclass looks like:
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType
{
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Screen frames: compressed and written to file elsewhere.
            [self videoHandler:sampleBuffer];
            break;
        case RPSampleBufferTypeAudioApp:
            // App audio: never arrives (see below).
            [self audioHandler:sampleBuffer output:audio_output];
            break;
        case RPSampleBufferTypeAudioMic:
            // Microphone audio: arrives as expected.
            [self audioHandler:sampleBuffer output:mic_output];
            break;
        default:
            break;
    }
}
Now, if mic recording is enabled, I do get mic data in mic_output, so the audioHandler: method itself is fine. But I never get anything (zero bytes) in audio_output, so I suspect that processSampleBuffer: simply never gets called with sampleBufferType equal to RPSampleBufferTypeAudioApp.
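For reference, audioHandler:output: just pulls the raw bytes out of the sample buffer and writes them to the output. A simplified sketch along these lines shows the idea (the NSOutputStream output type and the contiguous-buffer assumption are illustrative only, not the exact code):

#import <ReplayKit/ReplayKit.h>
#import <CoreMedia/CoreMedia.h>

// Illustrative sketch only: copy the raw audio bytes out of the sample buffer
// and append them to an output stream. Error and format handling trimmed.
- (void)audioHandler:(CMSampleBufferRef)sampleBuffer output:(NSOutputStream *)output
{
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (blockBuffer == NULL) {
        return;
    }

    size_t totalLength = 0;
    char *dataPointer = NULL;
    // Assumes the block buffer is contiguous; otherwise the bytes would have to be
    // copied out with CMBlockBufferCopyDataBytes first.
    OSStatus status = CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &totalLength, &dataPointer);
    if (status == kCMBlockBufferNoErr && dataPointer != NULL && totalLength > 0) {
        [output write:(const uint8_t *)dataPointer maxLength:totalLength];
    }
}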
At first I thought it might be related to using the YouTube app as the sound source, so I tried other sources (playing a video I recorded in Photos, setting an alarm timer, and so on). Still zero bytes of audio. Do I need to enable audio recording somewhere else? Or maybe I am using the wrong constant, but the documentation lists only Video, AudioApp and AudioMic, and the other two work.
Is this a bug?
I was testing on an iPhone 7, on both iOS 11.0 and 11.1 (Xcode 9.1), and got the same results. Please help.
Regards
Michal