No app audio in Broadcast Upload Extension

I am trying to develop a prototype Broadcast Upload Extension. I start screen recording from Control Center, by a long press. So far so good: I was able to get video frames, compress them, and write them to a file. I was also able to get mic audio. Alas, I am not able to get ANY app sound. This is what my RPBroadcastSampleHandler subclass's processSampleBuffer looks like:

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType
{
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            [self videoHandler: sampleBuffer];                       // video frames - works
            break;
        case RPSampleBufferTypeAudioApp:
            [self audioHandler: sampleBuffer output: audio_output];  // app audio - never arrives
            break;
        case RPSampleBufferTypeAudioMic:
            [self audioHandler: sampleBuffer output: mic_output];    // mic audio - works
            break;
        default:
            break;
    }
}

Now, if mic recording is allowed, I get mic data in mic_output, so the audioHandler procedure is OK. But I never get anything (zero bytes) in audio_output, so I am thinking that processSampleBuffer simply never gets called with sampleBufferType equal to RPSampleBufferTypeAudioApp.


At first I thought it must be related to my using the YouTube app for sound. So I tried other sound sources (like playing a video I recorded in Photos, setting up an alarm timer, and so on). Still zero bytes of audio. Do I need to enable audio recording elsewhere? Or maybe I am using the wrong constant, but the documentation lists just Video, AudioApp, and AudioMic, and the other two work.


Is it a bug?


I was using an iPhone 7, on both iOS 11.0 and 11.1 (Xcode 9.1), and got the same results. Please help.

Regards

Michal

Replies

Filed a bug report 35415615 "No audio frames when using iOS Broadcast Extension with Control Center-initialized recording"


I think that ReplayKit is not compatible with AVPlayer. Also, if you launch screen broadcasting from the home launcher, you won't get any app audio.

I don't think that AVPlayer compatibility is the problem. I am not using it. And I get proper microphone frames, so audio generally works. Moreover, my bug report was linked to another one, marked as Open by Apple. I understand they acknowledge the problem.

MikeAlpha, do you have a demo app, even with the sound problem?
I am not able to start the screen recording from Control Center.

The functions in SampleHandler never get called. Any hints?

Hey


Sorry for answering so late - there are no notifications :-(


As for your question, well, I did it for a client, and they own the code. But the app worked nicely; I was able to get video frames and encode them using the system H.264 encoder.
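

I can sketch the VideoToolbox part from memory, though (nothing from the client code; the dimensions, function names, and the empty callback body below are placeholders):

#import <VideoToolbox/VideoToolbox.h>

// Called by VideoToolbox with each compressed H.264 frame.
static void CompressedFrameCallback(void *refCon, void *sourceFrameRefCon,
                                    OSStatus status, VTEncodeInfoFlags infoFlags,
                                    CMSampleBufferRef sampleBuffer)
{
    if (status == noErr && sampleBuffer != NULL) {
        // Write the encoded sample buffer to a file, socket, etc.
    }
}

// One-time setup (e.g. in broadcastStartedWithSetupInfo:).
static VTCompressionSessionRef CreateEncoder(int32_t width, int32_t height)
{
    VTCompressionSessionRef session = NULL;
    VTCompressionSessionCreate(kCFAllocatorDefault, width, height,
                               kCMVideoCodecType_H264,
                               NULL, NULL, NULL,
                               CompressedFrameCallback, NULL,
                               &session);
    return session;
}

// For every RPSampleBufferTypeVideo buffer in processSampleBuffer:.
static void EncodeFrame(VTCompressionSessionRef session, CMSampleBufferRef sampleBuffer)
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    VTCompressionSessionEncodeFrame(session, imageBuffer, pts,
                                    kCMTimeInvalid, NULL, NULL, NULL);
}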


Hint... well, getting anything to work there (in the extension) is hard, as (at least in my experience) you cannot debug easily and there is no default output for debug printfs, no UI, etc. What I did was create an "app group" that linked my extension and the app it was bundled with. An app group can have a shared directory. I wrote code that let me log messages to a file in that shared directory, and the app displayed them. It really helped, because with "printf debugging" you're much better off than without any help.
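

The logging itself was nothing fancy; roughly like this (a minimal sketch, the group identifier and file name are placeholders, use your own app group):

#import <Foundation/Foundation.h>

// Appends a line to a log file in the shared app-group container.
// "group.com.example.myapp" is a placeholder; use your own group ID.
static void SharedLog(NSString *message)
{
    NSURL *container = [[NSFileManager defaultManager]
        containerURLForSecurityApplicationGroupIdentifier:@"group.com.example.myapp"];
    NSURL *logURL = [container URLByAppendingPathComponent:@"broadcast.log"];

    NSString *line = [NSString stringWithFormat:@"%@ %@\n", [NSDate date], message];
    NSData *data = [line dataUsingEncoding:NSUTF8StringEncoding];

    NSFileHandle *handle = [NSFileHandle fileHandleForWritingToURL:logURL error:NULL];
    if (handle == nil) {
        // First write: the file does not exist yet, so create it.
        [data writeToURL:logURL atomically:YES];
    } else {
        [handle seekToEndOfFile];
        [handle writeData:data];
        [handle closeFile];
    }
}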

The bug report that mine was a duplicate of now has Closed status. So I just tried with an iPhone 7 running the iOS 12 beta. I compiled the code using Xcode Version 10.0 beta (10L176w).


Things got better, in that I now get called with RPSampleBufferTypeAudioApp. The buffer has nonzero length, and I can get audio data from it the usual way (via CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer calls). However, I get nonzero data here only for certain applications. I was able to save some streams from the YouTube app and the Photos app (while playing video). Other applications, including Safari, GarageBand, and Timer/Clock, send audio buffers, but they are zeroed out.
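

By "the usual way" I mean something like this (a sketch that assumes the samples fit into a single AudioBuffer, i.e. interleaved PCM):

#import <CoreMedia/CoreMedia.h>

// Pulls the raw PCM bytes out of an audio sample buffer.
static void ReadAudioSamples(CMSampleBufferRef sampleBuffer)
{
    AudioBufferList bufferList;
    CMBlockBufferRef blockBuffer = NULL;

    OSStatus err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        NULL,                  // required-size out parameter, not needed here
        &bufferList,
        sizeof(bufferList),
        NULL, NULL,            // default allocators
        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        &blockBuffer);
    if (err != noErr) {
        return;
    }

    for (UInt32 i = 0; i < bufferList.mNumberBuffers; i++) {
        AudioBuffer buffer = bufferList.mBuffers[i];
        // buffer.mData / buffer.mDataByteSize hold the raw samples;
        // this is where I check whether they are all zeros.
    }

    CFRelease(blockBuffer);    // the block buffer owns the sample memory
}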


As I am getting nonzero samples only when the frequency is 22 kHz (which seems to be the default ReplayKit frequency), and all zeroed-out sample buffers are described as 44.1 kHz, I am guessing that maybe some conversion routine went south?
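

(To check the frequency I am talking about, you can read it straight from the buffer's format description, for example:)

#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>

// Logs the sample rate an audio sample buffer claims to carry.
static void LogSampleRate(CMSampleBufferRef sampleBuffer)
{
    CMFormatDescriptionRef format = CMSampleBufferGetFormatDescription(sampleBuffer);
    const AudioStreamBasicDescription *asbd =
        CMAudioFormatDescriptionGetStreamBasicDescription(format);
    if (asbd != NULL) {
        // 22050.0 for the buffers with real data, 44100.0 for the zeroed ones.
        NSLog(@"sample rate: %.0f Hz, channels: %u",
              asbd->mSampleRate, (unsigned)asbd->mChannelsPerFrame);
    }
}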


Anyway, I have filed a new bug report, 41123390. Hopefully the issue will get addressed, as the recent changes in ReplayKit (app audio and also an in-app button for starting system-wide recording) look very promising.


Regards

Michal

Hi MikeAlpha,


I am getting `processSampleBuffer` invoked in my extension code, and when I examine the pixel buffer inside processSampleBuffer, it is a 420f pixel buffer of size 1080 x 1920. However, when I try to get a plane's base address, it returns `nil`.


- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo: {
            CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            void *y_plane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);  // returns nil always
            void *uv_plane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1); // returns nil always
            break;
        }
        ...
    }
}



Any idea? I am running the iOS 12 beta. The host app starts the broadcast using `RPSystemBroadcastPickerView`.
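

For completeness, this is roughly how I add the picker in the host app (the extension bundle ID below is a placeholder):

#import <ReplayKit/ReplayKit.h>

// Somewhere in the host app's view controller, e.g. viewDidLoad.
- (void)addBroadcastPicker
{
    RPSystemBroadcastPickerView *picker =
        [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 60, 60)];
    // Placeholder bundle ID; point this at your own upload extension.
    picker.preferredExtension = @"com.example.host.BroadcastExtension";
    [self.view addSubview:picker];
}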


Thanks,

Piyush

Hey Piyush


I looked at my code. I think you have to call CVPixelBufferLockBaseAddress before you try to get the address of a plane, and when you finish dealing with the data, you're also supposed to call CVPixelBufferUnlockBaseAddress. At least this is what the documentation for CVPixelBufferLockBaseAddress says.
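

Something along these lines (just a sketch, using a read-only lock since you only read the planes):

case RPSampleBufferTypeVideo: {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer != NULL &&
        CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly) == kCVReturnSuccess) {
        void *y_plane  = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
        void *uv_plane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
        size_t y_stride  = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
        size_t uv_stride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
        // ... copy or process the plane data here; the addresses are only
        // valid while the base address stays locked ...
        CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    }
    break;
}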


Hope that helps

Mike

Thanks, Mike. That was a silly mistake in my code. It works as expected now.

I am working on cleaning up the code. Once I have it ready, I will post the GitHub link.


Best,

Piyush

Hi ptank,

Where is your code? I have some problems with raw YUV data.

Hi MikeAlpha,


Have you had any responses to your bug report?