Broadcast extension limit vs large frames on iPad

Hello,

Our use case is screen sharing in a live video call. We use a broadcast extension to capture the screen and send frames. Broadcast extensions have a hard memory limit of 50 MB. Screen sharing works well on iPhones, but on iPad ReplayKit delivers larger frames, and as a result the extension's memory usage goes beyond 50 MB. Profiling shows that the memory used by our own code is under 25 MB, but on iPad ReplayKit itself has memory spikes that push total usage past the 50 MB limit.

How should I implement the screen sharing use case on iPads? Is there a guideline for this? Any suggestion or help is appreciated.

Best,
Piyush
I know that Apple strongly encourages the use of VideoToolbox for real-time communication, but it is not viable in every use case. Specifically, the VP8 software encoder in WebRTC is very mature and well optimized for screen-share content, while the H.264 implementation is quite lacking in this area. This is not a slight on VideoToolbox, as I think VTCompressionSession does offer adequate controls.

So far, Piyush and I have found that it is possible to use VP8 on iOS devices with some downscaling and fit within the 50 MB limit.
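To pick a downscale factor, you can work backwards from the memory budget. The sketch below is an illustrative assumption, not anything from ReplayKit: it models each in-flight frame as an NV12 (bi-planar 4:2:0) buffer costing width × height × 1.5 bytes, and finds the largest linear scale factor whose frames still fit the budget.

```swift
import Foundation

// Estimated bytes for one NV12 (bi-planar 4:2:0) buffer.
func nv12Bytes(width: Int, height: Int) -> Int {
    return width * height * 3 / 2
}

// Largest linear scale factor (<= 1.0) such that `bufferCount` in-flight
// buffers of the scaled frame fit within `budgetBytes`.
func downscaleFactor(width: Int, height: Int,
                     bufferCount: Int, budgetBytes: Int) -> Double {
    let fullCost = Double(bufferCount * nv12Bytes(width: width, height: height))
    let budget = Double(budgetBytes)
    guard fullCost > budget else { return 1.0 }
    // Memory scales with area, i.e. with the square of the linear factor.
    return (budget / fullCost).squareRoot()
}

// Example: 1920x1440 frames (as observed on iPad), eight buffers in flight
// (the worst case observed on iPhone 7), and a 25 MB frame budget, i.e.
// roughly half of the 50 MB extension limit.
let s = downscaleFactor(width: 1920, height: 1440,
                        bufferCount: 8, budgetBytes: 25 * 1024 * 1024)
let scaledWidth = Int(1920.0 * s)
let scaledHeight = Int(1440.0 * s)
```

The buffer count and budget are tuning knobs; the point is that memory grows with frame area, so a modest linear downscale buys a quadratic memory saving.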

The problem is that ReplayKit.framework seems to be very memory hungry when the device is processing-constrained, such as when launching apps or using some home screen features during a broadcast. I have observed up to 8 ReplayKit buffers on an iPhone 7 and 6 on an iPad Pro 12.9" Gen 2, leading to memory exhaustion and a resource limit crash. This is speaking for iOS 13.5.1:

Memory peak for ReplayKit ≈ 6 buffers × 1920 × 1440 × 1.5 bytes/pixel
≈ 23.7 MB

In practice the extension really only has about 50% of its memory budget when processing gets backed up. The queue that creates the RPIOSurface objects does not seem to drop IOSurfaces when the system is backed up. It would be nice if the queue were sized smaller, or if we could ask for smaller buffers that take up less memory.
I have the same problem. I want to rotate the sample buffer, and when I call

render(output, to: pixelBuffer)

I get a crash on an iPad Pro Gen 2. Have you found a solution?
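Not a confirmed fix, but given the memory discussion above, one likely culprit is allocating a fresh destination pixel buffer for every frame. A minimal sketch (assuming an NV12 output format; all names here are illustrative) that reuses destination buffers from a capped CVPixelBufferPool and renders the rotated image into a pooled buffer:

```swift
import CoreImage
import CoreVideo

final class FrameRotator {
    // Avoid CIContext caching intermediates; the extension cannot afford it.
    private let context = CIContext(options: [.cacheIntermediates: false])
    private var pool: CVPixelBufferPool?

    private func makePool(width: Int, height: Int) -> CVPixelBufferPool? {
        let bufferAttrs: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String:
                kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
            kCVPixelBufferWidthKey as String: width,
            kCVPixelBufferHeightKey as String: height
        ]
        // Cap the pool so we never hold many destination buffers at once.
        let poolAttrs: [String: Any] =
            [kCVPixelBufferPoolMinimumBufferCountKey as String: 3]
        var pool: CVPixelBufferPool?
        CVPixelBufferPoolCreate(nil, poolAttrs as CFDictionary,
                                bufferAttrs as CFDictionary, &pool)
        return pool
    }

    func rotate(_ source: CVPixelBuffer) -> CVPixelBuffer? {
        // .right = 90° clockwise; pick the orientation your call site needs.
        let image = CIImage(cvPixelBuffer: source).oriented(.right)
        let w = Int(image.extent.width), h = Int(image.extent.height)
        if pool == nil { pool = makePool(width: w, height: h) }
        guard let pool = pool else { return nil }
        var output: CVPixelBuffer?
        guard CVPixelBufferPoolCreatePixelBuffer(nil, pool, &output) == kCVReturnSuccess,
              let destination = output else {
            return nil  // pool exhausted: drop this frame rather than allocate
        }
        context.render(image, to: destination)
        return destination
    }
}
```

Returning nil when the pool is exhausted turns an allocation spike into a dropped frame instead of a jetsam kill.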

Also found that typing rapidly will cause the memory to grow fast, even on an iPhone.

Any update on this? I have the same crash on iPad Pro and iPhone 11.

It happens right after the line

render(output, to: pixelBuffer)