In iOS 17 Beta, a new AVSampleBufferVideoRenderer class has been added:
https://developer.apple.com/documentation/avfoundation/avsamplebuffervideorenderer
I'm wondering if this could somehow be used together with AirPlay to manually enqueue video sample buffers, just as you already can for AirPlay audio with AVSampleBufferAudioRenderer (see: https://developer.apple.com/documentation/avfaudio/audio_engine/playing_custom_audio_with_your_own_player).
I want to be able to stream AirPlay Video without HLS.
If I try to add the video renderer to their existing sample project for audio, I get an exception with the message "... video target must be added to the AVSampleBufferVideoRenderer prior to enqueueing sample buffers.", which I guess makes sense. But since there is no documentation on this yet, I can't tell how to add a video target, or what kinds of video targets are supported.
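For reference, here is a sketch of what I'm attempting, following the pattern from the audio sample project linked above. The `nextAudioSampleBuffer()`/`nextVideoSampleBuffer()` functions are placeholders for my own demuxer/decoder output, and the video half is the part that throws:

```swift
import AVFoundation

// Placeholders standing in for my own sample buffer source.
func nextAudioSampleBuffer() -> CMSampleBuffer? { nil }
func nextVideoSampleBuffer() -> CMSampleBuffer? { nil }

let serializationQueue = DispatchQueue(label: "sample-buffer-serialization")
let synchronizer = AVSampleBufferRenderSynchronizer()
let audioRenderer = AVSampleBufferAudioRenderer()
let videoRenderer = AVSampleBufferVideoRenderer()

synchronizer.addRenderer(audioRenderer)
synchronizer.addRenderer(videoRenderer)

// Audio works exactly as in Apple's sample project:
audioRenderer.requestMediaDataWhenReady(on: serializationQueue) {
    while audioRenderer.isReadyForMoreMediaData,
          let buffer = nextAudioSampleBuffer() {
        audioRenderer.enqueue(buffer)
    }
}

// The same pattern for video raises the exception as soon as
// enqueue(_:) is called, because no video target has been added --
// and I can't find any API for adding one.
videoRenderer.requestMediaDataWhenReady(on: serializationQueue) {
    while videoRenderer.isReadyForMoreMediaData,
          let buffer = nextVideoSampleBuffer() {
        videoRenderer.enqueue(buffer)
    }
}
```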
I have an iOS app that includes Picture in Picture. Everything works great there.
However, when running this same app on macOS (Apple Silicon, Designed for iPad), it does not work properly. There, it only works in the scenarios where I initialize AVPictureInPictureController with an AVPlayerLayer, but not if I initialize it with a ContentSource backed by an AVSampleBufferDisplayLayer. In that case, the AVPictureInPictureController.isPictureInPicturePossible observer remains false, and thus PiP cannot be started.
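For context, this is roughly how I set it up (delegate bodies trimmed to minimal implementations; `displayLayer` is the AVSampleBufferDisplayLayer my decoder enqueues frames onto):

```swift
import AVKit
import CoreMedia
import UIKit

final class PlayerViewController: UIViewController,
                                  AVPictureInPictureSampleBufferPlaybackDelegate {

    let displayLayer = AVSampleBufferDisplayLayer()
    var pipController: AVPictureInPictureController?
    var pipObservation: NSKeyValueObservation?

    override func viewDidLoad() {
        super.viewDidLoad()
        view.layer.addSublayer(displayLayer)

        let contentSource = AVPictureInPictureController.ContentSource(
            sampleBufferDisplayLayer: displayLayer,
            playbackDelegate: self
        )
        let controller = AVPictureInPictureController(contentSource: contentSource)
        pipController = controller

        // On iOS this flips to true once content is flowing; under
        // "Designed for iPad" on macOS it stays false forever.
        pipObservation = controller.observe(\.isPictureInPicturePossible,
                                            options: [.initial, .new]) { _, change in
            print("isPictureInPicturePossible:", change.newValue ?? false)
        }
    }

    // MARK: AVPictureInPictureSampleBufferPlaybackDelegate

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    setPlaying playing: Bool) {}

    func pictureInPictureControllerTimeRangeForPlayback(
        _ controller: AVPictureInPictureController) -> CMTimeRange {
        // Live content: indefinite time range.
        CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }

    func pictureInPictureControllerIsPlaybackPaused(
        _ controller: AVPictureInPictureController) -> Bool { false }

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    didTransitionToRenderSize newRenderSize: CMVideoDimensions) {}

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    skipByInterval skipInterval: CMTime,
                                    completion completionHandler: @escaping () -> Void) {
        completionHandler()
    }
}
```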
I have double-checked that everything is set up properly, and the fact that it runs flawlessly on iOS further confirms this.
Can anyone tell me why it is not working?