Processing / tapping an HLS audio stream (or global app output)

I'm trying to do some realtime audio processing on audio served from an HLS stream (i.e. an AVPlayer created using an M3U HTTP URL). Attaching an AVAudioMix configured with an `audioTapProcessor` doesn't seem to have any effect; none of the callbacks except `init` are being invoked. Is this a known limitation? If so, is it documented somewhere?
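
For reference, here's a minimal sketch of the kind of tap setup in question (Swift; `streamURL` is a placeholder and error handling is elided). With a regular file-based asset the `process` callback fires as expected; with an HLS item, only `init` is ever invoked:

```swift
import AVFoundation
import MediaToolbox

let streamURL = URL(string: "https://example.com/stream/index.m3u8")!  // placeholder
let playerItem = AVPlayerItem(url: streamURL)
let player = AVPlayer(playerItem: playerItem)

// Tap callbacks; the closures must not capture context (they bridge to C function pointers).
var callbacks = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: nil,
    init: { _, clientInfo, tapStorageOut in
        tapStorageOut.pointee = clientInfo   // the only callback that ever fires for HLS
    },
    finalize: { _ in },
    prepare: { _, _, _ in },
    unprepare: { _ in },
    process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
        // Pull the source audio through the tap; PCM processing would happen here.
        let status = MTAudioProcessingTapGetSourceAudio(
            tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
        if status != 0 { print("GetSourceAudio failed: \(status)") }
    }
)

var tapRef: Unmanaged<MTAudioProcessingTap>?
let status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                        kMTAudioProcessingTapCreationFlag_PostEffects,
                                        &tapRef)

// Attach the tap through an audio mix. Note that an HLS asset typically exposes
// no audio tracks up front, which already hints at the problem.
let inputParams = AVMutableAudioMixInputParameters(
    track: playerItem.asset.tracks(withMediaType: .audio).first)
if status == 0, let tap = tapRef?.takeRetainedValue() {
    inputParams.audioTapProcessor = tap
}

let audioMix = AVMutableAudioMix()
audioMix.inputParameters = [inputParams]
playerItem.audioMix = audioMix
```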


If the above is a limitation, what are my options using some of the other audio APIs? I looked into `AVAudioEngine` as well but it doesn't seem like there's any way I can configure any of the input node types to use an HLS stream. Am I wrong? Are there lower level APIs available to play HLS streams that provide the necessary hooks?
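
For what it's worth, the mismatch shows up in the `AVAudioEngine` API surface itself: `AVAudioPlayerNode` only schedules `AVAudioFile`s or PCM buffers, and as far as I can tell no node type accepts a remote HLS URL. A quick sketch, with a placeholder local file path:

```swift
import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

do {
    // A player node accepts local files or PCM buffers only...
    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "/path/to/local.caf"))  // placeholder
    playerNode.scheduleFile(file, at: nil, completionHandler: nil)

    // ...and there is no scheduleFile-style API that takes an HLS (m3u8) URL,
    // so the stream can't be routed into the engine this way.
    try engine.start()
    playerNode.play()
} catch {
    print("engine setup failed: \(error)")
}
```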


Alternatively, is there some generic way to tap into all audio being output by my app regardless of its source?
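
The closest thing I'm aware of is installing a tap on an `AVAudioEngine` mixer node, but that only captures audio rendered through that engine; `AVPlayer` output never passes through it, so it wouldn't cover the HLS case. A sketch for completeness:

```swift
import AVFoundation

let engine = AVAudioEngine()
// (Source nodes, e.g. an AVAudioPlayerNode, would be attached and connected
// to engine.mainMixerNode here.)

// The tap receives PCM buffers for everything this engine renders, but audio
// played by AVPlayer never goes through the app's engine, so it is not included.
let mixer = engine.mainMixerNode
let format = mixer.outputFormat(forBus: 0)
mixer.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
    // buffer is an AVAudioPCMBuffer; inspect or process the samples here.
    print("tap received \(buffer.frameLength) frames")
}

do {
    try engine.start()
} catch {
    print("engine failed to start: \(error)")
}
```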


Thanks a lot!

Answered by theanalogkid in 139608022


Bumping this & jumping on the notifications, as I too would really like to know if this is possible.

Accepted Answer

The MTAudioProcessingTap is not available with HTTP live streaming. I suggest filing an enhancement if this feature is important to you - and it's usually helpful to describe the type of app you're trying to design and how this feature would be used.

Hi theanalogkid, we are developing a VR video player and recently wanted to add spatial audio support. Our player is based on AVPlayer and plays HLS video (m3u8): 360° video with Ambisonics audio (https://en.wikipedia.org/wiki/Ambisonics). To process the Ambisonics, we need a way to get at the PCM audio (i.e. an MTAudioProcessingTap) for the HLS stream, but that is not available, as c0delogic reported. Is there any plan to add this?

Hi, I am also using AVPlayer to play HLS audio (m3u8) and want to apply 3D surround processing similar to gaudiolab. How can I attach an MTAudioProcessingTap to an HLS stream? Please let me know of any updates.
Same here.

We are developing an AR application and we want to capture the AR experience (HLS video with audio attached to an object).

As a stopgap, we are capturing the audio played over the speakers via the microphone and afterwards mixing it with a screen capture into one media file. Not very satisfying in terms of UX, quality, or performance.
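
For anyone following along, that kind of microphone-capture workaround can be sketched with an input-node tap writing to a file (simplified; the output URL is a placeholder, and audio session configuration plus mic permission are omitted):

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Placeholder output location; the file gets mixed with the screen capture later.
let outputURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("captured-audio.caf")

do {
    let outputFile = try AVAudioFile(forWriting: outputURL, settings: format.settings)

    // Record whatever the microphone picks up, including the speaker output.
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        do {
            try outputFile.write(from: buffer)
        } catch {
            print("write failed: \(error)")
        }
    }

    try engine.start()
} catch {
    print("audio capture setup failed: \(error)")
}
```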
I am attempting to do real-time visualizations (audio volume, pitch and the like) and could use this ability.
Same here. We're trying to do a simple capture of the LPCM samples flowing into the hardware.

Any news??

Still no update on this?
