
createMediaElementSource() not working with HLS/m3u8 streams in all browsers on iOS
This issue blocks us from serving our product on any iOS device (on iOS, no browser other than Safari supports the Web Audio APIs) and in Safari on desktop, and one of our customers is currently heavily impacted by this limitation. WebKit-based Safari currently cannot provide access to raw audio data via AudioContext for HLS playback, while the same code works for MP4 files. Every other major browser supports this, which is concerning: we would have to steer users away from Safari on desktop, and we simply cannot serve iPhone and iPad users at all, a blocker for us given that more than half of our users are on iOS devices. The W3C specification already covers this, and all other major browsers have implemented support for using HLS streams with AudioContext.

We'd like to reiterate the importance and urgency of this issue (https://bugs.webkit.org/show_bug.cgi?id=231656). It has been raised multiple times by other developers as well, so fixing it would help thousands of web developers bring HLS-based applications to Safari and the iOS ecosystem. Can we please get visibility into the plan and timeline for HLS support with AudioContext in Safari? A critical part of our business and of our customers' products depends on this support.

What we observe: using new webkitAudioContext() in Safari 15.0 on a MacBook, and in iOS Safari on iPhone and iPad, we create an AudioContext instance, then create a ScriptProcessorNode and attach it to the HLS/m3u8 source created with audioContext.createMediaElementSource(). The onaudioprocess callback fires, but no audio data is delivered; the buffers contain only zeros.
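To confirm programmatically that the buffers really are silent, we scan each processing buffer for any nonzero sample. This helper is our own, not part of the Web Audio API; it is a minimal sketch of the check we run inside onaudioprocess:

```javascript
// Helper (ours, hypothetical name) used inside onaudioprocess to flag the
// silent-buffer symptom. A genuine audio signal virtually never produces
// an all-zero Float32 buffer.
function isAllZeros(samples) {
  for (let i = 0; i < samples.length; i++) {
    if (samples[i] !== 0) return false;
  }
  return true;
}
```

With an MP4 source this reports false almost immediately; with an HLS/m3u8 source in Safari it reports true for every buffer we have seen.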
If we also connect an AnalyserNode to the same source created with audioContext.createMediaElementSource(), analyser.getByteTimeDomainData(dataArray) likewise populates no data, matching the silent buffers seen in onaudioprocess on the ScriptProcessorNode attached to the same source.

What has been tried:

- We confirmed that the stream being used is the only stream in the tab and that createMediaElementSource() was called only once to get the stream.
- We confirmed that with an MP4/MP3 source everything works and data is received in onaudioprocess, but when switching the source to HLS/m3u8 it does not work.
- We tried using MediaRecorder with the HLS/m3u8 stream as the source, but received no events or data.
- We tried creating two AudioContexts, with the first AudioContext acting as the source and createMediaElementSource passed as the destination of the second before feeding it to the ScriptProcessorNode, but Safari does not allow more than one output.

Currently none of the scenarios we tried works, and this is a major blocker for us and for our customers.

Code sample used to create the ScriptProcessorNode:

const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioContext = new AudioContext();

// Create a MediaElementAudioSourceNode,
// feeding the HTML video element 'VideoElement' into it.
const audioSource = audioContext.createMediaElementSource(VideoElement);

const processor = audioContext.createScriptProcessor(2048, 1, 1);

// Route the media element through the processor and on to the output.
audioSource.connect(processor);
processor.connect(audioContext.destination);

processor.onaudioprocess = (e) => {
  // Fires for both MP4 and HLS sources, but with an HLS/m3u8 source
  // every input buffer contains only zeros.
  console.log('print audio buffer', e);
};

The exact same behavior is also observed in iOS Safari on iPhone and iPad. We are asking for your help on this matter ASAP. Thank you!
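As a stopgap, we detect the symptom at runtime and fall back to playback without analysis. The sketch below is our own workaround, not a documented API; the function name and threshold are our choices (roughly one second of audio at 2048 samples per buffer and 44.1 kHz):

```javascript
// Sketch of a runtime probe (ours, hypothetical): if the first N processing
// callbacks all deliver silence, assume raw audio access is unavailable for
// this source and switch to a no-analysis fallback.
const SILENT_BUFFER_THRESHOLD = 20; // ~1 s at 2048 samples / 44.1 kHz

function makeSilenceProbe(threshold = SILENT_BUFFER_THRESHOLD) {
  let silentCount = 0;
  // Call once per onaudioprocess with whether that buffer was all zeros;
  // returns true once the fallback should be triggered.
  return function onBuffer(bufferWasAllZeros) {
    silentCount = bufferWasAllZeros ? silentCount + 1 : 0;
    return silentCount >= threshold;
  };
}
```

In onaudioprocess we feed the probe the result of the all-zeros check and tear down the ScriptProcessorNode once it returns true, so the media element keeps playing audibly even though analysis is unavailable.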
Nov ’21