getUserMedia doesn't work with AudioContext.createScriptProcessor

I updated my iPhone to iOS 11 beta 4.

I am trying to record the microphone and process the audio. getUserMedia itself works, and the audio stream can be obtained. Demo: https://webrtc.github.io/samples/src/content/getusermedia/audio/
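For reference, a minimal capture sketch (assuming a secure context, i.e. HTTPS or localhost, where navigator.mediaDevices is exposed):

```javascript
// Request microphone access; getUserMedia resolves with a MediaStream.
navigator.mediaDevices.getUserMedia({ audio: true })
  .then(function (stream) {
    // The stream can then be fed into an AudioContext
    // via createMediaStreamSource (see below).
    console.log('got stream with', stream.getAudioTracks().length, 'audio track(s)');
  })
  .catch(function (err) {
    console.error('getUserMedia failed:', err.name, err.message);
  });
```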



but when I use it with an AudioContext:

let ctx = new window.webkitAudioContext();
let source = ctx.createMediaStreamSource(stream);
let processor = ctx.createScriptProcessor(1024, 1, 1);

processor.onaudioprocess = function(e) {
     console.log('do something');
};


the audioprocess event is never triggered. Demo:


https://webrtc.github.io/samples/src/content/getusermedia/volume/

Replies

As with Chrome, you need to connect the node to the destination, i.e.:


processor.connect(ctx.destination);


However, this only seems to work on macOS. On iOS the event is still not firing. Logged as a bug.
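Putting this reply together with the original snippet, the full graph would look something like the sketch below (node names and buffer size taken from the question; the fallback to the unprefixed AudioContext is an assumption for non-WebKit browsers):

```javascript
var AudioCtx = window.AudioContext || window.webkitAudioContext;
var ctx = new AudioCtx();

navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  var source = ctx.createMediaStreamSource(stream);
  var processor = ctx.createScriptProcessor(1024, 1, 1);

  processor.onaudioprocess = function (e) {
    // e.inputBuffer holds the captured samples for this block
    var samples = e.inputBuffer.getChannelData(0);
    console.log('got', samples.length, 'samples');
  };

  // The processor must sit in a chain that ends at the destination,
  // otherwise onaudioprocess is never called.
  source.connect(processor);
  processor.connect(ctx.destination);
});
```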

Actually, it appears the issue is that the user must explicitly activate audio output via a touch gesture. Even though this case is audio input, it still goes through an AudioContext, which on iOS requires explicit touch activation.
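One way to satisfy that requirement is to resume the AudioContext from inside a touch handler (a sketch; assumes the browser implements AudioContext.resume(), which returns a Promise):

```javascript
var AudioCtx = window.AudioContext || window.webkitAudioContext;
var ctx = new AudioCtx();

// iOS keeps a context created outside a user gesture in the
// 'suspended' state; resume it from a touch handler once.
document.addEventListener('touchend', function unlock() {
  ctx.resume().then(function () {
    // ctx.state should now be 'running'
    console.log('AudioContext state:', ctx.state);
  });
  document.removeEventListener('touchend', unlock);
}, false);
```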