The fact that the same code works on macOS 12 and 11 makes me think this is a macOS bug. We first hit the issue in our WebRTC-based app, but as the example below shows, it does not involve any peers or network connection; it appears to be some sort of WebKit audio streaming issue. On the JavaScript side everything looks fine: the stream is connected, active, and not muted.
Can anyone please confirm whether this page works in a WKWebView on their macOS 13? As soon as you grant microphone access, you should hear yourself:
https://webrtc.github.io/samples/src/content/getusermedia/audio/
It works in Safari, and in a WKWebView on macOS 11 and 12, but I hear no sound on macOS 13.1. I have isolated the issue to a simple program; the steps to reproduce are below:
- Open Xcode and create a new App with the following settings: macOS, XIB, Objective-C.
- In the "App Sandbox Settings" of the new app, select the "Audio Input" and "Outgoing Connections (Client)" options. This gives the app access to the microphone and the network.
- Add the "Privacy - Microphone Usage Description" key to the "Info.plist" file, with any text as its value. This is required to request microphone access.
- In AppDelegate.m, edit the applicationDidFinishLaunching: method as follows:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    // Requires #import <WebKit/WebKit.h> and #import <AVFoundation/AVFoundation.h>.
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
        dispatch_async(dispatch_get_main_queue(), ^{
            WKWebViewConfiguration *configuration = [[WKWebViewConfiguration alloc] init];
            WKWebView *webView = [[WKWebView alloc] initWithFrame:self.window.contentView.frame
                                                    configuration:configuration];
            webView.autoresizingMask = NSViewWidthSizable | NSViewHeightSizable;
            [self.window.contentView addSubview:webView];
            [webView loadRequest:[NSURLRequest requestWithURL:
                [NSURL URLWithString:@"https://webrtc.github.io/samples/src/content/getusermedia/audio/"]]];
        });
    }];
}
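For reference, the two sandbox checkboxes from the steps above correspond to the following keys in the target's .entitlements file (standard App Sandbox key names; the filename depends on your target):

```xml
<!-- App Sandbox with microphone and outgoing network access -->
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.device.audio-input</key>
<true/>
<key>com.apple.security.network.client</key>
<true/>
```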
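One thing worth ruling out on macOS 13: besides the system-level AVCaptureDevice prompt, WebKit also asks the host app for capture permission through WKUIDelegate (available on macOS 12.0+). A minimal sketch, assuming the AppDelegate adopts WKUIDelegate in its @interface and you set webView.UIDelegate = self after creating the web view:

```objc
// Sketch: grant getUserMedia capture requests from the page without
// showing WebKit's own prompt. Assumes this AppDelegate is the web
// view's UIDelegate. Available on macOS 12.0 and later.
- (void)webView:(WKWebView *)webView
    requestMediaCapturePermissionForOrigin:(WKSecurityOrigin *)origin
    initiatedByFrame:(WKFrameInfo *)frame
    type:(WKMediaCaptureType)type
    decisionHandler:(void (^)(WKPermissionDecision))decisionHandler {
    decisionHandler(WKPermissionDecisionGrant);
}
```

If the sound is still missing with this delegate method granting permission, that points at the audio path itself rather than the permission flow.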