We are currently developing a video calling app using WebRTC.
We start one-to-one video calls with the AVAudioSession configured as follows:
import AVFoundation

// `audioSession` is AVAudioSession.sharedInstance(); `logger` is our app's own logger.
let audioSession = AVAudioSession.sharedInstance()

do {
    // Switch to playAndRecord with speaker output only if we are not already there.
    if audioSession.category != .playAndRecord {
        try audioSession.setCategory(
            .playAndRecord,
            options: [
                .defaultToSpeaker
            ]
        )
        try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
    }
    // Use the video chat mode for the WebRTC call.
    if audioSession.mode != .videoChat {
        try audioSession.setMode(.videoChat)
    }
} catch {
    logger.error(msg: "AVAudioSession: \(error.localizedDescription)")
}
After starting a video call, we recorded it using the built-in iOS screen recording feature.
The resulting video includes the app's call audio.
However, iOS/iPadOS apps with similar functionality (Zoom, Skype, Slack) do not include any audio in their screen recordings.
Why does this difference occur?
Is this behavior an iOS security/privacy feature, and are there specific conditions required to get it?
Or does AVAudioSession need some additional configuration?
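For reference, one workaround we have been considering is to watch UIScreen.isCaptured (via UIScreen.capturedDidChangeNotification) and mute our own playback while the screen is being recorded. This is only a sketch of an approach we might take; we have not confirmed that Zoom/Skype/Slack actually do it this way, and the helper name ScreenCaptureObserver and the muting callback are placeholders of ours, not SDK API:

import UIKit

/// Observes the system screen-recording state so the app can react to it,
/// e.g. by muting its own playback while a recording is in progress.
/// Sketch only; not confirmed to be what other apps do.
final class ScreenCaptureObserver: NSObject {
    /// Called with `true` while the screen is being recorded or mirrored.
    private let onCaptureChanged: (Bool) -> Void

    init(onCaptureChanged: @escaping (Bool) -> Void) {
        self.onCaptureChanged = onCaptureChanged
        super.init()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(capturedDidChange(_:)),
            name: UIScreen.capturedDidChangeNotification,
            object: nil
        )
        // Report the initial state in case a recording is already running.
        onCaptureChanged(UIScreen.main.isCaptured)
    }

    @objc private func capturedDidChange(_ notification: Notification) {
        let screen = (notification.object as? UIScreen) ?? UIScreen.main
        onCaptureChanged(screen.isCaptured)
    }
}

In our app the callback would presumably disable the remote WebRTC audio track while isCaptured is true, but that is a guess at a workaround, not an explanation of the behavior we are asking about.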
Additional information :(
I also reached out to Apple Developer Technical Support, and they responded, "We were able to reproduce the issue, but since we don't yet understand the cause, we will investigate it."
What's that about...