We do custom audio buffering in our app. A very minimal version of the relevant code would look something like:
```swift
import AVFoundation

class LoopingBuffer {
    private var playerNode: AVAudioPlayerNode
    private var audioFile: AVAudioFile

    init(playerNode: AVAudioPlayerNode, audioFile: AVAudioFile) {
        self.playerNode = playerNode
        self.audioFile = audioFile
    }

    func scheduleBuffer(_ frames: AVAudioFrameCount) async {
        let audioBuffer = AVAudioPCMBuffer(
            pcmFormat: audioFile.processingFormat,
            frameCapacity: frames
        )!
        try! audioFile.read(into: audioBuffer, frameCount: frames)
        await playerNode.scheduleBuffer(audioBuffer, completionCallbackType: .dataConsumed)
    }
}
```
We are in the process of migrating to Swift 6 concurrency and have moved a lot of our audio code onto a global actor. So now we have something along the lines of:
```swift
import AVFoundation

@globalActor public actor AudioActor: GlobalActor {
    public static let shared = AudioActor()
}

@AudioActor
class LoopingBuffer {
    private var playerNode: AVAudioPlayerNode
    private var audioFile: AVAudioFile

    init(playerNode: AVAudioPlayerNode, audioFile: AVAudioFile) {
        self.playerNode = playerNode
        self.audioFile = audioFile
    }

    func scheduleBuffer(_ frames: AVAudioFrameCount) async {
        let audioBuffer = AVAudioPCMBuffer(
            pcmFormat: audioFile.processingFormat,
            frameCapacity: frames
        )!
        try! audioFile.read(into: audioBuffer, frameCount: frames)
        await playerNode.scheduleBuffer(audioBuffer, completionCallbackType: .dataConsumed)
    }
}
```
Unfortunately this now causes an error:
```
error: sending 'audioBuffer' risks causing data races
note: sending global actor 'AudioActor'-isolated 'audioBuffer' to nonisolated instance method 'scheduleBuffer(_:completionCallbackType:)' risks causing data races between nonisolated and global actor 'AudioActor'-isolated uses
```
I understand the error; what I don't understand is how I can safely use this API.

- `AVAudioPlayerNode` is not marked as `@MainActor`, so it seems like it should be safe to schedule a buffer from a custom actor as long as we don't send it anywhere else. Is that right?
- `AVAudioPCMBuffer` is not `Sendable`, so I don't think it's possible to make this call site ever work from an isolated context. Even forgetting about the custom actor, if you instead annotate the class with `@MainActor`, the same error is still present.
- I think the `AVAudioPlayerNode.scheduleBuffer()` function should have a `sending` annotation to make clear that the buffer can't be used after it's sent. I think that would make the compiler happy, but I'm not certain.
- Am I overlooking something, holding it wrong, or is this API just pretty much unusable in Swift 6?
- My current workaround is just to import `AVFoundation` with `@preconcurrency` (shown below), but it feels dirty, and I am worried there may be a real issue here that I am missing in doing so.
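For completeness, the workaround in that last point is just the attributed import below; it downgrades the strict-concurrency diagnostics that involve declarations imported from AVFoundation, including the one above.

```swift
// Current workaround: treat AVFoundation's declarations as if they predate
// strict concurrency, which silences the Sendable diagnostics involving
// its types across this file.
@preconcurrency import AVFoundation
```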
`AVAudioPCMBuffer`s are essentially wrappers around `AudioBufferList`s, exposing the latter's data pointers. In this sense they are mutable reference types, and therefore not inherently `Sendable`. If your implementation is taking care of isolating the buffers from data races, then marking them as `@unchecked Sendable` is already the mechanism to tell the compiler exactly that. It is arguably better than marking all of AVFoundation with `@preconcurrency`.
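As a concrete sketch of that suggestion (assuming you have audited that each buffer is only ever touched from one isolation domain at a time, and that your toolchain supports the `@retroactive` attribute), the conformance can be declared in your own module:

```swift
import AVFoundation

// Sketch only: this asserts to the compiler that our code guarantees no
// concurrent access to a buffer from different isolation domains. The
// compiler cannot verify that claim, so the audit is entirely on us.
// `@retroactive` is needed because both the type and the protocol are
// declared in other modules.
extension AVAudioPCMBuffer: @retroactive @unchecked Sendable {}
```

Scoping the unchecked assertion to the one type you have audited keeps strict-concurrency checking intact for the rest of AVFoundation, which is the advantage over a module-wide `@preconcurrency` import.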