Error Multitrack Audio AVAudioEngine SwiftUI

Hi, I'm trying to create a multitrack audio player with AVAudioEngine in SwiftUI. It's basically 4 individual tracks (drums, bass, keys, click) playing at the same time. I've tried to build it using documentation I've found on Google, but I'm running into the following error: "Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: [_nodes containsObject: node1] && [_nodes containsObject: node2]'". Can you please help me?


As I'm new to coding, don't hesitate to point out anything else I'd need to change so this would be able to play sound.


import SwiftUI
import AVFoundation

class AudioMixer {
  let engine = AVAudioEngine()
  let player = AVAudioPlayer()
  let drumPlayer = AVAudioPlayerNode()
  let bassPlayer = AVAudioPlayerNode()
  let keysPlayer = AVAudioPlayerNode()
  let clickPlayer = AVAudioPlayerNode()
  let mixer = AVAudioMixerNode()


  init() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
      try audioSession.setCategory(AVAudioSession.Category.playback, options: .mixWithOthers)
      try audioSession.setActive(true)
    } catch {
      print(error.localizedDescription)
    }

    if let drumAudioPath = Bundle.main.path(forResource: "Drums1", ofType: "mp3") {
      let drumsAudioUrl = URL(fileURLWithPath: drumAudioPath)
      let drumsAudioFile = try? AVAudioFile(forReading: drumsAudioUrl)
      engine.connect(drumPlayer, to: mixer, format: drumsAudioFile?.processingFormat)
      drumPlayer.scheduleFile(drumsAudioFile!, at: nil, completionHandler: nil)
    }
    drumPlayer.play()
    engine.attach(drumPlayer)

    if let bassAudioPath = Bundle.main.path(forResource: "Bass1", ofType: "mp3") {
      let bassAudioUrl = URL(fileURLWithPath: bassAudioPath)
      let bassAudioFile = try? AVAudioFile(forReading: bassAudioUrl)
      engine.connect(bassPlayer, to: mixer, format: bassAudioFile?.processingFormat)
      bassPlayer.scheduleFile(bassAudioFile!, at: nil, completionHandler: nil)
    }
    bassPlayer.play()
    engine.attach(bassPlayer)

    if let keysAudioPath = Bundle.main.path(forResource: "Keys1", ofType: "mp3") {
      let keysAudioUrl = URL(fileURLWithPath: keysAudioPath)
      let keysAudioFile = try? AVAudioFile(forReading: keysAudioUrl)
      engine.connect(keysPlayer, to: mixer, format: keysAudioFile?.processingFormat)
      keysPlayer.scheduleFile(keysAudioFile!, at: nil, completionHandler: nil)
    }
    keysPlayer.play()
    engine.attach(keysPlayer)

    if let clickAudioPath = Bundle.main.path(forResource: "Click1", ofType: "mp3") {
      let clickAudioUrl = URL(fileURLWithPath: clickAudioPath)
      let clickAudioFile = try? AVAudioFile(forReading: clickAudioUrl)
      engine.connect(clickPlayer, to: mixer, format: clickAudioFile?.processingFormat)
      clickPlayer.scheduleFile(clickAudioFile!, at: nil, completionHandler: nil)
    }
    clickPlayer.play()
    engine.attach(clickPlayer)

    engine.attach(mixer)

    engine.prepare()
    do {
      try engine.start()
    } catch {
      print(error.localizedDescription)
    }
  }
}

Replies

Check out Bob McCune's AudioLooper example app:

https://github.com/tapharmonic/Learning-AV-Foundation-Swift


McCune writes incredibly spare and focused programs. He also specializes in audio programming. If you take the time to study what he has done with AudioLooper, you will not only get the benefit of code that runs, but also learn a lot about how to efficiently structure these types of applications.
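That said, the immediate cause of the exception in the posted code is ordering: each AVAudioPlayerNode is connected (and told to play) before it is attached to the engine, the mixer is attached last, and the mixer is never connected to the engine's output. A minimal sketch of the corrected order for one track — assuming the "Drums1.mp3" file name from the question, and untested here since AVAudioEngine needs a real audio device — might look like:

```swift
import AVFoundation

let engine = AVAudioEngine()
let drumPlayer = AVAudioPlayerNode()
let mixer = AVAudioMixerNode()

// 1. Attach every node to the engine BEFORE connecting anything.
engine.attach(drumPlayer)
engine.attach(mixer)

if let url = Bundle.main.url(forResource: "Drums1", withExtension: "mp3"),
   let file = try? AVAudioFile(forReading: url) {
    // 2. Connect player -> mixer -> main mixer, using the file's format.
    engine.connect(drumPlayer, to: mixer, format: file.processingFormat)
    engine.connect(mixer, to: engine.mainMixerNode, format: file.processingFormat)

    engine.prepare()
    do {
        // 3. Start the engine before calling play() on any player node.
        try engine.start()
        drumPlayer.scheduleFile(file, at: nil, completionHandler: nil)
        drumPlayer.play()
    } catch {
        print(error.localizedDescription)
    }
}
```

The same attach → connect → start → schedule/play order applies to the other three player nodes; attaching and connecting all four up front and only calling play() after a single engine.start() also keeps the tracks in sync.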

  • McCune's AudioLooper is in Objective-C; the question was about Swift
