Does iOS mix audio files differently to Android?

I have come across an issue where, in the app, we use two AVAudioPlayerNode instances to play two audio files, both connected to the AVAudioEngine.mainMixerNode. The output of the two files is not at the same volume: one file plays noticeably softer than the other, so only one of them is clearly audible.


func startDebugPlayback(withDigits digits: [Int], atDigitVolume digitVol: Int, atNoiseVolume noiseVol: Int, loop: Bool) {
        let noiseFileStr = self.config.noiseFiles[abs(noiseVol / 2)]
        print(noiseFileStr)
        noiseFileUrl = Bundle.main.url(forResource: noiseFileStr, withExtension: "m4a")
        
        digits.forEach { (digit) in
            let digitFileStr = self.config.digitFiles[digit][abs(digitVol / 2)]
            print(digitFileStr)
            digitFileUrl = Bundle.main.url(forResource: digitFileStr, withExtension: "m4a")
        }

        // The format for both audio files is the same: sampleRate: 44100, channels: 2, interleaved: true
        
        self.audioEngine.attach(noisePlayer)
        self.audioEngine.connect(noisePlayer, to: audioEngine.mainMixerNode, format: noiseAudioFormat)
        self.audioEngine.attach(debugPlayer)
        self.audioEngine.connect(debugPlayer, to: audioEngine.mainMixerNode, format: digitAudioFormat)
        
        self.audioEngine.prepare()
        
        do {
            try self.audioEngine.start()
        } catch let error {
            print("Error starting up engine: \(error.localizedDescription)")
        }
        
        guard let noiseBuffer = self.noiseFileBuffer else { print("Error with noise file buffer"); return }
        // Loop the noise buffer continuously while the digits play.
        noisePlayer.scheduleBuffer(noiseBuffer, at: nil, options: .loops, completionHandler: nil)
        
        // Match the mixer and both players to the current hardware output volume.
        audioEngine.mainMixerNode.volume = AVAudioSession.sharedInstance().outputVolume
        noisePlayer.volume = AVAudioSession.sharedInstance().outputVolume
        debugPlayer.volume = AVAudioSession.sharedInstance().outputVolume
        
        /// Set up the debug player; the count and step should match the digits array received.
        scheduleDigitBufferForDebug(loop: loop)


        audioEngine.prepare()
            
        debugPlayer.play()
        noisePlayer.play()
    }
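
For reference, here is a stripped-down sketch of the same topology, with placeholder resource names ("noise" and "digit"), scheduleFile(_:at:completionHandler:) instead of pre-read buffers, and all volumes pinned to 1.0 rather than the session's outputVolume, just to make the mixing path easier to reason about:

import AVFoundation

// Kept alive for the duration of playback.
let engine = AVAudioEngine()
let noiseNode = AVAudioPlayerNode()
let digitNode = AVAudioPlayerNode()

func connectPlayer(_ player: AVAudioPlayerNode, resource: String) throws -> AVAudioFile {
    guard let url = Bundle.main.url(forResource: resource, withExtension: "m4a") else {
        throw NSError(domain: "Playback", code: 1)
    }
    let file = try AVAudioFile(forReading: url)
    engine.attach(player)
    // Connect with the file's own processingFormat so no implicit
    // format conversion sits between the player and the mixer input.
    engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
    return file
}

func startComparisonPlayback() throws {
    let noiseFile = try connectPlayer(noiseNode, resource: "noise")   // placeholder name
    let digitFile = try connectPlayer(digitNode, resource: "digit")   // placeholder name

    // Full gain on both players and the mixer; leave the hardware volume to the user.
    noiseNode.volume = 1.0
    digitNode.volume = 1.0
    engine.mainMixerNode.volume = 1.0

    try engine.start()
    noiseNode.scheduleFile(noiseFile, at: nil, completionHandler: nil)
    digitNode.scheduleFile(digitFile, at: nil, completionHandler: nil)
    noiseNode.play()
    digitNode.play()
}

With this reduced setup I would expect both files to come out at their recorded levels relative to each other.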


Is there any prioritization of inputs on the AVAudioMixerNode?
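
From what I understand the mixer just sums its inputs and has no per-input priority; the only per-input gain should come from the source node's AVAudioMixing properties. This is the kind of sanity check I would add to confirm both players report the same gains (names are the ones from the snippet above):

// AVAudioPlayerNode adopts AVAudioMixing, so volume and pan are the
// only per-input gains feeding the main mixer.
print("main mixer volume:", audioEngine.mainMixerNode.volume)
print("noise player volume:", noisePlayer.volume, "pan:", noisePlayer.pan)
print("debug player volume:", debugPlayer.volume, "pan:", debugPlayer.pan)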


I have tried attaching each player to its own AVAudioMixerNode before connecting it to the mainMixerNode. (No difference.)

I have also tried attaching both players to a single intermediate mixer before connecting it to the mainMixerNode, as sketched below. (No difference.)
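
The second attempt, with both players routed through a single intermediate mixer, was wired roughly like this (reusing the player nodes and formats from the snippet above):

let subMixer = AVAudioMixerNode()
audioEngine.attach(subMixer)
audioEngine.attach(noisePlayer)
audioEngine.attach(debugPlayer)

// Both players feed the intermediate mixer, which in turn feeds the main mixer.
audioEngine.connect(noisePlayer, to: subMixer, format: noiseAudioFormat)
audioEngine.connect(debugPlayer, to: subMixer, format: digitAudioFormat)
audioEngine.connect(subMixer, to: audioEngine.mainMixerNode, format: nil)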


If we get iOS and Android to play the exact same files, there is a clear difference: on iOS one of the files is played noticeably softer than the other.


Is there a setting or something I'm missing when mixing the two audio files?