Hi,
I'm building an app that lets you stream video and audio from IP cameras; it uses the RTSP and RTP protocols to pull the data from the device.
The audio arrives as packets encoded with G.711 PCMU (µ-law, 8 kHz, mono), and I'm trying to play them back in the app.
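On the receive side I first expand each µ-law byte to linear PCM16 through a 256-entry lookup table; this is the ulawToPcm16 helper used in the scheduling code below. Here is a simplified sketch of it, assuming the standard ITU-T G.711 µ-law expansion (my real table is built once and kept in self.table):

```swift
import Foundation

// Build the 256-entry µ-law → Int16 lookup table (ITU-T G.711 expansion).
let table: [Int16] = (0...255).map { byte in
    let u = ~UInt8(byte)                 // µ-law bytes are stored bit-complemented
    let exponent = Int((u >> 4) & 0x07)  // 3-bit segment number
    let mantissa = Int(u & 0x0F)         // 4-bit step within the segment
    let magnitude = (((mantissa << 3) + 0x84) << exponent) - 0x84
    return (u & 0x80) != 0 ? Int16(-magnitude) : Int16(magnitude)
}

// Decode a packet's µ-law payload into linear PCM16 samples.
func ulawToPcm16(table: [Int16], source: Data) -> [Int16] {
    source.map { table[Int($0)] }
}
```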
Right now I have a basic implementation built on AVAudioEngine and AVAudioPlayerNode, but playback doesn't work properly: the output is full of crackling noises.
Setup code for AVAudioEngine:
```swift
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                           sampleRate: 8000,
                           channels: 1,
                           interleaved: false)!

engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: format)
engine.prepare()
try? engine.start()
playerNode.play()
```
Scheduling buffers to AVAudioPlayerNode:
```swift
var description = AudioStreamBasicDescription(mSampleRate: 8000,
                                              mFormatID: kAudioFormatULaw,
                                              mFormatFlags: 0,
                                              mBytesPerPacket: 160,
                                              mFramesPerPacket: 1,
                                              mBytesPerFrame: 160,
                                              mChannelsPerFrame: 1,
                                              mBitsPerChannel: 8,
                                              mReserved: 0)
let format = AVAudioFormat(streamDescription: &description)!

let values = ulawToPcm16(table: self.table, source: packet.data)
let data = Data(bytes: values, count: values.count)

if let buffer = data.makePCMBuffer(format: format) {
    playerNode.scheduleBuffer(buffer, completionHandler: nil)
}
```
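For completeness, makePCMBuffer is a small Data extension that copies the raw bytes into an AVAudioPCMBuffer of the given format, roughly like this (simplified sketch, error handling trimmed):

```swift
import AVFoundation

extension Data {
    // Wrap the receiver's bytes in an AVAudioPCMBuffer of the given format.
    func makePCMBuffer(format: AVAudioFormat) -> AVAudioPCMBuffer? {
        let bytesPerFrame = Int(format.streamDescription.pointee.mBytesPerFrame)
        guard bytesPerFrame > 0 else { return nil }
        let frameCount = AVAudioFrameCount(count / bytesPerFrame)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else {
            return nil
        }
        buffer.frameLength = frameCount  // also updates mDataByteSize underneath
        let dst = buffer.audioBufferList.pointee.mBuffers
        let toCopy = min(count, Int(dst.mDataByteSize))
        withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
            if let base = raw.baseAddress, let mData = dst.mData {
                mData.copyMemory(from: base, byteCount: toCopy)
            }
        }
        return buffer
    }
}
```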
Is there anything I'm doing wrong here, or should I be approaching this differently?