Posts

Post not yet marked as solved
4 Replies
726 Views
I use the Network framework API for my use case, which is the following:

1. Start a UDP connection on a specific local port.
2. Send UDP data to a host (A).
3. Optionally receive data from this host (A).
4. Receive data, on the same local port as above, from another host (B). This is a must-have.

Is it possible to build this scenario with the Network framework? With my current code I get no data from the other host (B); I can only receive data from host (A). Here is my code:

let endpoint = NWEndpoint.hostPort(host: NWEndpoint.Host("1.2.3.4"), port: NWEndpoint.Port(rawValue: 35000)!)
let params = NWParameters(dtls: nil, udp: .init())
params.requiredLocalEndpoint = NWEndpoint.hostPort(host: .ipv4(.any), port: 40000)
connection = NWConnection(to: endpoint, using: params)
connection.start(queue: queue)

// After the connection is ready (in the state update handler),
// I start receiving like this:
connection.receiveMessage { (data, _, isComplete, error) in
    // ...
}

This code works fine for steps 1-3 of my scenario, but not for step 4. IMHO the Network framework creates a point-to-point connection to the remote side, but I want a point-to-multipoint setup on my local side (receiving data from two or more hosts). Is this possible?
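For illustration, a minimal sketch (untested, with placeholder values) of the usual point-to-multipoint alternative: bind an NWListener to the local port, so every remote peer that sends to it surfaces as its own inbound connection.

import Network

// Sketch: listen on local UDP port 40000; hosts A and B each show up
// as their own inbound NWConnection. The port is a placeholder.
let queue = DispatchQueue(label: "udp.listener")
let listener = try! NWListener(using: .udp, on: 40000)

listener.newConnectionHandler = { inbound in
    inbound.start(queue: queue)
    inbound.receiveMessage { data, _, _, _ in
        if let data = data {
            print("Got \(data.count) bytes from \(inbound.endpoint)")
        }
    }
}
listener.start(queue: queue)

Each inbound connection can also send, so replies to host A can go back over the same flow.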
Posted by Tim8872. Last updated.
Post not yet marked as solved
2 Replies
1.6k Views
I want to stream audio on iOS and use AVAudioEngine for that use case, but currently I'm not really sure what the best solution for my problem is. I get RTP data from the network and want to play this audio data back with AVAudioEngine. I use the iOS Network framework to receive the network data; first I decode my voice data, and now I want to play it back. Here is my receive code:

connection.receiveMessage { (data, context, isComplete, error) in
    if isComplete {
        // Decode the raw network data with the G.711 audio codec.
        let decodedData = AudioDecoder.decode(enc: data, frames: 160)

        // Create a PCM buffer for the audio data to play back.
        let format = AVAudioFormat(settings: [AVFormatIDKey: NSNumber(value: kAudioFormatALaw),
                                              AVSampleRateKey: 8000,
                                              AVNumberOfChannelsKey: 1])
        // let format = AVAudioFormat(standardFormatWithSampleRate: 8000, channels: 1)
        let buffer = AVAudioPCMBuffer(pcmFormat: format!, frameCapacity: 160)!
        buffer.frameLength = buffer.frameCapacity

        // TODO: copy decodedData into buffer (AVAudioPCMBuffer).

        if error == nil {
            // Call receive() again for the next message.
            self.receive(on: connection)
        }
    }
}

How do I copy my decoded data into the AVAudioPCMBuffer? Currently my AVAudioPCMBuffer is created, but it does not contain any audio data. Background information: my general approach would be to cache the PCM buffer (at the TODO line above) in a collection and play that collection back with AVAudioEngine (on a background thread). My decoded linear data is cached in an array of type Int16, so the variable decodedData is of type [Int16]. Maybe there is a possibility to consume this data directly? The function scheduleBuffer only accepts AVAudioPCMBuffer as input.
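For the TODO line, a minimal sketch of one way to fill the buffer, assuming a plain .pcmFormatInt16 format (an A-law format buffer exposes no int16ChannelData, so that assumption matters):

import AVFoundation

// Sketch: copy decoded Int16 samples into an AVAudioPCMBuffer.
// Assumes 8 kHz mono linear PCM, 160 frames per packet as above.
func makeBuffer(from samples: [Int16]) -> AVAudioPCMBuffer? {
    guard !samples.isEmpty,
          let format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                     sampleRate: 8000,
                                     channels: 1,
                                     interleaved: false),
          let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(samples.count))
    else { return nil }

    buffer.frameLength = AVAudioFrameCount(samples.count)
    samples.withUnsafeBufferPointer { src in
        // int16ChannelData is non-nil for .pcmFormatInt16 buffers.
        buffer.int16ChannelData![0].update(from: src.baseAddress!,
                                           count: samples.count)
    }
    return buffer
}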
Posted by Tim8872. Last updated.
Post marked as solved
10 Replies
2.0k Views
Hello, is it possible to get the local socket port of an NWConnection? I open a network connection and want to find out which port the created socket uses. My example code:

let queue = DispatchQueue(label: "UDP data thread")
let connection = NWConnection(host: "1.2.3.4", port: 40000, using: .udp)
connection.start(queue: queue)

IMHO it's totally simple code. Now I want to know the socket port used by my connection. Where can I read this information? Many thanks.
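For reference, a minimal sketch of where this can be read: the connection's current path exposes its local endpoint once the connection reaches .ready.

// Sketch: set the handler before calling start(queue:).
connection.stateUpdateHandler = { state in
    if case .ready = state,
       let local = connection.currentPath?.localEndpoint,
       case let .hostPort(host, port) = local {
        print("Local socket is \(host):\(port)")
    }
}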
Posted by Tim8872. Last updated.
Post not yet marked as solved
0 Replies
538 Views
Hello, I want to switch my audio session to the external speaker. On iOS 12 it works fine, but not on iOS 13. Here is my audio setup code:

do {
    try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: .mixWithOthers)
    try AVAudioSession.sharedInstance().overrideOutputAudioPort(.speaker)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error)
}

The code above works fine: I start my audio session with the external speaker on the device. Now I want to switch the audio from the external speaker to the internal speaker during an active session. For that I use this simple line:

try AVAudioSession.sharedInstance().overrideOutputAudioPort(.none)

The result: on iOS 12 it works fine, but not on iOS 13. Is there an API change or something else? What is my mistake? Maybe I have to use another API from Objective-C, or a deeper one from the C layer? Or work with AudioUnit here? For playback I now use AVAudioEngine (after the speaker switch it plays no audio).
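To narrow down whether iOS 13 ignores the override or AVAudioEngine just stops rendering, here is a small diagnostic sketch that logs the active output route on every route change:

import AVFoundation

// Sketch: log which output port is actually active after each
// route change (speaker, receiver, headphones, ...).
NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil,
    queue: .main) { _ in
    for output in AVAudioSession.sharedInstance().currentRoute.outputs {
        print("Output: \(output.portType.rawValue) - \(output.portName)")
    }
}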
Posted by Tim8872. Last updated.
Post not yet marked as solved
0 Replies
979 Views
I want to play back audio data received from the network. My incoming PCM audio data has the format Int16, 1 channel, sample rate 8000, 160 bytes per packet. Now I'm not sure which audio format iOS supports on the speaker side. IMHO I have to work with Float32 and a sample rate of 44,100 or 48,000, is that right? So I think I have to convert my Int16 linear PCM data to Float32. Maybe I also have to resample the data from 8 kHz to 48 kHz, I'm not sure (maybe the hardware does it). Could someone help me? Here is my current code, where I build the AVAudioPCMBuffer:

func convertInt16ToFloat32(_ data: [Int16]) -> AVAudioPCMBuffer {
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat!, frameCapacity: 160)!

    // Scale each Int16 sample into the interval [-1, 1].
    let div: Float32 = 1.0 / 32768.0
    for i in 0..<data.count {
        // Convert the sample value (not the loop index) to Float32.
        audioBuffer.floatChannelData?.pointee[i] = div * Float32(data[i])
    }
    audioBuffer.frameLength = audioBuffer.frameCapacity
    return audioBuffer
}

And on the other side I play back the created AVAudioPCMBuffer in my AVAudioEngine:

func playFromNetwork(data: [Int16]) {
    // data: linear PCM Int16, sample rate 8000, 160 bytes
    let audio = convertInt16ToFloat32(data)
    // Play back the converted data on the AVAudioPlayerNode.
    self.playerNode!.scheduleBuffer(audio, completionHandler: nil)
    Logger.Audio.log("Play audio data .....")
}

Here is my setup for AVAudioEngine:

func initAudio() {
    try! AVAudioSession.sharedInstance().setActive(true)
    try! AVAudioSession.sharedInstance().setCategory(.playback)
    let outputFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)
    engine = AVAudioEngine()
    playerNode = AVAudioPlayerNode()
    engine!.attach(playerNode!)
    engine!.connect(playerNode!, to: engine!.mainMixerNode, format: outputFormat)
    engine!.prepare()
    try! engine!.start()
    playerNode!.play()
}
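On the resampling question: connecting the player node with an 8 kHz format lets the engine's mixer handle the rate conversion, but the Int16-to-Float32 conversion still has to happen somewhere. A hedged sketch of doing both steps explicitly with AVAudioConverter (the formats are assumptions taken from the post):

import AVFoundation

// Sketch: 8 kHz Int16 mono -> 48 kHz Float32 mono via AVAudioConverter.
// The 48 kHz target is an assumption; match it to your output format.
let inFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                             sampleRate: 8000, channels: 1,
                             interleaved: false)!
let outFormat = AVAudioFormat(standardFormatWithSampleRate: 48000,
                              channels: 1)!
let converter = AVAudioConverter(from: inFormat, to: outFormat)!

func resample(_ input: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    let ratio = outFormat.sampleRate / inFormat.sampleRate
    let capacity = AVAudioFrameCount(Double(input.frameLength) * ratio)
    guard let output = AVAudioPCMBuffer(pcmFormat: outFormat,
                                        frameCapacity: capacity) else { return nil }
    var consumed = false
    var error: NSError?
    converter.convert(to: output, error: &error) { _, outStatus in
        // Hand the input buffer over exactly once per call.
        if consumed {
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return input
    }
    return error == nil ? output : nil
}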
Posted by Tim8872. Last updated.
Post not yet marked as solved
0 Replies
2.3k Views
Hello, I want to record and play back audio on my iOS device. Can someone give me a short hint which classes are useful for this use case? I have googled around this topic a lot and read a lot of different things, but I'm not really sure which classes are the right ones for me. Examples: AVAudioSinkNode, AVAudioSourceNode, AVAudioSession, AVAudioRecorder, AVAudioPlayer, AVAudioQueue, etc. Maybe someone could show me a code snippet or sample project (if possible). My current state: I would try it with AVAudioRecorder and AVAudioPlayer.

My recorder class logic:

if let dir = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first {
    let file = dir.appendingPathComponent("audiofile.wav")
    // setup audio session with crazy framework
    let format: [String: Any] = [AVSampleRateKey: 44100.0,
                                 AVNumberOfChannelsKey: 2,
                                 AVFormatIDKey: kAudioFormatLinearPCM,
                                 AVEncoderBitRateKey: 320000,
                                 AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue]
    recorder = try AVAudioRecorder(url: file, settings: format)
    recorder!.prepareToRecord()
}

My playback class logic:

if let dir = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first {
    let file = dir.appendingPathComponent("audiofile.wav")
    // let audioSession = AVAudioSession.sharedInstance()
    // try audioSession.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
    player = try AVAudioPlayer(contentsOf: file.standardizedFileURL)
    player!.prepareToPlay()
}

In the iOS Simulator this code works really fine, but not on my iOS device (iPhone). The record permission is available and valid, but I get no sound on my device.
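One device-only difference worth checking (an assumption, since the Simulator ignores most of this): the audio session. Without an explicit category, playback can land on the quiet receiver or be silenced by the ring/silent switch. A minimal sketch:

import AVFoundation

// Sketch: configure the session for record + playback and route
// output to the loudspeaker before recording or playing.
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default,
                            options: [.defaultToSpeaker])
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}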
Posted by Tim8872. Last updated.
Post marked as solved
1 Reply
1.2k Views
Hey guys, is the RTP protocol directly available on iOS? Or do I have to implement it myself? Or use a third-party lib?
Posted by Tim8872. Last updated.
Post not yet marked as solved
5 Replies
1k Views
Hello, I want to record and play back audio on my iOS device. Can someone give me a short hint which classes are useful for this use case? I have googled around this topic a lot and read a lot of different things, but I'm not really sure which classes are the right ones for me. Examples: AVAudioSinkNode, AVAudioSourceNode, AVAudioSession, AVAudioRecorder, AVAudioQueue, etc. Maybe someone could show me a code snippet or sample project (if possible).
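As one possible starting point (a sketch, not a recommendation): AVAudioEngine can cover the recording side with a tap on its input node. The file name here is a placeholder:

import AVFoundation

// Sketch: record the microphone input to a CAF file with an
// AVAudioEngine input-node tap.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)
let url = FileManager.default.temporaryDirectory
    .appendingPathComponent("capture.caf")
let file = try! AVAudioFile(forWriting: url, settings: format.settings)

input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    // Append each captured buffer to the file.
    try? file.write(from: buffer)
}
try! engine.start()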
Posted by Tim8872. Last updated.
Post marked as solved
7 Replies
4.3k Views
Hello, I want to create a TCP socket that also keeps running in the background, for example when the display is switched off. Currently my socket is closed instantly when the display is off (same if I minimize the app). What possibilities do we have to create this special socket on iOS? My background: I built a special VoIP app which holds a connection to the server at all times. There is a heartbeat mechanism which cyclically checks the connection between client and server. Many thanks in advance!
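For context, a sketch of the direction Apple steers VoIP apps in instead of persistent background sockets (assuming a server that can send VoIP pushes): register with PushKit and reconnect when the app is woken.

import PushKit

// Sketch: iOS suspends app sockets in the background; VoIP apps
// typically receive a PushKit VoIP push and reconnect on wake.
final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    let registry = PKPushRegistry(queue: .main)

    override init() {
        super.init()
        registry.delegate = self
        registry.desiredPushTypes = [.voIP]
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Send pushCredentials.token to the VoIP server.
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // Reopen the TCP connection here (and report the call to CallKit).
        completion()
    }
}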
Posted by Tim8872. Last updated.
Post marked as solved
8 Replies
5.2k Views
Hello, I have one or two questions about sockets on iOS. I have to work with UDP and TCP sockets on iOS. Which API or framework would you suggest to solve this problem? Working with low-level BSD sockets? Using a high-level framework, and if so, which one (there are a lot)? Using special OS classes with more support? I don't know what the best choice for my issue is. My requirements:

- TCP and UDP needed
- NO_DELAY support needed
- TCP socket with a long-term connection (not only a few seconds)
- Usable from Swift (if possible)

One more hint from me: currently I have worked with the NWConnection and DispatchQueue APIs. I have liked these classes so far, but I have found no solution for the NO_DELAY issue that worked. This was my test to set the NO_DELAY flag for the TCP socket:

let tcp = NWProtocolTCP.Options()
tcp.noDelay = true
let params = NWParameters(tls: nil, tcp: tcp)
connection = NWConnection(to: NWEndpoint.hostPort(host: ipAddress, port: TcpSocket.PORT), using: params)

Is that right? Would be fine and great to hear from you. Thanks a lot!
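For what it's worth, a self-contained sketch of the same noDelay pattern with a state handler and a test send (host and port are placeholders):

import Network

// Sketch: TCP connection with TCP_NODELAY enabled; "1.2.3.4" and
// 40000 are placeholders.
let tcpOptions = NWProtocolTCP.Options()
tcpOptions.noDelay = true
let params = NWParameters(tls: nil, tcp: tcpOptions)
let connection = NWConnection(host: "1.2.3.4", port: 40000, using: params)
let queue = DispatchQueue(label: "tcp.queue")

connection.stateUpdateHandler = { state in
    if case .ready = state {
        // Small writes should go out immediately with noDelay set.
        connection.send(content: "ping".data(using: .utf8),
                        completion: .contentProcessed { error in
            if let error = error { print("send failed: \(error)") }
        })
    }
}
connection.start(queue: queue)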
Posted by Tim8872. Last updated.