Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation

Call cannot be disconnected due to delay observed in AudioOutputUnitStop API
We have a VoIP calling application that releases audio resources at the end of a call. When the AudioOutputUnitStop API is invoked, it sometimes takes up to 700 ms to return. If we comment out that call as a test, then the AudioUnitUninitialize API takes up to 700 ms instead. Once the cleanup is done, the application sends a SIP BYE message as part of the call flow. So in cases where the API takes more than 200 ms, the BYE is delayed by that much and gets blocked by the server's DDoS settings. (The DDoS timer starts as soon as the UDP socket is disconnected from the client and times out after 200 ms; a BYE request arriving after that is blocked.) We need to understand why a delay of more than 200 ms is sometimes observed, while in other cases the call returns in under 50 ms.
Replies: 2 · Boosts: 0 · Views: 230 · Sep ’24
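An editor's sketch, not from the thread: assuming the SIP BYE has no dependency on the audio teardown, sending it before AudioOutputUnitStop keeps signaling inside the 200 ms window regardless of how long the stop takes, and the timing code instruments the variance being asked about. `endCall` and `sendBye` are hypothetical names.

```swift
import AudioToolbox
import Foundation

// Hedged sketch: signal the server before the potentially slow teardown,
// assuming the BYE does not require the audio unit to be stopped first.
func endCall(audioUnit: AudioUnit, sendBye: () -> Void) {
    sendBye()  // BYE goes out well inside the 200 ms window

    let start = CFAbsoluteTimeGetCurrent()
    AudioOutputUnitStop(audioUnit)        // occasionally blocks up to ~700 ms
    AudioUnitUninitialize(audioUnit)
    AudioComponentInstanceDispose(audioUnit)
    let elapsedMs = (CFAbsoluteTimeGetCurrent() - start) * 1000

    // Instrument the variance the post asks about.
    print("Audio teardown took \(elapsedMs) ms")
}
```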
How To Play Audio Through Headphones on watchOS 11?
I have an app that plays audio, and its behaviour has changed in watchOS 11: I can no longer figure out how to play the audio through headphones. To play audio I do:

```swift
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playback, mode: .default, policy: .longFormAudio, options: [])
let activated = try await session.activate()
if activated {
    // play audio
}
```

In previous versions, `try await session.activate()` would bring up a route picker where the user could select their headphones. On watchOS 11 it just plays the audio out of the speaker. Maybe that's what some people want, but if they want it to play through the headphones, I can't see how to offer that option now; there's no AVRoutePickerView available on watchOS for selecting a route. I've tried setting the category to .multiRoute instead of .playback, and that does bring up the picker, but selecting the speaker results in an error code, and selecting the headphones results in it saying it cannot find them (which shouldn't be the case, since Apple Music on watchOS finds them). I also tried overriding the output with try session.overrideOutputAudioPort(.speaker), but the compiler complains that .speaker isn't available on watchOS, which is strange because, if I understand correctly, it's now possible to play through the speaker on at least some Apple Watch models. So is this a bug, or is there some way I've not found of playing audio through the headphones?
Replies: 1 · Boosts: 0 · Views: 292 · Sep ’24
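Not an answer to the route-picker question, but a small diagnostic sketch (public AVAudioSession API only) to confirm where watchOS actually routed the audio after activation:

```swift
import AVFAudio

// Diagnostic sketch: log the active output route after activation,
// to confirm whether the system silently picked the speaker.
func logCurrentRoute() {
    let session = AVAudioSession.sharedInstance()
    for output in session.currentRoute.outputs {
        // portType is .bluetoothA2DP for Bluetooth headphones,
        // .builtInSpeaker for the watch speaker.
        print("Output: \(output.portName) (\(output.portType.rawValue))")
    }
}
```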
AVAudioEngine change playback rate in real time (AVAudioUnitVarispeed is not real-time)
I am using AVAudioEngine to play back samples in an iOS game, and I would like to change the playback rate of a sample in real time. When I use AVAudioUnitVarispeed to change the playback rate, it creates stutters in the game because it isn't processed in real time (as stated here: AVAudioUnitTimeEffect). The other option I found is to use an AVAudioEnvironmentNode and change the rate of the AVAudioPlayerNode. That works without creating stutters, but it limits the valid rate values to 0.5 through 2.0, and I need rates higher than 2.0 (see AVAudio3DMixing). Are there any other ways to play back a sample with real-time rate control?
Replies: 0 · Boosts: 0 · Views: 171 · Oct ’24
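To make the trade-off concrete, here is a minimal sketch of the environment-node approach the post describes (the function name is illustrative); it uses only public AVFAudio API but inherits the 0.5–2.0 clamp the post finds limiting:

```swift
import AVFoundation

// AVAudioPlayerNode adopts AVAudio3DMixing, so when it feeds an
// AVAudioEnvironmentNode its `rate` property can be changed in real time,
// though only within the 0.5...2.0 range.
func makeRateControlledPlayer() throws -> (AVAudioEngine, AVAudioPlayerNode) {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let environment = AVAudioEnvironmentNode()

    engine.attach(player)
    engine.attach(environment)

    // 3D-mixed source nodes must feed the environment node in mono.
    let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
    engine.connect(player, to: environment, format: mono)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    try engine.start()
    player.rate = 1.5   // adjustable in real time, clamped to 0.5...2.0
    return (engine, player)
}
```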
AVPlayer Error in iOS 18.0
When attempting to play a video using AVPlayer on iOS 18.0, I am encountering an error that does not occur on versions earlier than 18.0. Could you please advise what might be causing this issue? These are the error codes:

```
Error Domain=AVFoundationErrorDomain Code=-11828
Error Domain=NSOSStatusErrorDomain Code=-12847 "(null)"
```

This is the response I retrieved for the URL using the curl command:

```
HTTP/1.1 200
Content-Disposition: inline;filename="sample.mp4"
Accept-Ranges: bytes
ETag: sample.mp4
Last-Modified: Tue, 20 Jan 1970 23:52:10 GMT
Expires: Mon, 07 Oct 2024 09:15:49 GMT
Content-Range: bytes 0-987561/987561
Content-Type: application/octet-stream
Content-Length: 987561
Date: Mon, 30 Sep 2024 09:15:49 GMT
```
Replies: 2 · Boosts: 0 · Views: 300 · Oct ’24
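A hedged diagnostic sketch from the editor (the URL is a placeholder): observing AVPlayerItem.status surfaces the underlying error chain, and code -11828 maps to AVError.fileFormatNotRecognized, so the application/octet-stream Content-Type in the response above may be worth ruling out.

```swift
import AVFoundation

// Observe the item's status to surface the nested NSError chain.
let item = AVPlayerItem(url: URL(string: "https://example.com/sample.mp4")!)
let player = AVPlayer(playerItem: item)

let statusObservation = item.observe(\.status) { item, _ in
    if item.status == .failed, let error = item.error as NSError? {
        print("Failed: \(error.domain) \(error.code) \(error.userInfo)")
    }
}
player.play()
// Keep statusObservation alive while the item loads.
```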
Apple Music Bug
Since the iOS 18 update, there's been a bug that always occurs when listening to music through Apple Music: when music is playing and the iPhone enters or exits standby mode, the music pauses by itself for about one second. Nothing else about my setup has changed from before the problem appeared.
Replies: 2 · Boosts: 0 · Views: 303 · Oct ’24
Content items not updating when using MediaPlayer API for CarPlay on iOS 18
We are using the MediaPlayer API to provide CarPlay support. Starting in iOS 18 we are having issues updating the content list. The initial list of items populates on a fresh instance, but soon thereafter an error appears saying we are not entitled to "com.apple.mediaremote.external-artwork-validation". From that point onwards, no changes we make to our MPPlayableContentDataSource are reflected in CarPlay, even after restarting the device. While the MediaPlayer API is marked as deprecated, we have been using it to provide CarPlay support going back to iOS 10. Has anyone else run into this, or have suggestions for workarounds?
Replies: 3 · Boosts: 1 · Views: 443 · Sep ’24
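For reference, a hedged sketch of the kind of refresh such an MPPlayableContent-based app would normally perform; in the scenario above, calls like these reportedly stop having any effect once the entitlement error appears (`refreshCarPlayList` is an illustrative name):

```swift
import MediaPlayer

// Nudge CarPlay to re-read the (deprecated) MPPlayableContentDataSource.
func refreshCarPlayList() {
    let manager = MPPlayableContentManager.shared()
    manager.beginUpdates()
    manager.endUpdates()
    manager.reloadData()
}
```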
CarPlay music issues iOS 18.1 (22B5054e)
When my CarPlay connects and tries to play music, it plays it through the vehicle's phone speakers. If I make a phone call and then hang up, it pushes the sound back to the vehicle's stereo speakers. Does anyone have a fix for this? It is very annoying; sometimes it will just switch to the phone speakers again and I have to repeat the whole process.
Replies: 2 · Boosts: 1 · Views: 313 · Sep ’24
Short small starter guide for AVAudioEngine and AVAudioSession on iOS
AVAudioEngine and AVAudioSession

Welcome! I will start off with the term AVAudioEngineImpl::Initialize(NSError**). Why? So that those who run into this issue can find this post through search engines! This is a short breakdown based on what I observed while trying to use these two components; it's not a guide that goes into all the details. If you're trying to figure out how to fix a crash, you may find a common fix in this post.

Is it possible to use AVAudioEngine and AVAudioSession together? Yes, but you will face challenges, mostly with AVAudioEngine. Whatever you're trying to do will take a lot of testing. Something that helped me fix a crash was this sample project by Apple, which uses both AVAudioEngine and AVAudioSession: https://developer.apple.com/documentation/avfaudio/audio_engine/audio_units/using_voice_processing

How can I fix AVAudioEngineImpl::Initialize(NSError**)? It depends. If you're lucky enough to have a crash log, you may find clues, but the stack trace sometimes doesn't help either. Here are the common cases I encountered.

inputNode (https://developer.apple.com/documentation/avfaudio/avaudioengine/1386063-inputnode): you apparently need an inputNode. You need to access it, or else there won't be one, and if there isn't one, AVAudioEngine.start will most likely crash. The audio engine creates this singleton on demand when the property is first accessed, so accessing it up front prevented this common issue for me.

.prepare deallocates state and can cause a crash if you restart your audio engine. Another issue I faced was handling .prepare incorrectly. You don't strictly need .prepare, but if you use installTap or similar, I think you do. Note that a previously initialized inputNode can be gone after using .prepare, so make sure you access AVAudioEngine.inputNode (or whatever node you need) again before calling .start(). The voice-processing project does this by wrapping AVAudioEngine in a controller with a sort of "setup" function, which ensures that everything is ready before .prepare and .start get called.

AVAudioSession's setCategory: you have to experiment with it. The crashes can be very weird; sometimes your app will crash only once, and then again only after you reinstall or relaunch it. You can use .setActive and .setCategory with AVAudioEngine, just don't call .setActive(false) before you've stopped the audio engine, as it will fail. Sometimes I'd run into an issue with .setActive(true), so you really have to experiment to see whether leaving that call out resolves the issue.

```swift
try session.setCategory(.multiRoute, mode: .default, options: [.defaultToSpeaker, .mixWithOthers])
```

Experiment with it, but .multiRoute and .mixWithOthers have allowed me to use AVAudioEngine to make a test recording, and I can even switch the data sources and polar patterns without any issues. Sometimes you can get away without calling .setActive at all; I'm not sure whether AVAudioEngine does it automatically.

Short summary:
- If you use .prepare and then .stop, make sure to re-initialize things like .inputNode before calling .prepare and .start again. (Your case may differ.)
- Only call .setActive(false) after you've called .stop; otherwise I believe it has no chance to stop the engine.
- AVAudioSession's setCategory is important. Use .multiRoute or experiment with the other modes. If you manage to solve your crash, you'll indeed be able to change the data sources, polar patterns, and more.
- Check isRunning before calling .start; this will save you from another crash. If you call .start while the engine is already running, I think try/catch won't save you; you have to ensure you're not starting it twice.

I hope this short breakdown helps you resolve your crash. If you get deeper into AVAudioEngine and AVAudioSession, you'll probably face more crashes that I have yet to figure out. I have a lot of trouble getting my testing app onto my iPhone, so I am sorry this guide doesn't cover every detail. A huge tip from me is to check the documentation: for example, when I read the documentation for inputNode, I learned why my app crashed; I had never accessed and initialized one. The developer documentation can be a bit of a labyrinth, so I strongly recommend reading up on every property you access if you believe it causes issues, and finding example projects like the voice-processing one, since there aren't many code examples in the documentation.
Replies: 0 · Boosts: 0 · Views: 322 · Sep ’24
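Pulling the guide's advice together, a consolidated sketch of the recommended order (class and method names are illustrative; the category line is the one the post reports working):

```swift
import AVFAudio

final class EngineController {
    private let engine = AVAudioEngine()

    func setUpAndStart() throws {
        let session = AVAudioSession.sharedInstance()
        // The category/options combination the post reports working.
        try session.setCategory(.multiRoute, mode: .default,
                                options: [.defaultToSpeaker, .mixWithOthers])
        try session.setActive(true)

        // Touch inputNode so the engine creates it; starting without ever
        // accessing the node is the crash cause the post highlights.
        _ = engine.inputNode.inputFormat(forBus: 0)

        engine.prepare()
        guard !engine.isRunning else { return }  // never start twice
        try engine.start()
    }

    func stopAndDeactivate() throws {
        engine.stop()
        // Deactivate only after the engine has stopped, per the post.
        try AVAudioSession.sharedInstance().setActive(false)
    }
}
```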
AVAudioPlayer init very slow on iOS 18
On Xcode 16 (16A242), app execution and the UI stall / lag as soon as an AVAudioPlayer is initialized:

```swift
let audioPlayer = try AVAudioPlayer(contentsOf: url)
audioPlayer.volume = 1.0
audioPlayer.delegate = self
audioPlayer.prepareToPlay()
```

Typically you would not notice this in a music app, for example, but it is especially noticeable in games where multiple sounds are played using multiple instances of AVAudioPlayer; the entire app slows down because of it. This is similar to this issue from last year. I have reported it to Apple in FB15144369, as it ruins my production games: fps drops to nothing when sounds are enabled. Unfortunately I cannot find a solution. Anyone?
Replies: 2 · Boosts: 0 · Views: 326 · Sep ’24
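A hedged workaround sketch rather than a fix: creating and preparing the players off the main thread at load time keeps any per-init stall out of the frame loop. `preloadPlayers` and `soundURLs` are hypothetical names.

```swift
import AVFoundation

// Build all players up front on a background task so per-init stalls
// never land on the main thread mid-game.
func preloadPlayers(for soundURLs: [URL]) async throws -> [AVAudioPlayer] {
    try await Task.detached(priority: .utility) {
        try soundURLs.map { url in
            let player = try AVAudioPlayer(contentsOf: url)
            player.prepareToPlay()
            return player
        }
    }.value
}
```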
How To Add Multiple Songs/Playlist to the Queue?
A couple of weeks ago I got help here to play one song, and the solution to my problem was that I wasn't adding the song (Track type) to the queue correctly. Now I want to be able to add a whole playlist of songs to the queue, but when I try to add an array of the Track type I get an error. The other part of this issue is: how do I access an individual song from the queue after I add it? I see I can do ApplicationMusicPlayer.shared.queue.currentItem, but I think I'm missing/misunderstanding something here. Anyway, I'll post the code I have to show how I'm attempting this at the moment. In this scenario we're getting passed a playlist from another view.

```swift
import SwiftUI
import MusicKit

struct PlayBackView: View {
    @State var song: Track?
    @State private var songs: [Track] = []
    @State var playlist: Playlist

    private let player = ApplicationMusicPlayer.shared

    var body: some View {
        VStack {
            // Album cover
            HStack(spacing: 20) {
                if let artwork = player.queue.currentEntry?.artwork {
                    ArtworkImage(artwork, height: 100)
                } else {
                    Image(systemName: "music.note")
                        .resizable()
                        .frame(width: 100, height: 100)
                }

                VStack(alignment: .leading) {
                    // Song title
                    Text(player.queue.currentEntry?.title ?? "Song Title Not Found")
                        .font(.title)
                        .fixedSize(horizontal: false, vertical: true)
                }
            }
        }
        .padding()
        .task {
            await loadTracks()

            // It's here I thought I could do something like this:
            //     player.queue = tracks
            // since I can do this with one singular track:
            //     player.queue = [song]

            do {
                try await player.queue.insert(songs, position: .afterCurrentEntry)
            } catch {
                print(error.localizedDescription)
            }
        }
    }

    @MainActor
    private func loadTracks() async {
        do {
            let detailedPlaylist = try await playlist.with([.tracks])
            let tracks = detailedPlaylist.tracks ?? []
            setTracks(tracks)
        } catch {
            print(error.localizedDescription)
        }
    }

    @MainActor
    private func setTracks(_ tracks: MusicItemCollection<Track>) {
        songs = Array(tracks)
    }
}
```
Replies: 1 · Boosts: 0 · Views: 337 · Sep ’24
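A hedged sketch of one way to do this: instead of assigning a [Track] array to the queue, build an ApplicationMusicPlayer.Queue from the track collection (`playAll` is an illustrative name):

```swift
import MusicKit

// Queue a whole playlist's tracks, then play from the first one.
func playAll(from playlist: Playlist) async throws {
    let detailed = try await playlist.with([.tracks])
    guard let tracks = detailed.tracks, let first = tracks.first else { return }

    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: tracks, startingAt: first)
    try await player.play()

    // Individual entries are then reachable through player.queue.entries,
    // and the one playing now through player.queue.currentEntry.
}
```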
Bluetooth speaker makes installTap stop calling back after the first few seconds
If I have a Bluetooth speaker connected and call installTap on the input node, the callback fires for 1-2 seconds and then stops. I don't see any route change or other notification handler called in between.

```swift
engine.inputNode.removeTap(onBus: 0)
engine.inputNode.installTap(
    onBus: 0,
    bufferSize: 4096,
    format: format
) { buffer, _ in
    guard let channelData = buffer.floatChannelData else { return }
    // This callback stops firing after some time.
}
```

Not sure if this is expected, but I've noticed some other applications that seem to work fine. If I remove the Bluetooth device, my input works fine. Also, I have no issues with output on the speaker.
Replies: 2 · Boosts: 0 · Views: 269 · Sep ’24
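A diagnostic sketch (editor's addition) that registers for the route-change and interruption notifications the post expected to fire, to check whether the system is silently reconfiguring the Bluetooth input:

```swift
import AVFAudio
import Foundation

// Watch for route changes and interruptions around the time the tap goes
// quiet. Retain the returned tokens for as long as you want to observe.
func observeSessionEvents() -> [NSObjectProtocol] {
    let center = NotificationCenter.default
    let route = center.addObserver(forName: AVAudioSession.routeChangeNotification,
                                   object: nil, queue: .main) { note in
        let reason = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt
        print("Route change, reason: \(String(describing: reason))")
    }
    let interruption = center.addObserver(forName: AVAudioSession.interruptionNotification,
                                          object: nil, queue: .main) { note in
        print("Interruption: \(note.userInfo ?? [:])")
    }
    return [route, interruption]
}
```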
iOS 18.0 bug - initial AVAudioSession.outputVolume returns zero
Is anyone experiencing an issue with the initial AVAudioSession.sharedInstance().outputVolume returning 0 after updating to iOS 18.0? I'm observing the outputVolume of AVAudioSession, and when I adjust the device volume, the changed value is reported correctly. However, when I call AVAudioSession.sharedInstance().outputVolume to get the current volume before any change has been observed, it returns 0, even though the device volume is not actually 0.
Replies: 1 · Boosts: 1 · Views: 293 · Sep ’24
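A hedged workaround sketch, under the assumption that outputVolume is only refreshed once the audio session is active; whether this addresses the iOS 18.0 behavior is untested:

```swift
import AVFAudio

// Activate the session before the first outputVolume read.
func currentVolume() throws -> Float {
    let session = AVAudioSession.sharedInstance()
    try session.setActive(true)
    return session.outputVolume
}
```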
I/O buffer sizes for audio driver based on IOUserAudioDevice
Dear Sirs,

I've written an audio driver based on IOUserAudioDevice. In my IOOperationHandler I can receive and send the audio samples as expected. Is there any way to configure the number of samples transferred in each call? Currently it seems to be around 512 samples per call, which corresponds to about 10.7 ms at a 48 kHz sample rate. I'd like to achieve something like 48 or 96 samples per call. I did some experiments and tried calls to SetOutputLatency() etc., but so far I haven't found the right way to change the in_io_buffer_frame_size in the callback. I'd like to do this because smaller buffer sizes would allow lower latencies for the subsequent audio processing.

Thanks and best regards, Johannes
Replies: 3 · Boosts: 0 · Views: 488 · Jul ’24
MusicKit & React Native app: overlay part of a song onto a video?
I have an app on which users learn choreography. To avoid copyright infringement, we currently have only audio instructions and no music in the app. Could we let users who are subscribed to Apple Music listen to the part of a song that corresponds to the choreography? Usually these sections are 60 seconds long. The app is in React Native. Would it be possible to implement this so that opening a dance video automatically triggers playback of that song from, e.g., second 32 to 95? Since the video loops, could playback then start again from second 32? I'm also looking for devs with experience integrating MusicKit for this use case, if it turns out to be possible.
Replies: 0 · Boosts: 0 · Views: 239 · Sep ’24
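On feasibility of the native side (the React Native bridge is out of scope), a hedged MusicKit sketch that plays the 32–95 s window from the example and loops it; `playWindow` is an illustrative name, and the polling loop is only there to keep the sketch short:

```swift
import Foundation
import MusicKit

// Play a fixed window of a song for a subscribed user and jump back to
// the window start when it ends.
func playWindow(of song: Song, from start: TimeInterval = 32, to end: TimeInterval = 95) async throws {
    let player = ApplicationMusicPlayer.shared
    player.queue = [song]
    try await player.prepareToPlay()
    player.playbackTime = start
    try await player.play()

    // Polling keeps the sketch short; production code would observe
    // playback state or use a timer instead.
    while true {
        try await Task.sleep(nanoseconds: 500_000_000)
        if player.playbackTime >= end { player.playbackTime = start }
    }
}
```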
ExtAudioFileRead throwing AVAudioSessionErrorCodeResourceNotAvailable error on iOS and iPadOS 18
Calls to ExtAudioFileRead are throwing OSStatus 561145203 (AVAudioSessionErrorCodeResourceNotAvailable) on iOS and iPadOS 18 -- earlier versions of iOS have not exhibited this behavior. This is a longstanding code path that has seen a spike of these error codes since iOS 18's release. The following is also printed to the Xcode 16 console:
Replies: 2 · Boosts: 1 · Views: 375 · Sep ’24
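An editor's helper for decoding OSStatus values like this one: 561145203 is 0x21726573, the four-character code "!res", which is how AVAudioSession.ErrorCode.resourceNotAvailable is spelled as an OSStatus.

```swift
import Foundation

// Render an OSStatus as its four-character code where possible.
func fourCharCode(_ status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let bytes = [UInt8((n >> 24) & 0xFF), UInt8((n >> 16) & 0xFF),
                 UInt8((n >> 8) & 0xFF), UInt8(n & 0xFF)]
    return String(bytes: bytes, encoding: .ascii) ?? "\(status)"
}
```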
Audio / Video sync issue on iOS using AVSampleBufferRenderSynchronizer
My current app implements a custom video player based on an AVSampleBufferRenderSynchronizer synchronizing two renderers: an AVSampleBufferDisplayLayer receiving decoded CVPixelBuffer-based video CMSampleBuffers, and an AVSampleBufferAudioRenderer receiving decoded LPCM-based audio CMSampleBuffers. The AVSampleBufferRenderSynchronizer is started when the first image (in presentation order) is decoded and enqueued, using avSynchronizer.setRate(_ rate: Float, time: CMTime), with rate = 1 and time set to the presentation timestamp of the first decoded image. Presentation timestamps of video and audio sample buffers are consistent, and on most streams the audio and video are correctly synchronized. However, on some network streams on iOS, the audio and video aren't synchronized, with a time difference that seems to increase over time. On the other hand, with the same player code and network streams on macOS, the synchronization always works fine. This reminds me of something I've read about cases where an AVSampleBufferRenderSynchronizer could not synchronize audio and video, causing them to run on independent and potentially drifting clocks, but I cannot find it again. So, any help / hints on this sync problem will be greatly appreciated! :)
Replies: 1 · Boosts: 0 · Views: 841 · May ’23
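For readers unfamiliar with the setup being described, a minimal sketch of it (the timestamp is a placeholder; real code would use the first decoded frame's PTS):

```swift
import AVFoundation

// Both renderers attached to one synchronizer, started at the first
// frame's presentation timestamp.
let synchronizer = AVSampleBufferRenderSynchronizer()
let videoLayer = AVSampleBufferDisplayLayer()
let audioRenderer = AVSampleBufferAudioRenderer()

synchronizer.addRenderer(videoLayer)
synchronizer.addRenderer(audioRenderer)

// ... enqueue decoded CMSampleBuffers on videoLayer and audioRenderer ...

let firstPTS = CMTime(value: 0, timescale: 600)  // placeholder for the first frame's PTS
synchronizer.setRate(1.0, time: firstPTS)
```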