Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation

Post · Replies · Boosts · Views · Activity

AVPlayer with multiple audio tracks plays audio differently at start
Hi, I'm trying to play multiple video/audio files with AVPlayer using AVMutableComposition. Each file can play simultaneously, so I put each one in its own track. I use only local files.

```swift
let second = CMTime(seconds: 1, preferredTimescale: 1000)
let duration = CMTimeRange(start: .zero, duration: second)
var currentTime = CMTime.zero
for _ in 0...4 {
    let mutableTrack = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid
    )
    try mutableTrack?.insertTimeRange(
        duration,
        of: audioAssetTrack,
        at: currentTime
    )
    currentTime = currentTime + second
}
```

When I set up many audio tracks (more than about 5), the first part sounds slightly different from the original when playback starts; it seems as if the front of the audio is skipped. But when I set up only two tracks, AVPlayer plays the same as the original file.

```swift
avPlayer.play()
```

How can I fix this? Why do audio tracks that have nothing to play at the start affect playback? Please let me know.
Replies: 1 · Boosts: 2 · Views: 1.1k · Dec ’23
AudioComponentInstanceNew crash on iOS 17 with Address Sanitizer enabled
I tried the same code on iOS 17 and iOS 16 with Address Sanitizer enabled; iOS 17 crashes. Why? Can anyone help me?

```c
AudioComponentDescription compDesc = {0};
compDesc.componentType = kAudioUnitType_Output;
compDesc.componentSubType = kAudioUnitSubType_RemoteIO;
compDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
compDesc.componentFlags = 0;
compDesc.componentFlagsMask = 0;

AudioComponent comp = AudioComponentFindNext(NULL, &compDesc);
if (comp == NULL) {
    assert(false);
}

AudioUnit tempAudioUnit;
OSStatus osResult = AudioComponentInstanceNew(comp, &tempAudioUnit);
if (osResult != noErr) {
    assert(false);
}
```
Replies: 2 · Boosts: 0 · Views: 843 · Dec ’23
I can't add more than about 2,000 CMSampleBuffers with AVAssetWriter
Hello, I am deaf-blind and I program with a braille display. Currently I am having a difficulty with one of my apps. I'm converting AVAudioPCMBuffer to CMSampleBuffer, and so far so good. I want to append many CMSampleBuffers to a video written with AVAssetWriter: I'm creating a video from photos held in an array, with audio taken from the CMSampleBuffers. The problem is that I can only append roughly 2,000 CMSampleBuffers; after that, nothing more is added. I do not know what else to do. Help me. Below is a small excerpt of the code:

```swift
let queue = DispatchQueue(label: "AssetWriterQueue")
let audioProvider = SampleProvider(buffers: audioBuffers)
let videoProvider = SampleProvider(buffers: videoBuffers)
let audioInput = createAudioInput(audioBuffers: audioBuffers)
let videoInput = createVideoInput(videoBuffers: videoBuffers)
let adaptor = createPixelBufferAdaptor(videoInput: videoInput)
let assetWriter = try AVAssetWriter(outputURL: url, fileType: .mp4)
assetWriter.add(videoInput)
assetWriter.add(audioInput)
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: .zero)
await withCheckedContinuation { continuation in
    videoInput.requestMediaDataWhenReady(on: queue) {
        let time = videoProvider.getPresentationTime()
        if let buffer = videoProvider.getNextBuffer() {
            adaptor.append(buffer, withPresentationTime: time)
        } else {
            videoInput.markAsFinished()
            continuation.resume()
        }
    }
}
await withCheckedContinuation { continuation in
    audioInput.requestMediaDataWhenReady(on: queue) {
        if let buffer = audioProvider.getNextBuffer() {
            audioInput.append(buffer)
        } else {
            audioInput.markAsFinished()
            continuation.resume()
        }
    }
}
```
Replies: 0 · Boosts: 0 · Views: 442 · Dec ’23
Display jitter with "ProMotion"
Our DJ application Mixxx renders scrolling waveforms at 60 Hz. This looks perfectly smooth on an older 2015 MacBook Pro, but it looks jittery on a new M1 device with "ProMotion" enabled; selecting a fixed 60 Hz refresh rate fixes the issue. We are looking for a way to tell macOS that it can expect 60 Hz renderings from Mixxx and must not display them early (at 120 Hz) even if the pictures are ready. The alternative would be to read out the display settings and ask the user to select 60 Hz. Is there an API to:

- hint the display driver that we render at 60 Hz?
- read out the refresh-rate settings?
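Not an official answer, but one avenue worth testing: on recent systems CADisplayLink exposes `preferredFrameRateRange`, which is exactly this kind of fixed-cadence hint, and `NSScreen.maximumFramesPerSecond` reads out the panel's nominal rate. A sketch assuming macOS 14's `NSView.displayLink(target:selector:)`; `WaveformView` and `step(_:)` are illustrative names:

```swift
import AppKit
import QuartzCore

final class WaveformView: NSView {
    private var displayLink: CADisplayLink?

    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        guard window != nil else { return }

        // Hint the system that this view renders at a fixed 60 Hz,
        // even on a 120 Hz ProMotion panel.
        let link = self.displayLink(target: self, selector: #selector(step(_:)))
        link.preferredFrameRateRange = CAFrameRateRange(minimum: 60,
                                                        maximum: 60,
                                                        preferred: 60)
        link.add(to: .main, forMode: .common)
        displayLink = link

        // Reading out the display's nominal maximum refresh rate.
        if let screen = window?.screen {
            print("Display maximum: \(screen.maximumFramesPerSecond) fps")
        }
    }

    @objc private func step(_ link: CADisplayLink) {
        needsDisplay = true  // schedule the next waveform frame
    }
}
```

On older macOS versions without a per-view CADisplayLink, reading the refresh rate and asking the user to switch remains the fallback the post describes.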
Replies: 0 · Boosts: 1 · Views: 679 · Dec ’23
App crashes: CA::Render::InterpolatedFunction::encode(CA::Render::Encoder*)
I've started seeing several users hit an app crash whose root cause I've been unable to find so far. I've tried running the app in a release build with Address Sanitizer and zombie-object checks enabled, but have been unable to reproduce it. It only occurs for iOS 17 users. Any ideas on how I can troubleshoot this?

```
Crashed: com.apple.main-thread
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000

0  libsystem_platform.dylib 0xed4     _platform_memmove + 52
1  QuartzCore   0x137120 CA::Render::InterpolatedFunction::encode(CA::Render::Encoder*) const + 248
2  QuartzCore   0x136f40 CA::Render::GradientLayer::encode(CA::Render::Encoder*) const + 44
3  QuartzCore   0x2e384  CA::Render::Layer::encode(CA::Render::Encoder*) const + 284
4  QuartzCore   0x2e224  CA::Render::encode_set_object(CA::Render::Encoder*, unsigned long, unsigned int, CA::Render::Object*, unsigned int) + 196
5  QuartzCore   0x2b654  invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double, double*) + 244
6  QuartzCore   0x2b4fc  CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 368
7  QuartzCore   0x2b488  CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
8  QuartzCore   0x2b4bc  CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 304
9–18 QuartzCore 0x2b488  CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252 (ten identical frames)
19 QuartzCore   0x6fc60  CA::Context::commit_transaction(CA::Transaction*, double, double*) + 11192
20 QuartzCore   0x66574  CA::Transaction::commit() + 648
21 UIKitCore    0x31b5ec __34-[UIApplication _firstCommitBlock]_block_invoke_2 + 36
22 CoreFoundation 0x373a8 __CFRUNLOOP_IS_CALLING_OUT_TO_A_BLOCK__ + 28
23 CoreFoundation 0x35b9c __CFRunLoopDoBlocks + 356
24 CoreFoundation 0x33a9c __CFRunLoopRun + 848
25 CoreFoundation 0x33668 CFRunLoopRunSpecific + 608
26 GraphicsServices 0x35ec GSEventRunModal + 164
27 UIKitCore    0x22c2b4 -[UIApplication _run] + 888
28 UIKitCore    0x22b8f0 UIApplicationMain + 340
29 Coach        0x799d8  main + 14 (main.m:14)
30 ???          0x1abefadcc (Missing)
```
Replies: 11 · Boosts: 8 · Views: 3.1k · Dec ’23
How Can I Access The Secondary MV-HEVC Frame
I’m working with the Spatial Video related APIs in AVFoundation, and while I can create an AVAssetReader that reads an AVAssetTrack that reports a .containsStereoMultiviewVideo media characteristic (on a spatial video recorded by an iPhone 15 Pro), the documentation doesn’t make it clear how I can obtain the secondary video frame from that track. Does anyone know where to look? I've scoured the forums, documentation, and other resources, and I've had no luck. Thanks!
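For anyone else searching: a hedged sketch of one approach, assuming the VideoToolbox MV-HEVC decode key `kVTDecompressionPropertyKey_RequestedMVHEVCVideoLayerIDs` introduced alongside the spatial video APIs. The layer IDs below are hardcoded for illustration only; a real reader should take them from the track's format description rather than assuming `[0, 1]`.

```swift
import AVFoundation
import VideoToolbox

// Sketch: ask the decoder to emit both MV-HEVC layers for a stereo track.
// With more than one layer requested, the decoded sample buffers carry a
// tagged buffer group (one image per eye) instead of a single pixel buffer.
func makeStereoOutput(for track: AVAssetTrack) -> AVAssetReaderTrackOutput {
    let settings: [String: Any] = [
        AVVideoDecompressionPropertiesKey: [
            kVTDecompressionPropertyKey_RequestedMVHEVCVideoLayerIDs: [0, 1] as CFArray
        ]
    ]
    return AVAssetReaderTrackOutput(track: track, outputSettings: settings)
}
```

Attach this output to the AVAssetReader in place of a plain track output and pull frames with `copyNextSampleBuffer()` as usual.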
Replies: 4 · Boosts: 0 · Views: 1.1k · Dec ’23
Avoiding microphone permission popup on macOS Sonoma
I am working on an app that uses Core Audio through the JUCE library. The problem I'm trying to solve: when the app uses a full-duplex audio interface, such as one from the Focusrite Scarlett series, for output, it shows a dialog requesting permission to use the microphone. The root cause is that by default Core Audio opens full-duplex devices for both input and output. On previous macOS versions I was able to work around this by disabling the input stream before starting the IOProc, setting AudioHardwareIOProcStreamUsage to all zeros for input. On macOS Sonoma this still disables input, so the microphone indicator is not shown, but the permission popup appears anyway. What other reasons are there for showing the popup? I have noticed that Chrome and Slack have the same problem, showing the microphone popup when trying to play sounds through the Focusrite, while, for example, Deezer manages without it.
Replies: 2 · Boosts: 0 · Views: 1.1k · Dec ’23
Library to identify musical note
Developing for iPhone/iPad/Mac. I have an idea for a music training app, but I need to know of supporting libraries for recognizing a musical note's fundamental frequency in close to real time (about 100 ms delay). Accuracy should be within a few cents (hundredths of a semitone). A search for "music" turned up the Core MIDI library, which is fine if I want to take input from MIDI, but I want to be open to audio input too. I also found MusicKit, which seems to be an API for digging into Apple Music content. Meta questions:

- Should I be using different search terms?
- Where are libraries listed?
- Who are the names in third-party libraries?
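There is no first-party "note recognition" framework; Accelerate (vDSP) is the usual building block, fed from an AVAudioEngine input tap. As a starting point, here is a minimal, non-real-time autocorrelation pitch estimator in plain Swift (function name and frequency bounds are illustrative); production code would use vDSP FFT-based correlation, windowing, and parabolic interpolation to reach cents-level accuracy:

```swift
import Foundation

/// Estimate the fundamental frequency of a mono buffer by naive autocorrelation.
/// A sketch only: O(n·lags), so too slow for a real-time 100 ms budget as written.
func estimatePitch(samples: [Float], sampleRate: Float) -> Float? {
    let minFreq: Float = 60, maxFreq: Float = 1600   // rough musical range
    let minLag = Int(sampleRate / maxFreq)
    let maxLag = min(Int(sampleRate / minFreq), samples.count - 1)
    guard maxLag > minLag else { return nil }

    var bestLag = 0
    var bestScore: Float = 0
    for lag in minLag...maxLag {
        var score: Float = 0
        for i in 0..<(samples.count - lag) {
            score += samples[i] * samples[i + lag]
        }
        if score > bestScore {
            bestScore = score
            bestLag = lag
        }
    }
    return bestLag > 0 ? sampleRate / Float(bestLag) : nil
}

// A 440 Hz sine at 44.1 kHz should come out near 440 Hz.
let sr: Float = 44_100
let tone = (0..<4096).map { sinf(2 * .pi * 440 * Float($0) / sr) }
if let f = estimatePitch(samples: tone, sampleRate: sr) {
    print(String(format: "%.1f Hz", f))
}
```

The lag resolution of plain autocorrelation is one sample, which is far coarser than a few cents at high pitches; that is why interpolation around the peak lag is essential in practice.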
Replies: 0 · Boosts: 0 · Views: 653 · Dec ’23
AVPlayer can't resume after stalling
I have repeatedly verified that if you limit the connection speed to a host serving a video file (mp4), AVPlayer stalls; but after you restore a fast connection to the host, the player does not resume playback. If you check the status, there are no errors, just an empty buffer:

- AVPlayer.error is nil
- AVPlayerItem.error is nil
- AVPlayerItem.isPlaybackBufferEmpty is true
- AVPlayerItem.isPlaybackLikelyToKeepUp is false

Even if you wait a long time nothing happens, and tapping the play button doesn't help either; the player is frozen forever. Only calling seek or playImmediately unfreezes the player and resumes playback. It doesn't happen every time, maybe one time in four. It seems like an AVPlayer bug. What do you think?
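The workaround the post itself identifies can be automated: watch `isPlaybackLikelyToKeepUp` and kick playback with `playImmediately(atRate:)` when it flips back to true. A sketch using Combine's KVO publisher (the class name is illustrative):

```swift
import AVFoundation
import Combine

final class StallRecovery {
    private var cancellable: AnyCancellable?

    func watch(_ player: AVPlayer) {
        guard let item = player.currentItem else { return }
        // When the item reports it can keep up again, resume explicitly;
        // per the post, a plain play() after a long stall can leave the
        // player frozen while playImmediately(atRate:) unfreezes it.
        cancellable = item.publisher(for: \.isPlaybackLikelyToKeepUp)
            .filter { $0 }
            .sink { _ in
                player.playImmediately(atRate: 1.0)
            }
    }
}
```

This only papers over the stall; it does not explain why AVPlayer fails to resume on its own.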
Replies: 0 · Boosts: 0 · Views: 543 · Dec ’23
ClassInfo Audio Unit Property not being set
I have a music player that is able to save and restore AU parameters using the kAudioUnitProperty_ClassInfo property. For non-Apple AUs, this works fine. But for any of the Apple units, the class info can be set only the first time after the audio graph is built; subsequent sets of the property do not stick, even though the returned OSStatus code is 0. Previously this worked fine, but at some point (I'm not sure when) the Apple-provided AUs changed their behavior and are now causing me problems. Can anyone help shed light on this? Thanks in advance for the help. Jeff Frey
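Not an explanation of the behavior change, but a cross-check worth trying: the modern AUAudioUnit wrapper exposes the same state as `fullState`, which Apple documents as equivalent to kAudioUnitProperty_ClassInfo. Comparing its behavior against the C-level property on the Apple units could narrow down where the set is being dropped. A minimal sketch:

```swift
import AVFoundation

// Save and restore an audio unit's preset state via the modern AUAudioUnit
// API; `fullState` wraps kAudioUnitProperty_ClassInfo under the hood.
func saveState(of unit: AUAudioUnit) -> [String: Any]? {
    unit.fullState
}

func restoreState(_ state: [String: Any], to unit: AUAudioUnit) {
    unit.fullState = state
}
```

After a restore, reading `fullState` back and diffing it against what was written shows directly whether the unit silently discarded the set, which is more informative than the OSStatus alone.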
Replies: 0 · Boosts: 0 · Views: 715 · Jan ’24
Application Crashed: com.apple.main-thread EXC_BAD_ACCESS KVO_IS_RETAINING_ALL_OBSERVERS_OF_THIS_OBJECT_IF_IT_CRASHES_AN_OBSERVER_WAS_OVERRELEASED_OR_SMASHED + 76
Application Crashed: com.apple.main-thread EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x000000000000001e

```
Crashed: com.apple.main-thread
0  libobjc.A.dylib  0x2df58  object_isClass + 16
1  Foundation       0x1c9bc  KVO_IS_RETAINING_ALL_OBSERVERS_OF_THIS_OBJECT_IF_IT_CRASHES_AN_OBSERVER_WAS_OVERRELEASED_OR_SMASHED + 76
2  Foundation       0x1bd60  NSKeyValueWillChangeWithPerThreadPendingNotifications + 300
3  AVFoundation     0x1380   -[AVPlayerAccessibility willChangeValueForKey:] + 72
4  AVFCore          0x13954  -[AVPlayer _noteNewPresentationSizeForPlayerItem:] + 48
5  AVFCore          0x1fbb0  __avplayeritem_fpItemNotificationCallback_block_invoke + 4336
6  libdispatch.dylib 0x26a8  _dispatch_call_block_and_release + 32
7  libdispatch.dylib 0x4300  _dispatch_client_callout + 20
8  libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
9  libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
10 CoreFoundation   0x3701c  CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE + 16
11 CoreFoundation   0x33d28  __CFRunLoopRun + 1996
12 CoreFoundation   0x33478  CFRunLoopRunSpecific + 608
13 GraphicsServices 0x34f8   GSEventRunModal + 164
14 UIKitCore        0x22c62c -[UIApplication _run] + 888
15 UIKitCore        0x22bc68 UIApplicationMain + 340
16 UIKitCore        0x4563d0 __swift_destroy_boxed_opaque_existential_1Tm + 12220
17 AajTak           0x84c4   main + 4333552836 (QuizLeaderboardViewModel.swift:4333552836)
```
Replies: 0 · Boosts: 0 · Views: 596 · Jan ’24
Selecting a specific audio output device
There is a method setPreferredInput in AVAudioSession which can be used to select a different input device. But is there any similar function, like a "setPreferredOutput", so that my app can select a specific audio output device to play audio? I do not want the user to change it through system interfaces (such as Control Center), but via logic inside the app. Thanks!
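For context: iOS has no public setPreferredOutput counterpart. Aside from `overrideOutputAudioPort(.speaker)` for forcing the built-in speaker, the supported path is to embed the system's route picker in your own UI so the user chooses the output without leaving the app. A minimal sketch (the frame and placement are illustrative):

```swift
import AVKit
import UIKit

final class PlayerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // AVRoutePickerView presents the system output picker from within
        // the app's own UI; apps cannot silently force a specific route.
        let picker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
        picker.prioritizesVideoDevices = false  // list audio routes first
        view.addSubview(picker)
    }
}
```

The restriction is deliberate: output routing is treated as a user decision on iOS, which is why only the picker (not a programmatic setter) is public.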
Replies: 0 · Boosts: 0 · Views: 460 · Jan ’24
Play Music While Camera Is Open
I am creating a camera app where I would like music from another app (Apple Music, Spotify, etc.) to continue playing once my app is opened. Currently I am using .mixWithOthers to do this in my viewDidLoad:

```swift
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSession.Category.playback, options: [.mixWithOthers])
    try audioSession.setActive(true)
} catch {
    print("error trying to record and play audio")
}
```

However, I am running into an issue where the music only keeps playing if you resume playback after you start recording a video. Otherwise, when you open the app, the music stops as soon as you see the preview. Interestingly, if you start playing music while recording, the music continues to play in the preview view after you stop. If you close the app (not force-quit) and reopen it, music playback continues as expected; after a force-quit, it returns to the original behavior. I've tried to research this and have not been able to find anything. Any help is appreciated. Let me know if more details are needed.
Replies: 1 · Boosts: 0 · Views: 882 · Jan ’24
Logic Pro does not update AUAudioUnit's contextName for MIDI FXs
This can be reproduced easily with Xcode's generated AUv3-Extension projects. For MIDI Processor type AUv3 extensions, the contextName property is only set once, during initialization, when the unit is added as a MIDI FX in Logic Pro, but not after changing the track's name manually. For Music Effect type AUv3 extensions, contextName is set initially when added as an Audio FX in Logic Pro and is also updated as expected after changing the track's name manually. Am I missing something, or is this a Logic Pro bug? Thanks, Tobias
Replies: 0 · Boosts: 1 · Views: 602 · Jan ’24
Audio Drops Out When Setting Category to .playAndRecord
I am creating an app where you can record a video and listen to music in the background. At the top of my viewDidLoad I set the AVAudioSession category to .playAndRecord:

```swift
let audioSession = AVAudioSession.sharedInstance()
AVCaptureSession().automaticallyConfiguresApplicationAudioSession = false
do {
    try audioSession.setCategory(AVAudioSession.Category.playAndRecord, options: [.mixWithOthers, .allowAirPlay, .allowBluetoothA2DP])
    try audioSession.setActive(true)
} catch {
    print("error trying to record and play audio")
}
```

However, when I do this the audio cuts out for a second or less at app open and app close. I would like the audio to continue playing without the cutout. Is there anything I can do to ensure the audio continues to play?
Replies: 0 · Boosts: 0 · Views: 448 · Jan ’24
Apple Music preview question
I have a question about the Apple Music Preview app for Windows 11. It has a setting called Sound Check. Is that feature available in the Apple Music web player and the Apple Music Android app? If not, is it a planned feature for those?
Replies: 0 · Boosts: 0 · Views: 464 · Jan ’24