Integrate music and other audio content into your apps.

Posts under Audio tag

80 Posts
Post not yet marked as solved
2 Replies
790 Views
Hello, I've set up stereo audio recording (both audio and video are recorded), but the audio quality seems lower than what the native Camera app produces when configured for stereo. Checking the log in Console, I found a difference between the Camera app and mine in the MXSessionMode reported by mediaserverd: the Camera app runs with MXSessionMode = SpatialRecording, while mine runs with MXSessionMode = VideoRecording. How can I configure my capture session so that it ends up with MXSessionMode = SpatialRecording? Any suggestion? Best regards
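Not an answer, but for comparison: Apple's stereo-capture sample enables stereo on the built-in microphones through the AVAudioSession data source's polar pattern. A minimal sketch of that setup follows, in case part of it is missing from your session configuration; whether this alone is what flips mediaserverd to SpatialRecording is an assumption on my part:

import AVFAudio

// Sketch: select the back data source of the built-in mic and request the
// stereo polar pattern (iOS 14+). Error handling kept minimal for brevity.
func enableStereoCapture() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .videoRecording)

    guard let mic = session.availableInputs?.first(where: { $0.portType == .builtInMic }),
          let back = mic.dataSources?.first(where: { $0.orientation == .back }),
          back.supportedPolarPatterns?.contains(.stereo) == true else {
        return // stereo capture not available on this device
    }
    try back.setPreferredPolarPattern(.stereo)
    try mic.setPreferredDataSource(back)
    try session.setPreferredInputOrientation(.portrait)
}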
Posted
by ftristani.
Last updated
.
Post not yet marked as solved
9 Replies
3.6k Views
I am getting an error in iOS 16 that doesn't appear in previous iOS versions. I am using RemoteIO to play back live audio at 4000 Hz. The error is the following:

Input data proc returned inconsistent 2 packets for 186 bytes; at 2 bytes per packet, that is actually 93 packets

This is how the audio format and the callback are set:

// Set the audio format
AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate = 4000;
audioFormat.mFormatID = kAudioFormatLinearPCM;
audioFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel = 16;
audioFormat.mBytesPerPacket = 2;
audioFormat.mBytesPerFrame = 2;

AURenderCallbackStruct callbackStruct;
// Set output callback
callbackStruct.inputProc = playbackCallback;
callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);
status = AudioUnitSetProperty(audioUnit,
                              kAudioUnitProperty_SetRenderCallback,
                              kAudioUnitScope_Global,
                              kOutputBus,
                              &callbackStruct,
                              sizeof(callbackStruct));

Note that the mSampleRate I set is 4000 Hz. In iOS 15 I get a buffer duration (IOBufferDuration) of 0.02322 seconds and 93 frames in each callback. This is expected, because:

number of frames = sampling rate * buffer duration
93 ≈ 4000 * 0.02322

However, in iOS 16 I get the aforementioned error in the callback:

Input data proc returned inconsistent 2 packets for 186 bytes; at 2 bytes per packet, that is actually 93 packets

Since the number of frames equals the number of packets, I am getting 1 or 2 frames per callback while the buffer duration is still 0.02322 seconds. This didn't affect the playback of the "raw" signal, but it did affect the playback of the "processed" signal. With 2 frames per callback the implied rate would be:

2 / 0.02322 ≈ 86 Hz

which makes no sense for a 4000 Hz stream. This error appears for other sampling rates too (8000, 16000, 32000), but not for 44100. However, I would like to keep 4000 as my sampling rate. I have also tried to set the sampling rate using the setPreferredSampleRate(_:) function of AVAudioSession, but the attempt didn't succeed: the sampling rate was still 44100 after calling that function. Any help on this issue would be appreciated.
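In case it helps others reproduce: the usual way to request a hardware sample rate and a matching buffer duration is through AVAudioSession. A minimal sketch, with the caveat that these are requests the system may ignore (which would match the 44100 observed after setPreferredSampleRate(_:)):

import AVFAudio

// Sketch: request a 4000 Hz hardware rate and a ~23 ms I/O buffer, then
// print what the system actually granted. These are requests, not guarantees.
func requestLowSampleRate() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback)
    try session.setPreferredSampleRate(4000)
    try session.setPreferredIOBufferDuration(0.02322)
    try session.setActive(true)
    print("granted sample rate: \(session.sampleRate)")        // may still be 44100
    print("granted buffer duration: \(session.ioBufferDuration)")
}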
Posted
by ndrg.
Last updated
.
Post marked as solved
1 Reply
788 Views
Hello everyone, I'm using Flutter and the just_audio package. When a user receives a push notification, the app plays audio in the background. I've tested this functionality on an iPhone 6s and an iPhone 13. It works correctly on the iPhone 6s: the app plays the sound when a push notification is received. However, on the iPhone 13 the app receives the notification and starts the background process, but fails to play the sound with these errors:

mediaserverd(MediaExperience)[17680] <Notice>: -CMSUtilities- CMSUtility_IsAllowedToStartPlaying: Client sid:0x45107e5, Runner(28933), 'prim' with category MediaPlayback and mode Default and mixable does not have assertions to start mixable playback
mediaserverd(MediaExperience)[17680] <Notice>: -CMSessionMgr- MXCoreSessionBeginInterruption_WithSecTaskAndFlags: CMSessionBeginInterruption failed as client 'sid:0x45107e5, Runner(28933), 'prim'' has insufficient privileges to take control
mediaserverd(AudioSessionServer)[17680] <Error>: AudioSessionServerImp.mm:405 { "action":"cm_session_begin_interruption", "error":"translating CM session error", "session":{"ID":"0x45107e5","name":"Runner(28933)"}, "details":{"calling_line":879,"error_code":-16980,"error_string":"Operation denied. Cannot start playing"} }

My reading of these errors is that the newer iPhones require additional permissions. Does anyone have any idea how I can fix this?
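For reference, the native checklist for starting playback from the background is the audio entry in UIBackgroundModes plus an active playback session; the log's complaint about "mixable playback" may point at a .mixWithOthers-style option being set somewhere, though that is speculation. A minimal sketch of the native side, assuming just_audio drives the shared AVAudioSession underneath:

import AVFAudio

// Sketch: the session configuration iOS expects before starting playback
// from the background. UIBackgroundModes in Info.plist must contain "audio".
func activatePlaybackSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default)
    try session.setActive(true)
}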
Posted
by gotiobg.
Last updated
.
Post not yet marked as solved
2 Replies
1.2k Views
We noticed that iOS 16 no longer seems to support these commands:

MPRemoteCommandCenter.shared().likeCommand
MPRemoteCommandCenter.shared().dislikeCommand
MPRemoteCommandCenter.shared().bookmarkCommand

Is there another way to show a menu in place of the previous button on the lock screen?
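For context, this is roughly how those feedback commands were wired up before iOS 16; a minimal sketch (the handler body is a placeholder):

import MediaPlayer

// Sketch: registering one of the feedback commands that no longer
// surfaces on the iOS 16 lock screen.
func registerLikeCommand() {
    let center = MPRemoteCommandCenter.shared()
    center.likeCommand.isActive = true
    center.likeCommand.localizedTitle = "Like"
    _ = center.likeCommand.addTarget { _ in
        // handle the "like" action here
        return .success
    }
}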
Posted
by chival.
Last updated
.
Post not yet marked as solved
1 Reply
434 Views
Translated Report (Full Report Below)

Process: Logic Pro X [1524]
Path: /Applications/Logic Pro X.app/Contents/MacOS/Logic Pro X
Identifier: com.apple.logic10
Version: 10.7.7 (5762)
Build Info: MALogic-5762000000000000~2 (1A85)
App Item ID: 634148309
App External ID: 854029738
Code Type: ARM-64 (Native)
Parent Process: launchd [1]
User ID: 502

Date/Time: 2023-10-10 12:52:02.8675 +0100
OS Version: macOS 13.0 (22A380)
Report Version: 12
Anonymous UUID: D3A4AE8C-2CA2-CC80-A569-39459CA10192

Time Awake Since Boot: 4400 seconds

System Integrity Protection: enabled

Crashed Thread: 0  Dispatch queue: com.apple.main-thread

Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000010
Exception Codes: 0x0000000000000001, 0x0000000000000010

Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [1524]

VM Region Info: 0x10 is not in any region. Bytes before following region: 105553518919664
REGION TYPE                 START - END          [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
---> MALLOC_NANO (reserved) 600018000000-600020000000 [128.0M] rw-/rwx SM=NUL ...(unallocated)

Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0  Logic Pro X     0x1045c2794 0x10412c000 + 4810644
1  Logic Pro X     0x1045c273c 0x10412c000 + 4810556
2  Logic Pro X     0x1045c37c8 0x10412c000 + 4814792
3  Logic Pro X     0x10492bc40 0x10412c000 + 8387648
4  Logic Pro X     0x10492bd30 0x10412c000 + 8387888
5  Logic Pro X     0x1045c23d0 0x10412c000 + 4809680
6  Logic Pro X     0x1049838ac 0x10412c000 + 8747180
7  Logic Pro X     0x10498340c 0x10412c000 + 8745996
8  Logic Pro X     0x1044feb18 0x10412c000 + 4008728
9  CoreFoundation  0x191222fd0 __NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK__ + 24
10 CoreFoundation  0x19125f4b4 -[__NSDictionaryM enumerateKeysAndObjectsWithOptions:usingBlock:] + 212
11 Logic Pro X     0x104ae37b4 0x10412c000 + 10188724
12 Logic Pro X     0x1044fe7ac 0x10412c000 + 4007852
13 Logic Pro X     0x104ae9eb8 0x10412c000 + 10215096
14 Logic Pro X     0x10498987c 0x10412c000 + 8771708
15 Logic Pro X     0x104989434 0x10412c000 + 8770612
16 Logic Pro X     0x10521497c 0x10412c000 + 17729916
17 Foundation      0x19219c67c __NSFireTimer + 104
18 CoreFoundation  0x191277578 __CFRUNLOOP_IS_CALLING_OUT_TO_A_TIMER_CALLBACK_FUNCTION__ + 32
19 CoreFoundation  0x191277220 __CFRunLoopDoTimer + 940
20 CoreFoundation  0x191276d78 __CFRunLoopDoTimers + 356
21 CoreFoundation  0x19125c760 __CFRunLoopRun + 1896
22 CoreFoundation  0x19125b8a4 CFRunLoopRunSpecific + 612
23 HIToolbox       0x19a8cf3bc RunCurrentEventLoopInMode + 292
24 HIToolbox       0x19a8cf200 ReceiveNextEventCommon + 672
25 HIToolbox       0x19a8cef48 _BlockUntilNextEventMatchingListInModeWithFilter + 72
26 AppKit          0x1944b4630 _DPSNextEvent + 632
27 AppKit          0x1944b37c0 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 728
28 Logic Pro X     0x1055198b8 0x10412c000 + 20895928
29 AppKit          0x1944a7bf0 -[NSApplication run] + 464
30 AppKit          0x19447f058 NSApplicationMain + 880
31 Logic Pro X     0x104a6a7a8 0x10412c000 + 9693096
32 dyld            0x190e53e50 start + 2544

Thread 1:: caulk.messenger.shared:17
0  libsystem_kernel.dylib  0x19113ed6c semaphore_wait_trap + 8
1  caulk                   0x19a5f6cfc caulk::mach::semaphore::wait_or_error() + 28
2  caulk                   0x19a5d9634 caulk::concurrent::details::worker_thread::run() + 56
3  caulk                   0x19a5d9278 void* caulk::thread_proxy<std::__1::tuple<caulk::thread::attributes, void (caulk::concurrent::details::worker_thread::*)(), std::__1::tuple<caulk::concurrent::details::worker_thread*> > >(void*) + 96
4  libsystem_pthread.dylib 0x19117e06c _pthread_start + 148
5  libsystem_pthread.dylib 0x191178e2c thread_start + 8

Thread 2:: com.apple.NSEventThread
0  libsystem_kernel.dylib  0x19113edf0 mach_msg2_trap + 8
1  libsystem_kernel.dylib  0x1911508d8 mach_msg2_internal + 80
2  libsystem_kernel.dylib  0x191147638 mach_msg_overwrite + 540
3  libsystem_kernel.dylib  0x19113f16c mach_msg + 24
4  CoreFoundation          0x19125dbdc __CFRunLoopServiceMachPort + 160
5  CoreFoundation          0x19125c4c8 __CFRunLoopRun + 1232
6  CoreFoundation          0x19125b8a4 CFRunLoopRunSpecific + 612
7  AppKit                  0x1945de248 _NSEventThread + 172
8  libsystem_pthread.dylib 0x19117e06c _pthread_start + 148
9  libsystem_pthread.dylib 0x191178e2c thread_start + 8

Thread 3:: MIDIClientNotificationThread
0  libsystem_kernel.dylib  0x19113edf0 mach_msg2_trap + 8
1  libsystem_kernel.dylib  0x1911508d8 mach_msg2_internal + 80
2  libsystem_kernel.dylib  0x191147638 mach_msg_overwrite + 540
3  libsystem_kernel.dylib  0x19113f16c mach_msg + 24
4  CoreFoundation          0x19125dbdc __CFRunLoopServiceMachPort + 160
5  CoreFoundation          0x19125c4c8 __CFRunLoopRun + 1232
6  CoreFoundation          0x19125b8a4 CFRunLoopRunSpecific + 612
7  Foundation              0x192163e58 -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
8  Foundation              0x1921d83b4 -[NSRunLoop(NSRunLoop) runUntilDate:] + 100
9  Logic Pro X             0x1045a8a74 0x10412c000 + 4704884
10 libsystem_pthread.dylib 0x19117e06c _pthread_start + 148
11 libsystem_pthread.dylib 0x191178e2c thread_start + 8

Thread 4:: SeqTimer
Posted Last updated
.
Post not yet marked as solved
0 Replies
537 Views
I'm developing a macOS app and I'm trying to access the microphone without directly triggering the default permission dialog. Instead, I've managed to programmatically open System Settings at the Privacy & Security -> Microphone section, allowing users to grant permission manually. However, there's an issue: even after the user manually toggles on the microphone permission for my app in System Settings, AVCaptureDevice.authorizationStatus(for: .audio) still returns .notDetermined. To clarify, I'm avoiding AVCaptureDevice.requestAccess(for: .audio) because it prompts the default permission dialog. But when I do use it, the app correctly recognizes changes in permission status. The problem arises only when trying to detect permission changes made directly from System Settings. Here is my code:

struct SystemSettingsHandler {
    static func openSystemSetting(for type: String) {
        guard type == "microphone" || type == "screen" else { return }
        let microphoneURL = "x-apple.systempreferences:com.apple.preference.security?Privacy_Microphone"
        let screenURL = "x-apple.systempreferences:com.apple.preference.security?Privacy_ScreenCapture"
        let urlString = type == "microphone" ? microphoneURL : screenURL
        if let url = URL(string: urlString) {
            NSWorkspace.shared.open(url)
        }
    }
}

private func requestMicrophonePermission(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        print("authorized")
        completion(true)
    case .notDetermined:
        print("notDetermined")
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            completion(granted)
        }
    case .denied, .restricted:
        print("denied")
        SystemSettingsHandler.openSystemSetting(for: "microphone")
        completion(false)
    @unknown default:
        print("unknown")
        completion(false)
    }
}

Thank you for reading this post!
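One diagnostic worth trying, sketched below: re-query the authorization status whenever the app becomes active again, i.e. after the user returns from System Settings. This assumes (I haven't verified) that the system has actually recorded a decision for the app; it is also possible that TCC only creates an entry once requestAccess(for:) has been called at least once, which would explain the persistent .notDetermined:

import AVFoundation
import AppKit

// Sketch: re-check microphone authorization whenever the app regains focus,
// e.g. after the user returns from System Settings. Keep the returned token
// to remove the observer later.
func observeActivation() -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: NSApplication.didBecomeActiveNotification,
        object: nil,
        queue: .main
    ) { _ in
        let status = AVCaptureDevice.authorizationStatus(for: .audio)
        print("microphone status after returning: \(status.rawValue)")
    }
}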
Posted Last updated
.
Post not yet marked as solved
1 Reply
510 Views
I've been using AVAssetExportSession to trim audio files for the past two years, and it suddenly stopped working properly. It still works fine when I run my app on a phone running iOS 16, but on my iOS 17 phone it exports an incorrect duration (e.g. I provide a file with a 2-second duration, ask it to trim to 0 - 1.7 s, and it returns the file over-trimmed to about 1.58 s). The AVURLAsset returns the correct duration, and I've already tried the AVURLAssetPreferPreciseDurationAndTimingKey; it doesn't help, as the error happens somewhere during the export.

guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
    completion(false, nil)
    return
}

let startTime = CMTimeMakeWithSeconds(floor(startPoint * 100) / 100.0, preferredTimescale: 44100)
let stopTime = CMTimeMakeWithSeconds(ceil(endPoint * 100) / 100.0, preferredTimescale: 44100)
let exportTimeRange = CMTimeRange(start: startTime, end: stopTime)

exportSession.timeRange = exportTimeRange
exportSession.outputFileType = .m4a
exportSession.outputURL = targetURL
AudioHelper.deleteFile(at: exportSession.outputURL)
exportSession.exportAsynchronously { ... }

I've managed to somewhat mitigate the damage by adding silence to the file and repeatedly trimming it until I get close to my required duration, but it's an extremely ugly hack and it's breaking down the whole functionality of my app.
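As a diagnostic (not a fix), here is a sketch that measures the actual exported duration against the requested range in the completion handler, so the over-trim becomes observable per file; requestedRange and outputURL are placeholders for the values used above:

import AVFoundation

// Sketch: compare the exported file's duration with the requested range.
func verifyTrim(outputURL: URL, requestedRange: CMTimeRange) {
    let exported = AVURLAsset(url: outputURL,
                              options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
    let actual = CMTimeGetSeconds(exported.duration)
    let expected = CMTimeGetSeconds(requestedRange.duration)
    print(String(format: "expected %.3f s, got %.3f s (delta %+.3f s)",
                 expected, actual, actual - expected))
}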
Posted Last updated
.
Post not yet marked as solved
6 Replies
2k Views
Turn on Address Sanitizer in Xcode, use a real device, and put a Test.mp3 file in the Xcode project. It will then crash when you initialise an AVAudioPlayer with the mp3 file (with a wav file it works fine). I have filed this in Feedback Assistant: FB12425453.

var player: AVAudioPlayer?

func playSound() {
    if let url = Bundle.main.url(forResource: "Test", withExtension: "mp3") {
        self.player = try? AVAudioPlayer(contentsOf: url)
        // --> deallocation of non-allocated memory problem --> with a "wav" file it works
    }
}
Posted Last updated
.
Post not yet marked as solved
0 Replies
540 Views
How can I record audio when the app is in the background? I tried this on Android and it's working, but on iOS it only records in the foreground. It's an Expo app and I'm using the Expo AV library. I have already granted the permissions and set UIBackgroundModes with audio:

"infoPlist": {
    ...
    "UIBackgroundModes": [
        "audio"
    ]
}

And in the code as well:

await Audio.setAudioModeAsync({
    allowsRecordingIOS: true,
    playsInSilentModeIOS: true,
    staysActiveInBackground: true
});

But once the app is in background mode it fails to start recording. Can anyone help me fix this?
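For reference, the native session configuration those Expo options map to looks roughly like the sketch below (assuming Expo AV configures the shared AVAudioSession under the hood). One hedged note: the audio background mode generally keeps an already-running recording alive when the app is backgrounded, but starting a brand-new recording from the background is far more restricted, so the recording may need to begin while the app is still in the foreground:

import AVFAudio

// Sketch of the native equivalent of Audio.setAudioModeAsync above.
func configureRecordingSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.defaultToSpeaker])
    try session.setActive(true)
}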
Posted
by Santinal.
Last updated
.
Post not yet marked as solved
0 Replies
505 Views
I am writing a watchOS app where I have some audio files that I want to play at various points. I am using AVAudioPlayer. It all works in the simulator, and it also works if I have AirPods connected to my watch via Bluetooth. However, I get no sound if there isn't a paired set of earphones. In the case of no earphones, I would like the sounds to play from the physical watch speaker. I can't seem to find any documentation on how to make that happen. Any hints or tips are appreciated.
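A minimal sketch of the part that usually matters here, assuming the sounds count as short-form audio: on watchOS the session's route-sharing policy decides the route, and .longFormAudio forces a Bluetooth route, while the default policy allows the built-in speaker (watchOS 5+):

import AVFAudio

// Sketch: configure the session with the default route-sharing policy so
// playback can fall back to the built-in watch speaker.
func playThroughSpeaker(player: AVAudioPlayer) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default, policy: .default)
    try session.setActive(true)
    player.play()
}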
Posted
by rsaccone.
Last updated
.
Post not yet marked as solved
1 Reply
549 Views
Hello, I have been struggling to resolve the issue in the question above. I can speak an utterance while my iPhone's screen is on and the app is in the foreground, but when the app goes to the background or the screen locks, it doesn't speak any more. I think it should be possible to play audio or speak an utterance in the background, since YouTube can keep playing music in the background. Any help please?
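For reference, the standard recipe for keeping speech alive in the background, sketched under the assumption that the speech comes from AVSpeechSynthesizer: add the audio entry to UIBackgroundModes in Info.plist and activate a playback session before speaking:

import AVFoundation

// Sketch: speech that keeps playing with the screen locked. Requires the
// "audio" entry in UIBackgroundModes.
let synthesizer = AVSpeechSynthesizer()

func speakInBackground(_ text: String) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .spokenAudio)
    try session.setActive(true)
    synthesizer.speak(AVSpeechUtterance(string: text))
}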
Posted
by godtaehee.
Last updated
.
Post not yet marked as solved
1 Reply
731 Views
I'm developing a webview app using JavaScript with Ionic. When I try to add a worklet module, it looks fine; but after adding the module, when I try to connect the AudioWorkletProcessor to the AudioContext, iOS gives me errors like this:

2023-07-24 11:35:57.436444+0900 CHeKT[27066:10627891] [assertion] Error acquiring assertion: <Error Domain=RBSServiceErrorDomain Code=1 "(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)" UserInfo={NSLocalizedFailureReason=(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)}>
2023-07-24 11:35:57.436491+0900 CHeKT[27066:10627891] [ProcessSuspension] 0x1060089f0 - ProcessAssertion::acquireSync Failed to acquire RBS assertion 'WebKit Media Playback' for process with PID=27071, error: Error Domain=RBSServiceErrorDomain Code=1 "(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)" UserInfo={NSLocalizedFailureReason=(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)}

The same [assertion]/[ProcessSuspension] error pair repeats for PIDs 27066 and 27072.

Despite the errors, the AudioWorkletProcessor does work; but when I try to access the microphone with the getUserMedia() method, the AudioWorkletProcessor's sound breaks up like a robot voice. Does AudioWorklet not work on iOS? I need to build two-way audio using AudioWorklet, but it doesn't seem possible on iOS (on Android it works well). Please let me know if you have any feedback or solutions. Thanks. Bobby.
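In case it's useful as a starting point, here is a sketch of the WKWebView configuration commonly needed before getUserMedia() and web audio behave in a hybrid app. Whether it has any effect on the RunningBoard assertion noise is an assumption on my part; those messages may be harmless:

import WebKit

// Sketch: allow inline media and autoplay inside the webview so web audio
// can start without a user gesture.
func makeWebView() -> WKWebView {
    let config = WKWebViewConfiguration()
    config.allowsInlineMediaPlayback = true
    config.mediaTypesRequiringUserActionForPlayback = []
    return WKWebView(frame: .zero, configuration: config)
}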
Posted
by bobbykim.
Last updated
.
Post not yet marked as solved
1 Reply
709 Views
Problem description: this HLS video, https://lf3-vod-cdn-tos.douyinstatic.com/obj/vodsass/hls/main.m3u8, produces noise starting at 22 seconds when played directly in Safari on macOS 12.6.6, and the same issue appears in Safari on iOS 16.5.1. There is no noise, however, when the stream is played via MSE on the Mac in Safari with a third-party open-source web player such as hls.js. Test tool: the hls.js demo at https://hlsjs.video-dev.org/demo/
Posted Last updated
.
Post not yet marked as solved
0 Replies
449 Views
Could you provide guidance on how to add chapter marks to an M4A? I've been attempting this without success. From what I've read, it requires AVMetadataKey.quickTimeUserDataKeyChapter, track.addTrackAssociation(to: ... type: .chapterList), or both. I've also looked into AVTimedMetadataGroup, but I haven't found a way to get it added based on the documentation. And I haven't found anyone who has added chapter marks in native Swift; they've always given in and used ffmpeg or some other external solution. In the code below, inputURL is the file being read in, outputURL is the final file, and chapters is an array of tuples where time is the start of each chapter and title is its name in the list. The target is macOS.

import AVFoundation

class AudioChapterCreator {
    // Function to create an audio file with chapters and a chapter list
    func createAudioFileWithChapters(inputURL: URL, outputURL: URL, chapters: [(time: CMTime, title: String)]) {
        let options = [AVURLAssetPreferPreciseDurationAndTimingKey: true]
        let asset = AVURLAsset(url: inputURL, options: options)
        let durationInSeconds = CMTimeGetSeconds(asset.duration)
        print("asset durationInSeconds: \(durationInSeconds)")

        guard let audioTrack = asset.tracks(withMediaType: .audio).first else {
            print("Error: Unable to find audio track in the asset.")
            return
        }

        // Create metadata items for chapters
        let chapterMetadataItems = chapters.map { chapter -> AVMetadataItem in
            let item = AVMutableMetadataItem()
            // this duration is just for testing
            let tempDur = CMTime(seconds: 100, preferredTimescale: 1)
            item.keySpace = AVMetadataKeySpace.quickTimeUserData
            item.key = AVMetadataKey.quickTimeUserDataKeyChapter as NSString
            item.value = chapter.title as NSString
            item.time = chapter.time
            item.duration = tempDur
            return item
        }

        // Create an AVAssetExportSession for writing the output file
        guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
            print("Error: Unable to create AVAssetExportSession.")
            return
        }

        // Configure the AVAssetExportSession
        exportSession.outputFileType = .m4a
        exportSession.outputURL = outputURL
        exportSession.metadata = asset.metadata + chapterMetadataItems
        exportSession.timeRange = CMTimeRangeMake(start: CMTime.zero, duration: asset.duration)

        // Export the audio file
        exportSession.exportAsynchronously {
            switch exportSession.status {
            case .completed:
                print("Audio file with chapters and chapter list created successfully.")
            case .failed:
                print("Error: Failed to create the audio file.")
            case .cancelled:
                print("Export cancelled.")
            default:
                print("Export failed with unknown status.")
            }
        }
    }
}
Posted
by MisterE.
Last updated
.
Post not yet marked as solved
1 Reply
542 Views
Whenever I try to build a playlist with a lot of songs all at once, I get to a point where it almost freezes. Normally, after I tap "add to playlist" for each song, a notice appears saying "song added"; but it stops doing that, and the song won't show up in the playlist. Then maybe 2 or 3 minutes later the notice appears that the song has been added. Is anyone else dealing with this? It's frustrating when I'm trying to build big playlists and have to come back the next day to add the rest.
Posted Last updated
.
Post not yet marked as solved
0 Replies
834 Views
Hello developers, we have an issue with opening an Apple MPEG-4 audio file that apparently has a correct header but no actual audio data. The file is 594 bytes; it completely freezes the app's main thread and never returns from any of these calls:

NSURL *fileURL = [NSURL fileURLWithPath:filePath];
NSError *error;

AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error]; // freezes (call stack below)

AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:fileURL error:&error]; // freezes

AudioFileID audioFileID;
OSStatus result = AudioFileOpenURL((__bridge CFURLRef)fileURL, kAudioFileReadPermission, 0, &audioFileID); // freezes

Pausing the debugger reveals where it is stuck:

#0 0x00007ff81b7683f9 in MP4BoxParser_Track::GetSampleTableBox() ()
#1 0x00007ff81b76785a in MP4BoxParser_Track::GetInfoFromTrackSubBoxes() ()
#2 0x00007ff81b93fde5 in MP4AudioFile::UseAudioTrack(void*, unsigned int, unsigned int) ()
#3 0x00007ff81b93ab2c in MP4AudioFile::OpenFromDataSource() ()
#4 0x00007ff81b72ee85 in AudioFileObject::Open(__CFURL const*, signed char, int) ()
#5 0x00007ff81b72ed9d in AudioFileObject::DoOpen(__CFURL const*, signed char, int) ()
#6 0x00007ff81b72e1f0 in AudioFileOpenURL ()
#7 0x00007ffa382e8183 in -[AVAudioPlayer initWithContentsOfURL:fileTypeHint:error:] ()

With each of the three calls the call stack is a little different, but all of them end up stuck forever in MP4BoxParser_Track::GetSampleTableBox(). I'm attaching the incriminated audio file to the post (just rename it back to .m4a): Audio_21072023_10462282.crash

How can we avoid this and verify that an audio file is openable and playable? Previously, we checked whether a file we believe to be audio contains data; if so, we create an AVAudioPlayer with it and see whether it returns no errors and the duration is > 0. This bug breaks that fundamental logic, so we've added a hotfix hack that checks the data is at least 600 bytes long. How do we solve this correctly if none of the methods above return any error but instead ALL hang?
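Until the parser bug is fixed, one defensive pattern (a sketch, with the caveat that a permanently hung worker thread is simply abandoned) is to probe the file off the main thread and give up after a timeout:

import AVFoundation

// Sketch: probe a suspect file on a background queue and wait at most two
// seconds. On timeout the file is treated as unplayable; the stuck worker
// thread is leaked, which at least keeps the main thread responsive.
func probeAudioFile(at url: URL, completion: @escaping (Bool) -> Void) {
    let done = DispatchSemaphore(value: 0)
    var playable = false
    DispatchQueue.global(qos: .utility).async {
        if let player = try? AVAudioPlayer(contentsOf: url), player.duration > 0 {
            playable = true
        }
        done.signal()
    }
    DispatchQueue.global(qos: .utility).async {
        let result = done.wait(timeout: .now() + 2)
        DispatchQueue.main.async {
            // && short-circuits, so `playable` is only read after a clean finish
            completion(result == .success && playable)
        }
    }
}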
Posted
by Shebuka.
Last updated
.