Integrate music and other audio content into your apps.

Posts under Audio tag

83 Posts
Post not yet marked as solved
14 Replies
10k Views
I am experiencing an issue where my Mac's speakers crackle and pop when running an app in the Simulator, or even when previewing SwiftUI with Live Preview. I am using a 16" MacBook Pro (i9) running Xcode 12.2 on Big Sur (11.0.1). Killing coreaudiod (sudo killall coreaudiod) temporarily fixes the problem, but that is not much of a solution. Is anyone else having this problem?
Post not yet marked as solved
1 Reply
2.4k Views
Does anyone have a working example of how to play OGG files with Swift? I've been trying for over a year now. I was able to wrap the C Vorbis library in Swift and used it to parse an OGG file successfully. I then had to use Objective-C++ to fill the PCM buffers, because that part seems to be available only in C++, and it hangs my app for a good 40 seconds to several minutes depending on the audio file; playback then runs for about 2 seconds before crashing. I can't get the examples on the Vorbis site to work in Objective-C, and I have tried every example on GitHub I could find (most of which are for iOS; I want to play the files on Mac).

I also tried the Cricket Audio framework: https://github.com/sjmerel/ck It has a Swift example and plays its proprietary soundbank format, but although it is also supposed to play OGG, it simply does nothing when asked to, as you can see in the issue I posted: https://github.com/sjmerel/ck/issues/3

Right now I believe every player that can play OGG on Mac is written in Objective-C or C++. Any help or advice is appreciated; the OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGG through the Mono framework, but I really, really want to stay in Swift.
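For reference, here is a minimal sketch of the decode loop I am aiming for, calling libvorbisfile directly from Swift. It assumes the C library is exposed through a module map (CVorbis is a hypothetical module name) and decodes to signed 16-bit little-endian PCM:

    import Foundation
    import CVorbis  // hypothetical module map exposing libvorbisfile

    func decodeOgg(path: String) -> Data? {
        var vf = OggVorbis_File()
        guard ov_fopen(path, &vf) == 0 else { return nil }  // 0 == success
        defer { ov_clear(&vf) }

        var pcm = Data()
        var buffer = [CChar](repeating: 0, count: 4096)
        var bitstream: Int32 = 0
        while true {
            // word = 2 (16-bit samples), sgned = 1 (signed), bigendianp = 0
            let bytesRead = ov_read(&vf, &buffer, Int32(buffer.count), 0, 2, 1, &bitstream)
            if bytesRead <= 0 { break }  // 0 = end of file, negative = stream error
            pcm.append(contentsOf: buffer.prefix(Int(bytesRead)).map { UInt8(bitPattern: $0) })
        }
        return pcm
    }

The resulting Data could then be wrapped in an AVAudioPCMBuffer for playback, which would avoid the Objective-C++ layer entirely.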
Post not yet marked as solved
1 Reply
1.2k Views
How can I do FFT processing in the AudioQueue output callback (AudioQueueOutputCallback)? The callback looks like this:

    static void audioQueueOutputCallback(void *input,
                                         AudioQueueRef inQueue,
                                         AudioQueueBufferRef outQueueBuffer) {
        SYAudioQueue *aq = (__bridge SYAudioQueue *)input;
        dispatch_semaphore_wait(aq->m_mutex, DISPATCH_TIME_FOREVER);
        [aq enterQueue:inQueue withBuffer:outQueueBuffer];
        dispatch_semaphore_signal(aq->m_mutex);
    }

I know that with AVAudioEngine I can do FFT processing on an AVAudioPCMBuffer. How can I convert an AudioQueueBufferRef to an AVAudioPCMBuffer?
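For context, here is a minimal sketch of the conversion I have in mind, assuming the queue delivers interleaved LPCM matching `format` (the function name is illustrative, not an established API):

    import AVFoundation

    func makePCMBuffer(from queueBuffer: AudioQueueBufferRef,
                       format: AVAudioFormat) -> AVAudioPCMBuffer? {
        let byteCount = Int(queueBuffer.pointee.mAudioDataByteSize)
        let bytesPerFrame = Int(format.streamDescription.pointee.mBytesPerFrame)
        let frameCount = AVAudioFrameCount(byteCount / bytesPerFrame)
        guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format,
                                               frameCapacity: frameCount) else { return nil }
        pcmBuffer.frameLength = frameCount
        // Copy the raw bytes from the queue buffer into the PCM buffer.
        memcpy(pcmBuffer.mutableAudioBufferList.pointee.mBuffers.mData,
               queueBuffer.pointee.mAudioData,
               byteCount)
        return pcmBuffer
    }

The resulting buffer could then be fed to the same FFT path used with AVAudioEngine.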
Post not yet marked as solved
11 Replies
8.9k Views
An error is reported when playing HTML5 audio or video elements in a WKWebView:

    Error acquiring assertion: Error Domain=RBSAssertionErrorDomain Code=3 "Required client entitlement is missing" UserInfo={RBSAssertionAttribute=RBSDomainAttribute| domain:"com.apple.webkit" name:"MediaPlayback" sourceEnvironment:"(null)", NSLocalizedFailureReason=Required client entitlement is missing}

After that, the performance of the web view becomes very poor. My HTML file contains an audio element and a button; clicking the button plays the audio:

    <body>
      <button onclick="handleClick()">PLAY</button>
      <audio id="audio" src="https://ac-dev.oss-cn-hangzhou.aliyuncs.com/test-2022-music.mp3"></audio>
      <script>
        function handleClick() {
          document.getElementById("audio").play();
        }
      </script>
    </body>

In my demo app I create a WKWebView to load the HTML file:

    class ViewController: UIViewController, WKUIDelegate {
        var webView: WKWebView!

        override func loadView() {
            let config = WKWebViewConfiguration()
            config.preferences.javaScriptEnabled = true
            config.allowsInlineMediaPlayback = true
            webView = WKWebView(frame: .zero, configuration: config)
            webView.uiDelegate = self
            view = webView
        }

        override func viewDidLoad() {
            super.viewDidLoad()
            let myURL = URL(string: "https://ac-dev.oss-cn-hangzhou.aliyuncs.com/test-2022-py.html")
            let myRequest = URLRequest(url: myURL!)
            webView.load(myRequest)
        }
    }

Clicking the button in the HTML plays the audio, and this error appears in the Xcode console:

    iPadN[2133:855729] [assertion] Error acquiring assertion: Error Domain=RBSAssertionErrorDomain Code=3 "Required client entitlement is missing" UserInfo={RBSAssertionAttribute=RBSDomainAttribute| domain:"com.apple.webkit" name:"MediaPlayback" sourceEnvironment:"(null)", NSLocalizedFailureReason=Required client entitlement is missing}

To sum up, this error appears whenever audio or video is played from HTML. The app's performance then drops a lot, and interactive response becomes very slow.
Post not yet marked as solved
1 Reply
2.2k Views
I've noticed that enabling voice processing on AVAudioInputNode changes the node's format, most noticeably the channel count:

    let inputNode = avEngine.inputNode
    print("Format #1: \(inputNode.outputFormat(forBus: 0))")
    // Format #1: <AVAudioFormat 0x600002bb4be0:  1 ch,  44100 Hz, Float32>

    try! inputNode.setVoiceProcessingEnabled(true)

    print("Format #2: \(inputNode.outputFormat(forBus: 0))")
    // Format #2: <AVAudioFormat 0x600002b18f50:  3 ch,  44100 Hz, Float32, deinterleaved>

Is this expected? How should I interpret these channels? My input device is an aggregate device where each channel comes from a different microphone, and I record each channel to a separate file. When voice processing messes with the channel layout, I can no longer rely on this.
Post not yet marked as solved
1 Reply
1.6k Views
I am trying to figure out why there is no audio on my iPhone or iPad; my code works on other devices. I am on an iPad running iPadOS 15.3.1 and I test from my computer using Safari. Video works, and both video and audio work on Android, Chrome, etc. This is an audio problem on iOS only. From my WebRTC code I have an HTML5 audio element like this:

    <audio muted="false" autoplay="1" id="xxxx"></audio>

When debugging, I connect my iPad and run this volume check:

    document.getElementById('***').volume

It returns 1, so the volume is at its loudest (HTML5 audio volume ranges from 0 to 1).

    document.getElementById('***').ended

ended returns false. Next I run the play() function:

    $('#***')[0].play()
      .then((resp) => {
        console.log("Success");
        console.log(resp);
      })
      .catch(error => { console.log(error); });

It executes the success branch, but there is still no sound. What could be causing this issue on iOS and Safari only?
Post not yet marked as solved
1 Reply
1.7k Views
Hi, I have multiple audio files and want to decide which channel goes to which output. For example, how would I route four 2-channel audio files to an 8-channel output? Also, if I have an AVAudioPlayerNode playing a 2-channel track through headphones, can I flip the channels on the output for playback, i.e. swap left and right? I have read a thread that seeks to do something similar, but it is from 2012 and I do not quite understand how it would work today. Many thanks; I am a bit stumped.
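A minimal sketch of one approach I have been considering: setting a channel map on the engine's output audio unit. This is an untested assumption on my part (the map indices and scope are illustrative):

    import AVFoundation
    import AudioToolbox

    let engine = AVAudioEngine()
    // kAudioOutputUnitProperty_ChannelMap: for each output channel, give the
    // index of the source channel to draw from (-1 mutes that output channel).
    if let outputUnit = engine.outputNode.audioUnit {
        var channelMap: [Int32] = [1, 0]  // swap left and right
        let status = AudioUnitSetProperty(outputUnit,
                                          kAudioOutputUnitProperty_ChannelMap,
                                          kAudioUnitScope_Global,
                                          0,
                                          &channelMap,
                                          UInt32(MemoryLayout<Int32>.size * channelMap.count))
        print("ChannelMap status: \(status)")  // 0 == noErr
    }

For the four-files-to-eight-channels case, the same idea would presumably scale up: mix each player node into a multichannel mixer format and map mixer channels onto hardware channels.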
Post not yet marked as solved
9 Replies
3.6k Views
I am getting an error in iOS 16 that doesn't appear in previous iOS versions. I am using RemoteIO to play back live audio at 4000 Hz. The error is the following:

    Input data proc returned inconsistent 2 packets for 186 bytes; at 2 bytes per packet, that is actually 93 packets

This is how the audio format and the callback are set:

    // Set the audio format
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate = 4000;
    audioFormat.mFormatID = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket = 1;
    audioFormat.mChannelsPerFrame = 1;
    audioFormat.mBitsPerChannel = 16;
    audioFormat.mBytesPerPacket = 2;
    audioFormat.mBytesPerFrame = 2;

    // Set the output callback
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = playbackCallback;
    callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);
    status = AudioUnitSetProperty(audioUnit,
                                  kAudioUnitProperty_SetRenderCallback,
                                  kAudioUnitScope_Global,
                                  kOutputBus,
                                  &callbackStruct,
                                  sizeof(callbackStruct));

Note that the mSampleRate I set is 4000 Hz. In iOS 15 I get a buffer duration (IOBufferDuration) of 0.02322 seconds and 93 frames in each callback. This is expected, because:

    frames per callback = sampling rate x buffer duration
    4000 Hz x 0.02322 s ≈ 93 frames

However, in iOS 16 I get the aforementioned error in the callback:

    Input data proc returned inconsistent 2 packets for 186 bytes; at 2 bytes per packet, that is actually 93 packets

Since the number of frames equals the number of packets, I am getting only 1 or 2 frames per callback while the buffer duration is still 0.02322 seconds. This didn't affect the playback of the "raw" signal, but it did affect the playback of the "processed" signal. Working backwards, 2 frames per 0.02322-second callback would imply an effective rate of about 86 Hz, which doesn't make any sense.

This error appears for other sampling rates too (8000, 16000, 32000), but not for 44100. However, I would like to keep 4000 Hz as my sampling rate. I have also tried to set the sampling rate using AVAudioSession's setPreferredSampleRate(_:), but the attempt didn't succeed; the sampling rate was still 44100 after calling that function. Any help on this issue would be appreciated.
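For reference, this is roughly how I tried to request the lower rate (a sketch; the .playback category is an assumption on my part):

    import AVFoundation

    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback)
        try session.setPreferredSampleRate(4000)  // a preference, not a guarantee
        try session.setActive(true)
        print(session.sampleRate)                 // still reports 44100 on iOS 16
    } catch {
        print("Session configuration failed: \(error)")
    }

As I understand it, setPreferredSampleRate(_:) is only a hint, and the hardware reports its actual rate in session.sampleRate.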
Post not yet marked as solved
2 Replies
1.2k Views
We noticed iOS 16 doesn't seem to support these commands anymore:

    MPRemoteCommandCenter.shared().likeCommand
    MPRemoteCommandCenter.shared().dislikeCommand
    MPRemoteCommandCenter.shared().bookmarkCommand

Is there another way to show a menu in place of the previous-track button on the lock screen?
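For context, this is roughly how we were registering the command before iOS 16 (a sketch of our setup; the title string is illustrative):

    import MediaPlayer

    let center = MPRemoteCommandCenter.shared()
    center.likeCommand.isActive = true
    center.likeCommand.localizedTitle = "Like"
    _ = center.likeCommand.addTarget { _ in
        // Mark the current item as liked in our model.
        return .success
    }

On earlier iOS versions this is what surfaced the menu in place of the previous-track button; on iOS 16 the handler never seems to be exposed on the lock screen.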
Post not yet marked as solved
14 Replies
2.8k Views
Hello, after updating to iOS 16.4 I have major issues when trying to play music through my 2015 BMW over both Bluetooth and USB. I have used other phones with earlier iOS versions and they work flawlessly, so I know it's 16.4. I have tried restarting, updating the BMW software, and disconnecting/reconnecting, but no luck, as it's certainly a 16.4 issue. The problems are as follows (for Apple Music, Spotify, SoundCloud, and any other audio streaming):

No album artwork
No song title
No album title
No ability to change songs except on the phone
When attempting to play a song, it plays only the first ~30 seconds or so before restarting at the first song in my library. This happens over, and over, and over again.

iOS 16.4 has made enjoying music in my car impossible. I have submitted 2 tickets in the Feedback app with no response, and when I contact Apple they just tell me to submit feedback and are unable to help. Hoping a dev or someone sees this and is able to fix it. Thank you.
Post not yet marked as solved
0 Replies
443 Views
Hi, does anyone know if there are a lot of issues acquiring the audio entitlements for CarPlay? I know a lot of developers have requested these entitlements without any reply from Apple. Thanks for helping me! Andrea
Post not yet marked as solved
0 Replies
596 Views
Hello, here is an issue I encountered recently. Does anybody have feedback on this?

Issue encountered: AVAudioFile throws when opening WAV files and MPEG-DASH files that have an .mp3 extension, but works fine with many other tested combinations of format and extension (for example, an AIFF file with an .mp3 extension is read by AVAudioFile without error). The Music app, AVAudioFile, and ExtAudioFile all fail on the same files. However, previewing an audio file in Finder (select the file and hit the space bar) works regardless of the file extension.

Why I consider this an issue: AVAudioFile seems to rely on the extension sometimes, but not always, to guess the audio format of the file, which leads to unexpected errors. I would expect AVAudioFile to deal properly with wrong extensions for all supported audio formats. ⚠️ This behaviour can cause real trouble in iOS and macOS applications that use audio files coming from the user, which often have unreliable extensions.

I published some code to easily reproduce the issue: https://github.com/ThomasHezard/AVAudioFileFormatIssue

Thank you everybody, have a great day 😎
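The core of the repro is small; here is a sketch (the file name is hypothetical):

    import AVFoundation

    // "actually_wav.mp3" contains valid RIFF/WAV data but carries an .mp3 extension.
    let url = URL(fileURLWithPath: "actually_wav.mp3")
    do {
        let file = try AVAudioFile(forReading: url)
        print("Opened, format: \(file.fileFormat)")
    } catch {
        // Throws here, even though the WAV data itself is valid.
        print("AVAudioFile failed: \(error)")
    }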
Post not yet marked as solved
0 Replies
509 Views
Thank you for this new API. Today, when using AUVoiceIO, voice is processed ahead of rendering to ensure the echo canceller can filter it out of the input. Will other audio be processed in the same way, for example rendered as mono at a 16 kHz sampling rate? I'm asking because I'm wondering if this API will unlock the ability to use wide-band, stereo, high-quality other audio (for example game audio) simultaneously with voice. Thanks!
Post not yet marked as solved
0 Replies
936 Views
New in iOS 17, we can control the amount of 'other audio' ducking through AVAudioEngine. Is this also possible with AVAudioSession? In my app I don't use voice input, but I do play voice audio while music from other apps plays in the background. Often the music either drowns out the voice, if I use the .mixWithOthers option, or it's not loud enough, if I use .duckOthers. It would be awesome to have the level of control that AVAudioEngine has.
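For context, this is roughly the session setup I'm limited to today (a sketch; only the option changes between the two behaviors):

    import AVFoundation

    let session = AVAudioSession.sharedInstance()
    do {
        // Either the voice drowns in the background music...
        try session.setCategory(.playback, options: [.mixWithOthers])
        // ...or the music is ducked by a fixed, non-configurable amount:
        // try session.setCategory(.playback, options: [.duckOthers])
        try session.setActive(true)
    } catch {
        print("Session configuration failed: \(error)")
    }

There is no parameter on the session side to choose how far .duckOthers lowers the other audio.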
Post not yet marked as solved
0 Replies
601 Views
Device A: iPhone SE 2nd generation (iOS 16.5), using a Shokz OpenRun S803 Bluetooth headset. Device B: any mobile device.

Steps to reproduce:
1. A uses the Bluetooth microphone/speaker and makes a call to B through our iPhone app.
2. Mute A's headphone (the Bluetooth device supports muting in hardware).
3. While A is muted, B speaks.
4. Unmute A's headphone.

From then on, every time B speaks, B hears an echo. Since no audio data arrives while the headset is hardware-muted, VPIO has no reference audio with which to remove the echo signal. Is there any alternative to resolve this echo in VoIP software using VPIO?
Post not yet marked as solved
2 Replies
438 Views
I use OpenAL to play music. It works on iOS 15.x phones, but on iOS 16.x there is no sound and no error is reported. Can someone help me?
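A minimal sketch of the kind of setup and diagnostic I would compare between iOS 15 and 16, assuming the standard OpenAL C API (deprecated by Apple but still shipped):

    import OpenAL

    let device = alcOpenDevice(nil)             // default output device
    let context = alcCreateContext(device, nil)
    alcMakeContextCurrent(context)

    var source: ALuint = 0
    alGenSources(1, &source)
    // ...attach a buffer and call alSourcePlay(source)...
    print(alGetError())                         // AL_NO_ERROR (0) if the context is healthy

Checking alGetError() after each call would at least tell you whether iOS 16 is failing silently inside OpenAL or the audio session itself is muted.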