Post | Replies | Boosts | Views | Activity

Getting USB device properties on iOS
Hi,

I'm working with USB MIDI devices in iOS apps, and I'm currently looking for a way to identify them. I know for a fact that reading the MIDI properties through the CoreMIDI API is not sufficiently reliable, as some devices return only generic information like "USB MIDI DEVICE" but are nonetheless recognized by other apps.

After some research, it appears that the best solution is to get the USB ID of the device (VID/PID: vendor ID and product ID). My question is the following: how can I get the USB ID of a USB MIDI device on iOS? The IOKit API is not available on iOS, so what is the solution to access USB devices on iOS?

Thank you all for your attention,
Best,
Thomas
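For reference, here is a minimal sketch of how the CoreMIDI string properties mentioned above can be read (the helper name midiDeviceDescription is just illustrative); on the problematic devices, these are the values that come back as generic strings:

import CoreMIDI

// Illustrative helper: reads the string properties CoreMIDI exposes for a device.
// On some hardware these only contain generic values like "USB MIDI DEVICE".
func midiDeviceDescription(_ device: MIDIDeviceRef) -> [String: String] {
    let properties: [(String, CFString)] = [
        ("name", kMIDIPropertyName),
        ("manufacturer", kMIDIPropertyManufacturer),
        ("model", kMIDIPropertyModel)
    ]
    var result: [String: String] = [:]
    for (label, property) in properties {
        var value: Unmanaged<CFString>?
        if MIDIObjectGetStringProperty(device, property, &value) == noErr,
           let string = value?.takeRetainedValue() {
            result[label] = string as String
        }
    }
    return result
}

// List every MIDI device currently known to the system.
for index in 0..<MIDIGetNumberOfDevices() {
    print(midiDeviceDescription(MIDIGetDevice(index)))
}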
2 replies · 0 boosts · 2.7k views · Nov ’16
Audio Units for DAWs
Hello all,

I've been working with custom AudioUnits for AVAudioEngine on iOS for a while (Audio Unit v3), and I now want to build an Audio Unit plug-in for macOS DAWs (Logic Pro or Ableton Live, for example). I've been looking into the subject for a while, but I can't manage to understand how it works.

It seems that part of the solution is Audio Unit Extensions, as I saw in WWDC 2015 session 508 and WWDC 2016 session 507, but I don't get how I can reuse the .appex product in Logic Pro or Ableton Live. I've used third-party Audio Units in these applications, but I always get them as .component bundles, which are to be copied into the /Library/Audio/Plug-Ins directory. I tried to copy the .appex into the same folder, but it doesn't seem to be recognized by Logic Pro.

Any idea what I am missing here?

Thanks to you all 🙂
Tom
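As a side note, a minimal sketch like the following can list the Audio Units the host system currently sees, which at least tells whether a plug-in is registered at all. The type/subtype/manufacturer codes below are placeholders for the values declared in the extension's Info.plist:

import AVFoundation
import AudioToolbox

// Placeholder component description; replace the codes with the ones declared
// in the Audio Unit extension's Info.plist.
let description = AudioComponentDescription(
    componentType: kAudioUnitType_Effect,
    componentSubType: 0x64656D6F,       // 'demo' (placeholder)
    componentManufacturer: 0x44656D6F,  // 'Demo' (placeholder)
    componentFlags: 0,
    componentFlagsMask: 0
)

// Lists the matching Audio Units currently registered on the system.
let components = AVAudioUnitComponentManager.shared().components(matching: description)
for component in components {
    print(component.name, component.manufacturerName, component.versionString)
}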
1 reply · 0 boosts · 1.5k views · Apr ’18
Exception at AVAudioEngine startup
Dear all,

I've been working, for quite a while now, on the migration of all my iOS musical apps from AUGraph/AudioToolbox to AVAudioEngine/AVFoundation. We released an app recently with this update and discovered a new source of exceptions introduced by AVAudioEngine (logs from Fabric's Crashlytics).

The app is basically an audio player (with lots of fancy features requiring signal processing code embedded in a custom audio unit) and provides background audio. The AVAudioSession uses the Playback category with no options, as it's not supposed to mix with other sources. Everything is embedded in a single custom AudioUnit directly connected to the output node inside an AVAudioEngine.

I have two new weird exceptions I don't know how to interpret, especially as they are not reproducible at all and concern mostly old devices.

1. At the end of an interruption
When I receive the InterruptionBegan notification, I stop the AVAudioEngine. When I receive the InterruptionEnded notification, I restart the AVAudioEngine (see the sketch below). This last action sometimes causes an exception with code 560557684 (AVAudioSessionErrorCodeCannotInterruptOthers). It happens mostly on iPhone 5, but as I said, it is not reproducible and works like a charm most of the time. I was able to reproduce it once on an iPhone 5 by starting the app, starting a song, putting the app in the background, and then calling the phone through FaceTime. I got the error during the InterruptionEnded notification, even though no other app was running at that time.

2. At startup
This one is the weirdest for me. Sometimes, starting the AVAudioEngine during the app startup (inside didFinishLaunchingWithOptions) causes the same error (560557684: AVAudioSessionErrorCodeCannotInterruptOthers). This happens mostly on iPhone 4s and iPhone 5, and I never managed to reproduce it.

The only page I found about a similar problem is the following:
https://stackoverflow.com/questions/29036294/avaudiorecorder-not-recording-in-background-after-audio-session-interruption-end
But the solutions there are not very satisfying, as:
- I don't want my app to mix with others, and once again, it all works most of the time.
- My app already uses remote control events, so this doesn't solve anything.

My questions:
- Did anybody encounter a similar problem?
- I don't even know how to start dealing with this, any advice?

Thank you 🙂
Tom
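For reference, here is a minimal sketch of the interruption handling described above; the engine constant stands for the app's existing AVAudioEngine, and error handling is reduced to a print:

import AVFoundation

let engine = AVAudioEngine() // stands for the app's existing engine

// Stop the engine when an interruption begins, restart it when it ends.
let observer = NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { notification in
    guard let rawType = notification.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: rawType) else { return }

    switch type {
    case .began:
        engine.stop()
    case .ended:
        do {
            try AVAudioSession.sharedInstance().setActive(true)
            try engine.start() // where 560557684 ('!int') is sometimes thrown
        } catch {
            print("Failed to restart the audio engine: \(error)")
        }
    @unknown default:
        break
    }
}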
2 replies · 0 boosts · 2.2k views · Aug ’18
Impossible to control AAC parameters with AVAudioFile writer
Hello all,

I'm trying to write an AAC-compressed audio file in the M4A or CAF file format with AVAudioFile. I'm using the following settings dictionary:

let settings = [AVFormatIDKey: kAudioFormatMPEG4AAC,
                AVSampleRateKey: 48000.0,
                AVNumberOfChannelsKey: 2,
                AVEncoderBitRateStrategyKey: AVAudioBitRateStrategy_Variable,
                AVEncoderBitRatePerChannelKey: 64,
                AVEncoderAudioQualityForVBRKey: AVAudioQuality.high,
                AVEncoderBitDepthHintKey: 16] as [String: Any]

The output file is OK, but I have two issues:
- Changing the bit rate strategy, the bit rate value or the encoder audio quality has no impact whatsoever on the output file.
- Switching from the CAF to the M4A file format with the same settings makes a big change in the file size, but it should not.

My question is simple: is there an issue with my settings dictionary, or is it a known issue with AVAudioFile? I spent quite some time exploring how to build this dictionary on the web and in the Apple documentation (very limited on this topic, I must say), but I can't find what I'm doing wrong...

More details about this issue are available here: https://gitlab.com/AudioScientist/avaudiofileaacparameters

Thank you for your help :)
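For completeness, here is a minimal usage sketch around this dictionary (outputURL is a placeholder, and the buffer-writing loop is omitted):

import AVFoundation

// `settings` is the dictionary from the post above; `outputURL` is a placeholder.
let outputURL = URL(fileURLWithPath: "/tmp/test.m4a")
do {
    let outputFile = try AVAudioFile(forWriting: outputURL,
                                     settings: settings,
                                     commonFormat: .pcmFormatFloat32,
                                     interleaved: false)
    print("Writing AAC to \(outputFile.url) with file format \(outputFile.fileFormat)")
    // PCM buffers in the file's processing format are then written with:
    // try outputFile.write(from: buffer)
} catch {
    print("Failed to create the AVAudioFile: \(error)")
}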
1 reply · 0 boosts · 980 views · Aug ’20
Archiving for Mac Catalyst: Unable to find a destination matching the provided destination specifier
Hello,

I export xcframeworks for iOS, iOS Simulator and Mac Catalyst with a script calling xcodebuild commands:
- archive for iphoneos
- archive for iphonesimulator
- archive for maccatalyst
- create the xcframework from the three frameworks, with dSYMs and BCSymbolMaps.

I'm struggling with one of my projects refusing to archive for Mac Catalyst. The Mac Catalyst archiving command is basically the following (same for all my projects):

xcodebuild archive -project $PROJECT_PATH -scheme $SCHEME -destination "generic/platform=macOS,variant=Mac Catalyst" [...]

The error I obtain is the following (formatted with xcpretty, I replaced sensitive info with ***):

▸ xcodebuild: error: Unable to find a destination matching the provided destination specifier:
▸ { generic:1, platform:macOS, variant:Mac Catalyst }
▸ Available destinations for the "***" scheme:
▸ { platform:macOS, arch:arm64, variant:Mac Catalyst, id:*** }
▸ { platform:macOS, arch:x86_64, variant:Mac Catalyst, id:*** }
▸ { platform:macOS, arch:arm64, variant:Designed for [iPad,iPhone], id:*** }
▸ { platform:iOS Simulator, id:***, OS:15.5, name:test-simulator }
▸ Ineligible destinations for the "***" scheme:
▸ { platform:iOS, id:dvtdevice-DVTiPhonePlaceholder-iphoneos:placeholder, name:Any iOS Device }
▸ { platform:iOS Simulator, id:dvtdevice-DVTiOSDeviceSimulatorPlaceholder-iphonesimulator:placeholder, name:Any iOS Simulator Device }

I compared with another project building fine; the only difference I found is the deployment targets (the project not working needs iOS 14.0+). Screenshots were attached showing the deployment targets of the project archiving for Mac Catalyst and of the project NOT archiving for Mac Catalyst, as well as the fact that in Xcode the "Any Mac" destination is available for the first project but not for the second.

I have the same issue on Intel and Apple Silicon Macs, with Xcode 13.2, 13.3 and 13.4. I tried to change the deployment targets of the first project to reproduce the issue; this project still archives for Mac Catalyst without any issue, even though I cannot find any other difference in the two projects' settings, which really bugs me 😭

I tried to add name=Any Mac in the destination, as seen here and there on the web for this issue; it did not help.

Any help would be really appreciated 🤗
0 replies · 0 boosts · 1.8k views · Aug ’22
AVAssetExportSession non-deterministic for M4A files
Hello,

I encountered an issue with AVAssetExportSession. My iOS app uses an AVAssetExportSession with AVAssetExportPresetPassthrough to export audio files from the media library for later use. I noticed that if I export the same M4A file several times, the resulting file is never the same: the exported file sounds exactly the same, but there are a few different bytes between the exported files. This does not seem to be the case for other formats (WAV and AIFF, for example). This is actually an issue for me for several reasons.

Do you know if this is expected behaviour or a bug? Is there a way to obtain the same output every time in the case of M4A files?

Thank you for your help 🙏
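For reference, here is a minimal sketch of the passthrough export described above; the asset URL and outputURL are placeholders:

import AVFoundation

// Placeholders for the media-library asset and a writable destination.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/library/item.m4a"))
let outputURL = URL(fileURLWithPath: "/tmp/exported.m4a")

guard let session = AVAssetExportSession(asset: asset,
                                         presetName: AVAssetExportPresetPassthrough) else {
    fatalError("Could not create the export session")
}
session.outputURL = outputURL
session.outputFileType = .m4a
session.exportAsynchronously {
    // Exporting the same source twice and diffing the two output files
    // byte by byte shows the differences mentioned above.
    print("Export finished with status: \(session.status.rawValue)")
}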
2 replies · 0 boosts · 681 views · Mar ’23
AVAudioFile fails to read some files with wrong extension
Hello,

Here is an issue I encountered recently. Does anybody have feedback on this?

Issue encountered:
AVAudioFile throws when opening WAV files and MPEG-DASH files with an .mp3 extension, but works fine with many other tested combinations of formats and extensions (for example, an AIFF file with an .mp3 extension is read by AVAudioFile without error). The Music app, AVAudioFile and ExtAudioFile all fail on the same files. However, previewing an audio file in Finder (select the file and hit the space bar) works regardless of the file extension.

Why do I consider this an issue?
AVAudioFile seems to rely on the extension sometimes, but not always, to guess the audio format of the file, which leads to unexpected errors. I would expect AVAudioFile to deal properly with wrong extensions for all supported audio formats. ⚠️ This behaviour can cause real trouble in iOS and macOS applications using audio files coming from the user, which often have unreliable extensions.

I published some code to easily reproduce the issue: https://github.com/ThomasHezard/AVAudioFileFormatIssue

Thank you everybody, have a great day 😎
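For reference, reading such a file boils down to something like the following minimal sketch (the file path is a placeholder):

import AVFoundation

// Minimal reproduction sketch: a WAV (or MPEG-DASH) file renamed with an
// .mp3 extension. The path is a placeholder.
let url = URL(fileURLWithPath: "/path/to/actually-a-wav-file.mp3")
do {
    let file = try AVAudioFile(forReading: url)
    print("Opened: \(file.fileFormat)")
} catch {
    // This is the unexpected error described above.
    print("AVAudioFile failed to open the file: \(error)")
}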
0 replies · 2 boosts · 760 views · May ’23
AVAudioEngine: audio input does not work on iOS 17 simulator
Hello,

I'm facing an issue with Xcode 15 and iOS 17: it seems impossible to get AVAudioEngine's audio input node to work on the simulator. inputNode has a 0-channel, 0 kHz input format, and connecting the input node to any other node, or installing a tap on it, fails systematically (see the sketch below).

What we tested:
- Everything works fine on iOS simulators <= 16.4, even with Xcode 15.
- Nothing works on the iOS 17.0 simulator with Xcode 15.
- Everything works fine on an iOS 17.0 device with Xcode 15.

More details on this here: https://github.com/Fesongs/InputNodeFormat

Any idea on this? Something I'm missing?

Thanks for your help 🙏
Tom

PS: I filed a bug report on Feedback Assistant, but it usually takes ages to get any answer, so I'm also trying here 😉
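For reference, here is a minimal sketch of the check that fails on the iOS 17 simulator (the processing inside the tap is omitted):

import AVFoundation

// Minimal sketch of the failing behaviour described above.
let engine = AVAudioEngine()
let inputFormat = engine.inputNode.inputFormat(forBus: 0)
print(inputFormat) // 0 ch, 0 Hz on the iOS 17 simulator, per the report above

// Installing a tap with this format is what fails systematically on the simulator.
engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { _, _ in
    // Process input buffers here.
}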
6 replies · 5 boosts · 2.9k views · Sep ’23
Give microphone permission programmatically
Hello,

I am setting up Mac minis as CI machines (using gitlab-runner) for my team. We are developing mostly audio stuff, and some of our unit tests involve using audio inputs with AVAudioSession/AVAudioEngine. These CI jobs trigger a microphone authorization pop-up on the Mac minis, asking for permission to give gitlab-runner access to the microphone. Once the authorization is given, subsequent jobs run fine.

My issue is that the Mac minis are updated on a regular basis with scripts, and since the path of the gitlab-runner binary, installed with Homebrew, changes with every version, the pop-up is triggered again every time gitlab-runner gets updated. Since we are adding more and more CI runners, maintaining this manually is becoming impossible.

Is there a way to either deactivate this security mechanism or script the authorization for a binary to access the microphone?

Thank you for your help!
Tom
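As a side note, here is a minimal sketch, assuming the audio-input test cases are gated on the current microphone authorization status so the CI job can at least detect the situation instead of hanging on the pop-up:

import AVFoundation

// Minimal sketch, assuming audio-input test cases check the current microphone
// authorization status instead of triggering the system pop-up blindly.
switch AVCaptureDevice.authorizationStatus(for: .audio) {
case .authorized:
    print("Microphone access already granted, audio-input tests can run")
case .notDetermined:
    print("Running an audio-input test now would trigger the permission pop-up")
case .denied, .restricted:
    print("Microphone access denied or restricted")
@unknown default:
    break
}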
1 reply · 0 boosts · 167 views · 3w