Post · Replies · Boosts · Views · Activity

NSExtensionPrincipalClass and AU extension issue on iOS
I'm trying to remove the AU extension storyboard from my project, as I do all my UI programmatically. So in the Info.plist I replaced this:

```xml
<key>NSExtensionMainStoryboard</key>
<string>MainInterface</string>
<key>NSExtensionPointIdentifier</key>
<string>com.apple.AudioUnit-UI</string>
```

with this:

```xml
<key>NSExtensionPointIdentifier</key>
<string>com.apple.AudioUnit-UI</string>
<key>NSExtensionPrincipalClass</key>
<string>MyAUViewController</string>
```

However, I found that when trying to load the AU in any iOS host, it hangs forever. Any ideas why this might be the case? Is this a supported way of loading the AU extension UI?
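For reference, here's a minimal sketch of what the principal class would need to look like for this setup, assuming `MyAUViewController` and `MyAUAudioUnit` stand in for the extension's real classes. Note that for a Swift class, the `NSExtensionPrincipalClass` value may need to be module-qualified (e.g. `MyExtensionTarget.MyAUViewController`), or the class given a stable Objective-C name, for the extension host to find it:

```swift
import CoreAudioKit

// The principal class must adopt AUAudioUnitFactory so the host can
// instantiate the audio unit without a storyboard. @objc pins the runtime
// name to exactly what the Info.plist declares.
@objc(MyAUViewController)
public class MyAUViewController: AUViewController, AUAudioUnitFactory {
    public func createAudioUnit(with componentDescription: AudioComponentDescription) throws -> AUAudioUnit {
        // MyAUAudioUnit is a placeholder for the extension's AUAudioUnit subclass.
        return try MyAUAudioUnit(componentDescription: componentDescription, options: [])
    }
}
```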
2 replies · 0 boosts · 1.7k views · Feb ’17
Providing HostTime when setting AUParameter values from UI
I'm wondering if it's OK to pass 0 for hostTime in this API:

```objc
- (void)setValue:(AUValue)value
      originator:(AUParameterObserverToken __nullable)originator
      atHostTime:(uint64_t)hostTime
       eventType:(AUParameterAutomationEventType)eventType
    API_AVAILABLE(macos(10.12), ios(10.0), watchos(3.0), tvos(10.0));
```

Considering the following method doesn't take hostTime as an argument, I'm assuming either AUParameter is aware of the host time (which I doubt) or it's passing some default value like 0 or -1:

```objc
- (AUParameterObserverToken)tokenByAddingParameterObserver:(AUParameterObserver)observer;
```
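As a point of comparison, here's a sketch of the explicit alternative: stamping the change with the current host time via `mach_absolute_time()` rather than 0. The helper function and its names are illustrative, not from any documented recommendation:

```swift
import AudioToolbox

// Sketch: a UI-originated parameter change stamped "now" with the current
// host time. Whether passing 0 instead is treated as an equivalent default
// is exactly the open question in this post.
func send(_ value: AUValue,
          to parameter: AUParameter,
          from token: AUParameterObserverToken?) {
    parameter.setValue(value,
                       originator: token,
                       atHostTime: mach_absolute_time(),
                       eventType: .value)
}
```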
0 replies · 0 boosts · 825 views · Jul ’20
What happens when 2 iOS apps contain AU extensions with exact same AudioComponentDescription?
I'm considering releasing a new version of my app as a separate app on the App Store. If both apps contain AU extensions whose plists provide the same componentSubType and componentManufacturer, is the system guaranteed to register only the AU extension with the higher version? This is important because I want users to reopen their projects in hosts and have the newer version of the plug-ins loaded.
1 reply · 0 boosts · 936 views · Jul ’20
Touches/Gestures in SwiftUI and UIView(Controller)Representable aren't recognised at the same time
I have controls inside both SwiftUI views and UIKit view controllers/views that are embedded via UIViewRepresentable / UIViewControllerRepresentable. In my (music) app it's common for the user to interact with multiple controls at the same time, e.g. tweaking knobs or moving sliders. I've noticed that controls inside UIKit recognise touches/gestures at the same time with no problem, and the same goes for controls inside SwiftUI. But if one control is inside a SwiftUI view and the other is inside a UIKit view, then only the one that's touched first registers touches. Is this a known issue/limitation of mixing UIKit and SwiftUI views? I can't find a mechanism/API that would let me allow simultaneous touches/gestures to be recognised across the two.
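On the UIKit side, the usual opt-in for simultaneous recognition looks like the sketch below. It's not presented as a fix for the cross-framework case — whether it reaches SwiftUI's internal recognisers is exactly the open question — but it shows the only public knob I'm aware of:

```swift
import UIKit

// A gesture recognizer delegate that opts in to simultaneous recognition.
// Assign it as the delegate of the recognizers inside the
// UIView(Controller)Representable-wrapped views.
final class SimultaneousGestureDelegate: NSObject, UIGestureRecognizerDelegate {
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith other: UIGestureRecognizer) -> Bool {
        return true
    }
}
```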
0 replies · 1 boost · 913 views · Nov ’20
Mac Catalyst + Audio Unit v3 plug-in not working as expected.
I'm having trouble getting my iPad app / AUv3 synth working on macOS via Mac Catalyst. The app works fine in standalone mode, but the DAWs aren't able to load the plug-in. The errors from both GarageBand and Logic Pro are too cryptic for me to decipher what's going wrong. This is on Big Sur.

This is from Logic Pro's auval:

```
validating Audio Unit Mela 2 by Nikolozi:

    AU Validation Tool
    Version: 1.8.0
    Copyright 2003-2019, Apple Inc. All Rights Reserved.
    Specify -h (-help) for command options

--------------------------------------------------
VALIDATING AUDIO UNIT: 'aumu' - 'Mel2' - 'NKLZ'
--------------------------------------------------
Manufacturer String: Nikolozi
AudioUnit Name: Mela 2
Component Version: 1.6.0 (0x10600)

* * PASS

--------------------------------------------------
TESTING OPEN TIMES:
COLD:
FATAL ERROR: OpenAComponent: result: 4,0x4

validation result: couldn’t be opened
```

From the GarageBand logs I have this (it happens when I try to load the synth as an instrument plug-in):

```
2021-06-13 10:23:12.078357+0400 GarageBand[99801:5732544] [lifecycle] [u 589AF1E2-2BE5-451F-A613-EC9BA71325E9:m (null)] [com.nikolozi.Mela.InstrumentExtension(1.0)] Failed to start plugin; pkd returned an error: Error Domain=PlugInKit Code=4 "RBSLaunchRequest error trying to launch plugin com.nikolozi.Mela.InstrumentExtension(589AF1E2-2BE5-451F-A613-EC9BA71325E9): Error Domain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7faa9232ca40 {Error Domain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed with error: 153}}}" UserInfo={NSLocalizedDescription=RBSLaunchRequest error trying to launch plugin com.nikolozi.Mela.InstrumentExtension(589AF1E2-2BE5-451F-A613-EC9BA71325E9): Error Domain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7faa9232ca40 {Error Domain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed with error: 153}}}}
2021-06-13 10:23:12.078518+0400 GarageBand[99801:5732544] [plugin] Unable to acquire process assertion in beginUsing: with plugin identifier: com.nikolozi.Mela.InstrumentExtension, killing plugin
2021-06-13 10:23:12.078814+0400 GarageBand[99801:5732544] [plugin] PlugInKit error in beginUsing: with plugin identifier: com.nikolozi.Mela.InstrumentExtension, killing plugin
2021-06-13 10:23:12.153420+0400 GarageBand[99801:5730475] Failed to instantiate AU. Description: RBSLaunchRequest error trying to launch plugin com.nikolozi.Mela.InstrumentExtension(589AF1E2-2BE5-451F-A613-EC9BA71325E9): Error Domain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7faa9232ca40 {Error Domain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed with error: 153}}} Reason: (null)
```

I've tried Apple's sample code AUv3Filter, turned on Mac Catalyst for the AUv3Filter iOS target, and it runs fine in Logic Pro. So I'm not sure what's incompatible in my code that fails to work as an AUv3 under Mac Catalyst. Any known issues for the Mac Catalyst + AUv3 combo that I should be aware of / investigate?
3 replies · 0 boosts · 2.6k views · Jun ’21
Receiving Repeated MIDI Events inside internalRenderBlock
When I run my AUv3 synth inside a host on an iPad, under certain conditions I receive repeated MIDI events. I'm still figuring out what the exact trigger is; it could be a CPU overload, I'm not sure yet. Thought I'd ask here if anyone else has ideas as to what might be going on.

After inspecting the passed-in variable `AURenderEvent *realtimeEventListHead`, it looks like it contains MIDI events that were already handled in previous callbacks. These MIDI events have timestamps that are older than the passed-in `AudioTimeStamp *timestamp`. I sometimes receive the same events around 8 times in a row, i.e. in 8 consecutive render callbacks, and these events all have the same timestamp. So I'm not sure why I'm receiving them again. Could the system be assuming they weren't handled and be sending them again?

I'm on iOS 15. (Not sure if this happens on iOS 14 too; I don't have an iOS 14 device to test on.) I reproduced this issue in both AUM and GarageBand.
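As a defensive workaround (a hypothetical sketch, not a documented fix), one could remember the sample time of the newest MIDI event already handled and skip anything at or before it while walking the event list, so re-delivered events from previous callbacks are ignored:

```swift
import AudioToolbox

// Tracks the newest MIDI event sample time already handled, so events the
// host re-delivers from earlier callbacks can be skipped.
var lastHandledEventTime: AUEventSampleTime = .min

// handleEvents is a placeholder for the event-walking code inside the
// synth's internalRenderBlock.
func handleEvents(_ head: UnsafePointer<AURenderEvent>?) {
    var event = head
    while let e = event {
        let header = e.pointee.head
        if header.eventType == .MIDI,
           header.eventSampleTime > lastHandledEventTime {
            // ... dispatch e.pointee.MIDI to the synth's note handling ...
            lastHandledEventTime = header.eventSampleTime
        }
        event = UnsafePointer(header.next)
    }
}
```

This assumes the duplicates really are byte-for-byte re-deliveries with unchanged timestamps, as described above; it would wrongly drop distinct events that legitimately share a sample time.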
0 replies · 0 boosts · 922 views · Sep ’21
"dyld: Symbol not found" crash when I run my app using Xcode 14 because it can't find CGRect.divided(...)
I'm getting a crash when I try to run my app using Xcode 14. From the console:

```
dyld[21862]: Symbol not found: _$sSo6CGRectV12CoreGraphicsE9__divided5slice9remainder10atDistance4fromySpyABG_AiC7CGFloatVSo0A4EdgeVtF
  Referenced from: **path to my framework**
  Expected in: /usr/lib/swift/libswiftCoreGraphics.dylib
```

The code that causes the crash is:

```swift
keyLabel.frame = bounds.divided(atDistance: labelHeight, from: .maxYEdge).slice
```

It seems it can't find that CGRect function. Additional info: I'm on macOS 12.4 and was running my app in Mac Catalyst mode. The crash doesn't happen when I run the app in an iPad simulator, but it does crash when I run it on my iPad Pro (with iPadOS 15.5). Everything works fine with Xcode 13.4.
6 replies · 2 boosts · 5.2k views · Jun ’22
How is AUv3 MIDI plug-in supposed to figure out the sample rate of the host?
For an AUv3 plug-in of type kAudioUnitType_MIDIProcessor, how is the plug-in supposed to figure out the host's sample rate? I think this might be an API limitation. For instruments and effects, we can read the AUAudioUnit's output bus format and check its sample rate, but MIDI FX plug-ins have no audio I/O. And even if I do set up audio busses, Logic Pro, for example, doesn't update the AU when I change the sample rate in Logic. Looking at AVAudioSession.sharedInstance.sampleRate doesn't work either, because the host can run at a different sample rate from the hardware. Is there a solution to this that I'm not aware of?
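For reference, here's the instrument/effect approach mentioned above as a sketch: cache the output bus's sample rate when render resources are allocated. The subclass and property names are placeholders. For a MIDI processor this bus may not reflect the host's actual rate, which is the limitation being described:

```swift
import AudioToolbox

// Placeholder AUAudioUnit subclass illustrating the instrument/effect case.
class MyInstrumentAudioUnit: AUAudioUnit {
    private var hostSampleRate: Double = 44_100

    override func allocateRenderResources() throws {
        try super.allocateRenderResources()
        // The host negotiates the bus format before allocation, so for
        // instruments/effects this reflects the host's render sample rate.
        if outputBusses.count > 0 {
            hostSampleRate = outputBusses[0].format.sampleRate
        }
    }
}
```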
3 replies · 0 boosts · 1.3k views · Dec ’22
Download links to Mixing Swift and C++ Projects are broken
I'm getting errors like this:

```xml
<Error>
  <Code>AccessDenied</Code>
  <Message>Access Denied</Message>
  <RequestId>P4P83RVQKMQQJYBV</RequestId>
  <HostId>q97d69C6z+0JXEZ8vAQg9QZXUbNaH/umTBIy09FZ7EEDOTWLlX9IQzID4uBHv4Nkq3kF/2SMAHk=</HostId>
</Error>
```

Here are the links:
https://developer.apple.com/documentation/swift/mixingswiftandc++inanxcodeproject
https://docs-assets.developer.apple.com/published/305dd41cfdb1/MixingSwiftAndC++InAnXcodeProject.zip
2 replies · 0 boosts · 810 views · Jun ’23