To answer myself: there's an official Apple example aumf project that seems to work correctly.
Our project is packaged in a framework and loads in-process; the example loads out-of-process. Ours is not found by Logic, the example is found.
As far as I can see, that is the only difference between ours and the official example. Logic currently has issues with both of these loading methods, and it's looking like this is another one to add to the list :( So now to work out how to package our extension so we have both options...
poke, ditto here, anyone got v3 aumf working in Logic yet?
ditto - This new Git panel in Xcode 15 really does appear to be buggy and unreliable. I find myself resorting to launching Sourcetree or the terminal just to confirm what Xcode has or has not actually done, which is not very comforting! I've also had situations where the staging and pushing works fine underneath, but the UI doesn't update until several seconds later. I hate to say it, but this situation appears to be typical: lately Apple are rushing out new and often un-needed/un-wanted "features" with ever increasing frequency, and don't appear to be spending enough time on testing. Instead they leave it to us mere mortals to report the bugs for them.
From our recent testing:
Just to be clear, this is regarding AUv3 plug-ins loaded into Logic running on macOS.
Logic Pro: Intel version
AUv3 plug-ins have an option to be loaded in-process or out-of-process (AUv3 mode).
Loaded in-process, an AUv3 plug-in can scan, find and load other AUv3 plug-ins.
Loaded out-of-process, the plug-in cannot find or load other AUv3 plug-ins, but can find and load v2 with no problem.
Logic Pro: Apple silicon (M-series) version
AUv3 plug-ins are loaded out-of-process by default and can load v2 with no problem.
There is no option to load in-process, which appears to mean that an AUv3 plug-in hosting other plug-ins is either disallowed or, perhaps more likely, just not implemented yet...
There appear to be several minor and major issues regarding the way the M-series version of Logic Pro handles AUv3, and we have reported them on several occasions, to no avail as of yet. Recently we filed a code-level support request as a last resort. If anything comes of it I'll report back!
deleted my reply because I didn't read your question fully!
I also noticed this happened when running auvaltool. As with all things AU, I find it better to err on the safe side and do (what should be unnecessary) checks in the render block. The flags can be useful, but not always.
Also, it's certainly a good idea to always assume the host app will do something horrible. IMHO (and it's only my opinion), in this day and age, with CPUs being so much faster, it's not so bad to do a few checks at the start of the render block before proceeding with the main DSP...
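To illustrate the kind of cheap, defensive check I mean, here is a minimal sketch. The function name, the limits and the way they would be fed in are all my own invention, not Apple API; in a real AUv3 these values would come from the render block's arguments and the unit's negotiated `maximumFramesToRender`:

```swift
// Hypothetical defensive guard for the top of a render block.
// Returns false when the arguments look like something a misbehaving
// host might pass: zero frames, more frames than negotiated, or an
// empty output buffer list.
func shouldRender(frameCount: Int, maxFramesToRender: Int,
                  outputBufferCount: Int) -> Bool {
    // Hosts have been seen passing 0 frames or exceeding the maximum.
    guard frameCount > 0, frameCount <= maxFramesToRender else { return false }
    // An empty output buffer list should also bail out early.
    guard outputBufferCount > 0 else { return false }
    return true
}
```

In the real render block you would return silence (or `kAudioUnitErr_TooManyFramesToProcess`) instead of a Bool, but the principle is the same: a couple of integer comparisons per render cycle cost essentially nothing.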
You could use group defaults?
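For anyone unfamiliar with group defaults: the idea is a `UserDefaults` suite shared between an app and its extensions via an app group. A minimal sketch (the suite name `"group.com.example.shared"` and the key are placeholders; a real app group ID has to be registered in both targets' entitlements):

```swift
import Foundation

// Sketch: sharing a setting between an app and its extension through
// an app-group UserDefaults suite. The suite name below is a
// placeholder for a real, entitlement-registered app group ID.
let shared = UserDefaults(suiteName: "group.com.example.shared")
shared?.set(true, forKey: "pluginScanned")
let scanned = shared?.bool(forKey: "pluginScanned") ?? false
```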
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 AppKit 0x7ff80b5535c5 -[NSWindow _setKeyViewGroupBoundaryNeedsRecalc:] + 32
1 AppKit 0x7ff80b53b4e9 -[NSView _primitiveSetNextKeyView:] + 227
2 AppKit 0x7ff80b57989a -[NSView _removeNextPointersToMe] + 602
3 AppKit 0x7ff80b579428 -[NSView _removeFromKeyViewLoop] + 203
4 AppKit 0x7ff80b578ddb -[NSView _finalize] + 797
5 AppKit 0x7ff80b578986 -[NSView dealloc] + 119
6 libdispatch.dylib 0x7ff808886317 _dispatch_client_callout + 8
7 libdispatch.dylib 0x7ff808892c78 _dispatch_main_queue_drain + 943
8 libdispatch.dylib 0x7ff8088928bb _dispatch_main_queue_callback_4CF + 31
9 CoreFoundation 0x7ff808b459c7 CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE + 9
10 CoreFoundation 0x7ff808b0693f __CFRunLoopRun + 2771
11 CoreFoundation 0x7ff808b057ac CFRunLoopRunSpecific + 562
12 HIToolbox 0x7ff81178cce6 RunCurrentEventLoopInMode + 292
13 HIToolbox 0x7ff81178ca4a ReceiveNextEventCommon + 594
14 HIToolbox 0x7ff81178c7e5 _BlockUntilNextEventMatchingListInModeWithFilter + 70
15 AppKit 0x7ff80b52c5cd _DPSNextEvent + 927
16 AppKit 0x7ff80b52ac8a -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1394
17 ViewBridge 0x7ff8100581c2 __75-[NSViewServiceApplication nextEventMatchingMask:untilDate:inMode:dequeue:]_block_invoke + 111
18 ViewBridge 0x7ff810058003 -[NSViewServiceApplication _withToxicEventMonitorPerform:] + 114
19 ViewBridge 0x7ff810045bf3 -[NSViewServiceApplication nextEventMatchingMask:untilDate:inMode:dequeue:] + 151
20 AppKit 0x7ff80bcfd55e -[NSTextView _bellerophonTrackMouseWithMouseDownEvent:originalSelection:granularity:extending:rectangular:toggling:multiple:autoscrollEvent:] + 1402
21 AppKit 0x7ff80b8e8710 -[NSTextView mouseDown:] + 7259
22 AppKit 0x7ff80b72a1f1 -[NSWindow(NSEventRouting) _handleMouseDownEvent:isDelayedEvent:] + 4859
23 AppKit 0x7ff80b69e39e -[NSWindow(NSEventRouting) _reallySendEvent:isDelayedEvent:] + 2582
24 AppKit 0x7ff80b69d76e -[NSWindow(NSEventRouting) sendEvent:] + 352
25 AppKit 0x7ff80b69bb44 -[NSApplication(NSEvent) sendEvent:] + 352
26 ViewBridge 0x7ff81004666b __65-[NSViewServiceApplication sendEventWithoutCatch:withForwarding:]_block_invoke + 115
27 ViewBridge 0x7ff8100465bd -[NSViewServiceApplication eventHasNotHitWindow:actions:] + 62
28 ViewBridge 0x7ff810046337 -[NSViewServiceApplication sendEventWithoutCatch:withForwarding:] + 344
29 ViewBridge 0x7ff8100d37ce +[ViewBridgeUtilities allowSettingMousePointerImageInBackground:whilePerformingActions:] + 239
30 ViewBridge 0x7ff81004612f -[NSViewServiceApplication sendEvent:withForwarding:] + 103
31 AppKit 0x7ff80b95496b -[NSApplication _handleEvent:] + 65
32 AppKit 0x7ff80b51d35e -[NSApplication run] + 623
33 AppKit 0x7ff80b4f12b7 NSApplicationMain + 817
34 libxpc.dylib 0x7ff808785874 _xpc_objc_main + 867
35 libxpc.dylib 0x7ff808785239 xpc_main + 99
36 ViewBridge 0x7ff81003f406 -[NSXPCSharedListener resume] + 16
37 ViewBridge 0x7ff8100417e6 NSViewServiceApplicationMain + 1326
38 AUHostingServiceXPC 0x105c738c8 0x105c6d000 + 26824
39 dyld 0x10cd5d51e start + 462
Thread 1:
0 libsystem_pthread.dylib 0x7ff808a3cf48 start_wqthread + 0
…
A bit more info:
It’s beginning to look a bit like a system bug.
But when you're developing an audio unit host, the documentation that Apple provides is so poor that one has to play the trial-and-error game far too often...
At first I thought it was possibly related to us hosting the AU view in a SwiftUI view using NSViewRepresentable.
So I removed SwiftUI from the equation and simply put the AU view directly into a floating panel/window.
And I still got similar crashes:
- The issue will only crash the host if the audio unit is loaded OUT OF PROCESS.
- The exception is always thrown after two or more key events occur, whether the audio unit is in or out of process.
- When the AU is IN PROCESS, however, an exception is reported in the debugger, but it appears something internal to the API must be catching the exception, hence preventing a crash.
- From the console crash logs it does appear that some part of the audio unit view hierarchy is being released internally... (I'll have to post the log in another message, due to character limits)
So a workaround is to host all audio units in-process, which is a bit of a downside, but at least for now it's working.
I would love to know if anyone else has experienced anything like this!
... I think I'll file a bug report, but I don't hold out any hope whatsoever :( ...
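For anyone wanting to try the same workaround, a sketch of requesting in-process instantiation via `AVAudioUnit.instantiate(with:options:completionHandler:)` and the `.loadInProcess` option. The surrounding host code and the `desc` component description are assumed, and note the plug-in itself must support in-process loading:

```swift
import AVFoundation

// Sketch: asking for in-process loading when instantiating an AUv3.
// `desc` is assumed to be a valid AudioComponentDescription for the
// target plug-in; .loadInProcess requests that the extension be loaded
// into the host's own process, subject to the plug-in allowing it.
func instantiateInProcess(_ desc: AudioComponentDescription,
                          completion: @escaping (AVAudioUnit?) -> Void) {
    AVAudioUnit.instantiate(with: desc, options: .loadInProcess) { unit, error in
        if let error = error {
            print("Instantiation failed: \(error)")
        }
        completion(unit)
    }
}
```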
Answering my own question with RTFM!
Yes, it is possible:
AUMIDIEventListBlock
AUScheduleMIDIEventBlock (the scheduleMIDIEventBlock property)
kMIDIProtocol_2_0, etc.
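The event-list route carries MIDI 2.0 Universal MIDI Packets. As a sketch of what those look like on the wire, here is how a MIDI 2.0 channel-voice note-on (message type 0x4, a two-word packet) is packed, following the UMP field layout; the function is my own helper, not an Apple API:

```swift
// Sketch: packing a MIDI 2.0 note-on as a two-word UMP (message type 0x4).
// Word 0: type (4 bits), group (4), status 0x9 (4), channel (4),
//         note number (8), attribute type (8, zero here).
// Word 1: 16-bit velocity in the top half, attribute data in the bottom.
func midi2NoteOn(group: UInt32, channel: UInt32,
                 note: UInt32, velocity: UInt32) -> (UInt32, UInt32) {
    let word0 = (0x4 << 28) | (group << 24) | (0x9 << 20) | (channel << 16) | (note << 8)
    let word1 = velocity << 16
    return (word0, word1)
}
```

Note the velocity is 16-bit (0xFFFF is full), unlike the 7-bit MIDI 1.0 value.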
I just stumbled upon this question while I was looking for something else, and I'm assuming you've only recently started programming (apologies if not, but then you probably wouldn't have asked)...
Ask yourself: why would dividing two Ints yield 0?
Also, perhaps it's interesting that the Swift documentation doesn't often discuss the fundamental building blocks of all programming languages. It assumes the programmer already has that in their arsenal before launching themselves into the ether.
Here's a page (I found in a quick Google search) that might explain what you need to know just as well as anyone here could:
https://www.programiz.com/swift-programming/data-types
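To spell out the point about dividing two Ints: integer division in Swift truncates toward zero, so you have to convert to a floating-point type first if you want the fraction:

```swift
// Dividing two Ints performs integer division: the fractional part
// is discarded, so 3 / 4 is 0, not 0.75.
let intResult = 3 / 4             // 0
let doubleResult = 3.0 / 4.0      // 0.75

// To divide two existing Ints and keep the fraction, convert first:
let a = 3, b = 4
let ratio = Double(a) / Double(b) // 0.75
```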
The documentation appears to have been updated recently....
An audio unit may choose to dynamically rearrange the tree. When doing so, it must
issue a KVO notification on the audio unit's parameterTree property. The tree's elements are
mostly immutable (except for values and implementor hooks); the only way to modify them
is to publish a new tree.
But sadly, most of the important hosts don't appear to respond to the notification, so it's probably not a good idea to change the tree after it's created.
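For completeness, here is a sketch of what the host side would look like: observing `parameterTree` via KVO, as the quoted documentation describes. The `onChange` callback and the wrapping function are my own names, and in a real host you would rebuild any cached parameter state or UI in the handler:

```swift
import AudioToolbox

// Sketch: a host observing replacement of an audio unit's parameter
// tree. The returned observation must be kept alive for as long as
// the host wants the updates.
func observeTree(of au: AUAudioUnit,
                 onChange: @escaping (AUParameterTree?) -> Void) -> NSKeyValueObservation {
    au.observe(\.parameterTree, options: [.new]) { unit, _ in
        // The unit published a new tree; refresh cached state/UI.
        onChange(unit.parameterTree)
    }
}
```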
Some time later ... ...
In the end, my solution to the compression bit-rate problem was to revert to using the old ExtAudioFile API, with an input consisting of one or two 32-bit uncompressed float channels, and utilising the kAudioConverterEncodeBitRate property on the compressed formats I needed.
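An untested sketch of that approach, for anyone hitting the same wall: fetch the underlying converter from the ExtAudioFile, set the bit rate on it, then poke the converter config so the change is picked up. `extFile` is assumed to be an `ExtAudioFileRef` already configured with an uncompressed float client format and a compressed file format; error handling is reduced to returning the first failing status:

```swift
import AudioToolbox

// Sketch (untested): set the encode bit rate on an ExtAudioFile's
// underlying AudioConverter.
func setEncodeBitRate(_ extFile: ExtAudioFileRef, bitsPerSecond: UInt32) -> OSStatus {
    var converter: AudioConverterRef?
    var size = UInt32(MemoryLayout<AudioConverterRef?>.size)
    var status = ExtAudioFileGetProperty(extFile,
                                         kExtAudioFileProperty_AudioConverter,
                                         &size, &converter)
    guard status == noErr, let converter = converter else { return status }

    var bitRate = bitsPerSecond
    status = AudioConverterSetProperty(converter,
                                       kAudioConverterEncodeBitRate,
                                       UInt32(MemoryLayout<UInt32>.size),
                                       &bitRate)
    guard status == noErr else { return status }

    // Signal that the converter configuration changed.
    var config: CFPropertyList? = nil
    return ExtAudioFileSetProperty(extFile,
                                   kExtAudioFileProperty_ConverterConfig,
                                   UInt32(MemoryLayout<CFPropertyList?>.size),
                                   &config)
}
```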
The problem with the WAV files' incorrect format turned out to be a bug, which Apple acknowledged and has now fixed :)
addendum: It could be possible to send the UMP as a byte stream in the AUScheduleMIDIEventBlock, but the docs explicitly mention:
"One or more valid MIDI 1.0 events, except sysex which must always be sent as the only event in the chunk."
NB: For my purposes, after a deeper read of the documentation, rather than setting the entire shader to -fno-fast-math, I've also found that using the metal::precise:: namespace seems to work as expected. But my original question about the compiler setting is still valid, I think. I don't consider myself an expert in MSL, so I'm interested in knowing what's considered best practice etc.