Hi,
we have a .pkg installer consisting of various sub-packages. One of them contains presets and needs to be installed to the default preset location /Library/Audio/Presets. If this non-binary preset package is the only one in a .pkg choice, notarization fails with:
"logFormatVersion": 1,
"jobId": "*",
"status": "Invalid",
"statusSummary": "Archive contains critical validation errors",
"statusCode": 4000,
"archiveFilename": "mypackage.pkg.zip",
"uploadDate": "2024-08-22T21:24:03.251Z",
"sha256": "*",
"ticketContents": null,
"issues": [
{
"severity": "error",
"code": null,
"path": "mypackage.pkg.zip",
"message": "Package mypackage.pkg.zip has no signed executables or bundles. No tickets can be generated.",
"docUrl": null,
"architecture": null
},
{
"severity": "warning",
"code": null,
"path": "mypackage.pkg.zip/mypackage.pkg",
"message": "b\"Invalid component package: mypackage_vstpreset Distribution file's value: #com.mycompany.mypackage.vstpreset.pkg\\n\"",
"docUrl": null,
"architecture": null
}
]
}
Not sure, but maybe it's worth noting that the offending sub-package only generates a warning, while the parent package seems to escalate this into an error.
How can a non-binary sub-package be included in a notarized parent package?
Any hints or thoughts are highly appreciated, Thanks!
Hi,
we have multiple threads in our CoreAudio server plugin carrying out necessary asynchronous work (namely handling USB callbacks and shuffling the required data to the IO).
Although these threads have been set up with an appropriate THREAD_TIME_CONSTRAINT_POLICY (which does improve things), on M* processors there is an extremely high, non-realtime amount of jitter of >10 ms(!).
Either the run loop notification from the USB stack arrives that late, or the thread driving the run loop hasn't been set up to handle the callbacks in a timely manner.
Since AudioUnit threads that have to meet the frame deadlines can join the workgroup of the audio device, is there a similar opportunity for the CoreAudio server plugin threads? And if so, how should these be set up correctly?
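For reference, the pattern we know from the client/AudioUnit side looks roughly like the sketch below (querying kAudioDevicePropertyIOThreadOSWorkgroup and joining via os_workgroup_join). Whether and how the equivalent is available from inside a server plugin is exactly what we're unsure about, so treat this only as our assumption of the relevant API:
```
#include <CoreAudio/AudioHardware.h>
#include <os/workgroup.h>
#include <os/object.h>

// Ask the device for its OS workgroup (kAudioObjectPropertyElementMaster on older SDKs).
static os_workgroup_t CopyDeviceWorkgroup(AudioObjectID device)
{
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyIOThreadOSWorkgroup,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMain
    };
    os_workgroup_t workgroup = nullptr;
    UInt32 size = sizeof(workgroup);
    if (AudioObjectGetPropertyData(device, &addr, 0, nullptr, &size, &workgroup) != noErr)
        return nullptr;
    return workgroup;   // caller releases with os_release()
}

void WorkerThreadBody(AudioObjectID device)
{
    os_workgroup_t workgroup = CopyDeviceWorkgroup(device);
    os_workgroup_join_token_s token;
    if (workgroup && os_workgroup_join(workgroup, &token) == 0)
    {
        // ... per-cycle USB/IO work here ...
        os_workgroup_leave(workgroup, &token);
    }
    if (workgroup)
        os_release(workgroup);
}
```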
Thanks for any hints! Or pointing me to the docs :)
How can an app obtain the valid range for setting thread_time_constraint_policy_data_t for thread_policy_set()?
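For context, this is roughly how we fill the policy today (times converted from nanoseconds via mach_timebase_info; the millisecond figures are placeholders, not recommendations), but we haven't found any API or documentation stating which ranges are actually accepted:
```
#include <mach/mach.h>
#include <mach/mach_time.h>

// Convert nanoseconds to Mach absolute time units, as the policy fields expect.
static uint32_t NanosToAbs(uint64_t nanos)
{
    mach_timebase_info_data_t timebase;
    mach_timebase_info(&timebase);
    return (uint32_t)(nanos * timebase.denom / timebase.numer);
}

bool MakeThreadTimeConstrained(thread_act_t thread)
{
    thread_time_constraint_policy_data_t policy;
    policy.period      = NanosToAbs(2000000);   // nominal wakeup period (2 ms, placeholder)
    policy.computation = NanosToAbs(500000);    // expected work per period (0.5 ms, placeholder)
    policy.constraint  = NanosToAbs(1000000);   // deadline after wakeup (1 ms, placeholder)
    policy.preemptible = 1;

    return thread_policy_set(thread,
                             THREAD_TIME_CONSTRAINT_POLICY,
                             (thread_policy_t)&policy,
                             THREAD_TIME_CONSTRAINT_POLICY_COUNT) == KERN_SUCCESS;
}
```
We call this on the worker thread itself, passing pthread_mach_thread_np(pthread_self()).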
How can a dext be activated from C++? Specifically, we need to load our dext from our C++ CoreAudio server plugin.
I guess it all boils down to how to call the OSSystemExtensionRequest Swift interface from C++...
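In case it helps to frame the question: our current assumption is that the SystemExtensions framework also exposes an Objective-C interface, so a small Objective-C++ (.mm) shim like the sketch below could bridge it to the C++ side. ActivateDext is our own hypothetical helper, the delegate handling is omitted, and as far as we understand the request also has to originate from the app that bundles the dext:
```
// Objective-C++ sketch, compiled as .mm and linked against SystemExtensions.framework.
#import <SystemExtensions/SystemExtensions.h>

extern "C" void ActivateDext(const char *bundleIdentifier)
{
    @autoreleasepool {
        NSString *identifier = [NSString stringWithUTF8String:bundleIdentifier];
        OSSystemExtensionRequest *request =
            [OSSystemExtensionRequest activationRequestForExtension:identifier
                                                               queue:dispatch_get_main_queue()];
        // A real implementation must set request.delegate and implement
        // OSSystemExtensionRequestDelegate to learn whether activation succeeded,
        // needs user approval, or failed.
        [[OSSystemExtensionManager sharedManager] submitRequest:request];
    }
}
```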
Thanks for any hints!
While most of C++20 seems to be available, <syncstream> is missing (Xcode 15.2 (15C500b), macOS SDK 14.2), even with
CLANG_CXX_LANGUAGE_STANDARD = c++20
How can this be made available?
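For now we are working around it with a small mutex-based substitute guarded by the feature-test macro. A rough sketch (SyncOut is our own hypothetical helper, not a drop-in std::osyncstream replacement):
```
#include <iostream>
#include <mutex>
#include <sstream>
#include <version>

#if defined(__cpp_lib_syncbuf)
    #include <syncstream>
    using SyncOut = std::osyncstream;           // use the real thing when available
#else
    // Mutex-based stand-in: buffer locally, flush atomically on destruction.
    class SyncOut
    {
    public:
        explicit SyncOut(std::ostream &os) : os_(os) {}
        ~SyncOut()
        {
            std::lock_guard<std::mutex> lock(mutex_);
            os_ << buffer_.str();
        }
        template <typename T>
        SyncOut &operator<<(const T &value) { buffer_ << value; return *this; }
    private:
        static inline std::mutex mutex_;        // single global lock, kept simple
        std::ostream &os_;
        std::ostringstream buffer_;
    };
#endif

// Usage: SyncOut(std::cout) << "thread " << id << " done\n";
```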
At least under macOS Sonoma 14.2.1, kAudioFormatFlagIsBigEndian for 24-bit audio doesn't seem to be supported by the CoreAudio engine when it provides kAudioServerPlugInIOOperationWriteMix streaming buffers to our CoreAudio server plugin.
Is that correct and to be expected? Or how should the AudioStreamBasicDescription be filled out in response to a kAudioStreamPropertyPhysicalFormat request to correctly announce 24-bit big-endian audio to CoreAudio?
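For reference, this is roughly how we are filling out the physical format for packed 24-bit big-endian PCM (a stereo stream at 48 kHz is just an example); if something in here is wrong, that would already answer the question:
```
#include <CoreAudio/CoreAudioTypes.h>

AudioStreamBasicDescription Make24BitBigEndianFormat(Float64 sampleRate, UInt32 channels)
{
    AudioStreamBasicDescription asbd = {};
    asbd.mSampleRate       = sampleRate;            // e.g. 48000.0
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                           | kAudioFormatFlagIsPacked
                           | kAudioFormatFlagIsBigEndian;
    asbd.mBitsPerChannel   = 24;
    asbd.mChannelsPerFrame = channels;              // e.g. 2
    asbd.mBytesPerFrame    = 3 * channels;          // packed 24-bit samples
    asbd.mFramesPerPacket  = 1;
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame;
    return asbd;
}
```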
Thanks, hagen.
Hi,
our CoreAudio server plugin provides the standard kAudioVolumeControlClassID, kAudioMuteControlClassID, and kAudioSoloControlClassID controls, as well as kAudioDataSourceControlClassID.
But it looks like controls can also be created in a more generic way. Given the signal processing capabilities of our device it could expose many more controls, but is there any application that is able to present such generic controls?
Would Audio MIDI Setup.app or AU Lab be able to display them? Any DAW?
Thanks,
hagen
Since we have to encode/decode the audio stream to/from our audio device anyway, and we are using NEON SIMD to do so, we could just convert it into a stream of floats on the fly.
Since float is CoreAudio's natural data format, we could probably avoid an additional int-to-float/float-to-int conversion by CoreAudio this way.
Does this make sense?
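A minimal sketch of the conversion we have in mind (assuming the 24-bit samples arrive left-aligned in int32 and the sample count is a multiple of 4 for brevity):
```
#include <arm_neon.h>
#include <cstddef>
#include <cstdint>

// Convert interleaved 32-bit integer samples to float in [-1, 1), four at a time.
void Int32ToFloat(const int32_t *src, float *dst, size_t count)
{
    const float scale = 1.0f / 2147483648.0f;
    for (size_t i = 0; i < count; i += 4)
    {
        int32x4_t   s = vld1q_s32(src + i);
        float32x4_t f = vmulq_n_f32(vcvtq_f32_s32(s), scale);
        vst1q_f32(dst + i, f);
    }
}
```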
Thanks,
hagen
Hi,
to be able to receive IOServiceAddMatchingNotification callbacks we need to attach to an appropriate CFRunLoop/IONotificationPort. To avoid race conditions, the matching notifications ideally would be serialized with the CoreAudio notifications/callbacks.
How can this be achieved? Attaching the notification port to the run loop returned by CFRunLoopGetCurrent() does not yield any notifications at all, and attaching it to CFRunLoopGetMain() leads to notifications that are asynchronous to the CoreAudio callbacks.
There is a set of deprecated AudioHardwareAdd/RemoveRunLoopSource() calls, but apart from their deprecation, at least on Big Sur on Apple Silicon they do not lead to any notifications either.
So, how is this supposed to be implemented? Do we really need to introduce locks, even in the processing calls? Wasn't it the purpose of run loops to manage exactly these kinds of situations? And above all: where is the documentation?
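For what it's worth, the variant we are currently experimenting with delivers the match notifications on a private dispatch queue via IONotificationPortSetDispatchQueue() and funnels the shared state through that same queue; whether this is the intended pattern is exactly what we'd like to know. A sketch with error handling trimmed:
```
#include <IOKit/IOKitLib.h>
#include <IOKit/usb/IOUSBLib.h>
#include <dispatch/dispatch.h>

// Match callback; runs on the dispatch queue passed to InstallMatchingNotification().
static void DeviceAppeared(void *refCon, io_iterator_t iterator)
{
    (void)refCon;
    io_service_t service;
    while ((service = IOIteratorNext(iterator)))
    {
        // handle the newly matched device here
        IOObjectRelease(service);
    }
}

bool InstallMatchingNotification(dispatch_queue_t queue)
{
    // kIOMainPortDefault is kIOMasterPortDefault on pre-macOS 12 SDKs.
    IONotificationPortRef port = IONotificationPortCreate(kIOMainPortDefault);
    if (!port)
        return false;

    // Deliver notifications on our own serial queue instead of a CFRunLoop source.
    IONotificationPortSetDispatchQueue(port, queue);

    io_iterator_t iterator = IO_OBJECT_NULL;
    if (IOServiceAddMatchingNotification(port, kIOFirstMatchNotification,
            IOServiceMatching(kIOUSBDeviceClassName),
            DeviceAppeared, nullptr, &iterator) != KERN_SUCCESS)
        return false;

    // Drain once to arm the notification and pick up already-connected devices.
    DeviceAppeared(nullptr, iterator);
    return true;
}
```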
Thanks for any hints,
all the best,
hagen.
The very sparse and outdated 'documentation' Apple has shared about CoreAudio and CoreMIDI server plugins suggested using syslog for logging.
At least since Big Sur, syslog output doesn't end up anywhere.
(So, while you seem to think it's OK not to document your APIs, you could at least remove APIs that no longer work! Not doing so causes unnecessary and frustrating bug hunting.)
Should we replace syslog with unified logging?
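What we are considering as a replacement looks roughly like this (subsystem/category strings are placeholders); the output is then visible in Console.app or via `log stream`:
```
#include <os/log.h>

// One log handle for the whole plugin; subsystem/category are placeholders.
static os_log_t PluginLog()
{
    static os_log_t log = os_log_create("com.mycompany.myplugin", "driver");
    return log;
}

void LogExample(int frames)
{
    os_log(PluginLog(), "render cycle handled %d frames", frames);
    os_log_error(PluginLog(), "something went wrong");
}
```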
For debugging purposes only, our plugins also write to our own log files. Where can I find suitable locations for these? Where is this documented?
Thanks,
hagen.
I understand it's much more fun to implement new features than to debug. However, Xcode has been deteriorating for generations. There are countless bugs; the most annoying ones (for C++) are:
right-click "Jump to Definition" rarely works,
clicking a build error to jump to the offending line of code almost never works.
The internet is full of workarounds, but they strongly depend on the moon phase, chemtrail formation, and temporal karma.
How can these things be made to work once and for all?
I think these things used to work in Xcode 3 or so. Every new release delivers fewer working parts and tons of new features that serious developers rarely need.
Hi,
our audio plugins communicate with a controlling hardware device via Apple's IOHIDDevice API, which according to https://developer.apple.com/library/archive/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html should be covered by the com.apple.security.device.usb sandbox exception key. I have placed and enabled that key for testing, but communication can't be established from inside a sandboxed DAW (checked with Ableton Live).
Any expertise on this?
Thanks & cheers,Hagen.
Hi,
we have a notarized and stapled audio plugin .pkg installer.
How can we make sure (i.e. test) that our .pkg and its ingredients (the various audio plugin formats) are correctly notarized and that (offline) users are able to install and run them? How can we verify that it would fail otherwise?
Thanks, Hagen.
When opening the USB device in response to the first-match notification, the USB interface iterator (since 10.14) does not return a USB interface anymore.
However, if the USB device is already connected at the time IOServiceAddMatchingNotification() is called, and it is therefore handled via the resulting iterator (without a match notification), then the interfaces are found, and from then on the USB interface iterator works.
Once the interface iterator has successfully returned a USB interface, it now also works in reaction to a USB match notification. (Therefore a reboot is necessary to reproduce the problem once the iterator has found a USB interface.)
Interestingly, it doesn't seem to be a simple timing problem, because a delay doesn't help, but a breakpoint does...
Any suggestions are highly appreciated. Thanks!
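For clarity, the sequence that fails inside the first-match callback looks roughly like the sketch below (error handling trimmed, names are ours): the device plug-in interface is obtained and opened fine, but CreateInterfaceIterator() then yields no interfaces when we arrived here via the notification.
```
#include <IOKit/IOKitLib.h>
#include <IOKit/IOCFPlugIn.h>
#include <IOKit/usb/IOUSBLib.h>

// Called by IOServiceAddMatchingNotification() for kIOFirstMatchNotification.
static void DeviceMatched(void *refCon, io_iterator_t iterator)
{
    (void)refCon;
    io_service_t usbDevice;
    while ((usbDevice = IOIteratorNext(iterator)))
    {
        IOCFPlugInInterface **plugIn = nullptr;
        SInt32 score = 0;
        if (IOCreatePlugInInterfaceForService(usbDevice, kIOUSBDeviceUserClientTypeID,
                kIOCFPlugInInterfaceID, &plugIn, &score) == kIOReturnSuccess && plugIn)
        {
            IOUSBDeviceInterface **device = nullptr;
            (*plugIn)->QueryInterface(plugIn,
                CFUUIDGetUUIDBytes(kIOUSBDeviceInterfaceID), (LPVOID *)&device);
            IODestroyPlugInInterface(plugIn);

            if (device && (*device)->USBDeviceOpen(device) == kIOReturnSuccess)
            {
                IOUSBFindInterfaceRequest request = {
                    kIOUSBFindInterfaceDontCare, kIOUSBFindInterfaceDontCare,
                    kIOUSBFindInterfaceDontCare, kIOUSBFindInterfaceDontCare };
                io_iterator_t interfaces = IO_OBJECT_NULL;
                (*device)->CreateInterfaceIterator(device, &request, &interfaces);
                // When we get here via the match notification (on 10.14+), this
                // iterator comes back empty; when the device was already present at
                // startup, the very same code finds all interfaces.
            }
        }
        IOObjectRelease(usbDevice);
    }
}
```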