Replies

Reply to AudioServerPlugIn: HALC_ProxyObject::GetPropertyDataSize ('stm#', 'inpt', 0, AI32) error ('who?')
Thanks. As far as I can see (from following along in Console.app after kickstarting coreaudiod), this happens when "the system" is enumerating or activating the available audio devices, but who is actually responsible for that query is not clear (the logs seem to be spit out in bulk after CoreAudio has finished all its device queries). Can anyone else replicate the noted behavior running Apple's NullAudio device? Because right now I cannot even clearly identify *which device* the failed query is associated with...
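For anyone trying to narrow this down from the client side: 'stm#' is the FourCC for kAudioDevicePropertyStreams and 'inpt' is the input scope, so the failing call looks like a size query for a device's input streams. Here's a minimal sketch of the equivalent client-side query (the build command is mine, and I'm just using the default input device as a stand-in for whichever device is actually being queried):

```c
// Sketch: reproduce the 'stm#'/'inpt' size query against a device.
// Build with: clang streams_query.c -framework CoreAudio -o streams_query
#include <CoreAudio/CoreAudio.h>
#include <stdio.h>

int main(void) {
    // Use the default input device as a stand-in for the device under test.
    AudioObjectID device = kAudioObjectUnknown;
    UInt32 size = sizeof(device);
    AudioObjectPropertyAddress defaultIn = {
        kAudioHardwarePropertyDefaultInputDevice,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    OSStatus err = AudioObjectGetPropertyData(kAudioObjectSystemObject, &defaultIn,
                                              0, NULL, &size, &device);
    if (err != noErr || device == kAudioObjectUnknown) {
        fprintf(stderr, "no default input device (err %d)\n", (int)err);
        return 1;
    }

    // The same query the HAL log complains about:
    // selector 'stm#' (kAudioDevicePropertyStreams), scope 'inpt' (input), element 0.
    AudioObjectPropertyAddress streams = {
        kAudioDevicePropertyStreams,
        kAudioDevicePropertyScopeInput,
        kAudioObjectPropertyElementMaster
    };
    UInt32 dataSize = 0;
    err = AudioObjectGetPropertyDataSize(device, &streams, 0, NULL, &dataSize);
    printf("GetPropertyDataSize('stm#','inpt') -> err %d, size %u\n",
           (int)err, (unsigned)dataSize);
    return 0;
}
```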
Jun ’20
Reply to How to measure kAudioDevicePropertyIOCycleUsage with asymmetric cores?
Some testing on actual hardware seems to indicate that the IOProcs (which default to the workgroup of their respective device) end up on the performance cores. What I'd like is to force the scheduler to run my benchmark either on the same cores the IOProcs will run on, *or* onto the efficiency cores (the latter would increase latency, but at least it wouldn't matter where the actual IOProcs are scheduled, since I'd always have enough leeway). But as far as I'm aware, macOS doesn't have any "core affinity" APIs. As my benchmark is triggered in response to a UI interaction, I'd assume it runs with a high QoS (user-interactive), so currently it's probably also on the performance cores...
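Just to illustrate what I mean by "forcing" the benchmark onto the efficiency cores: the closest thing I know of is running it on a thread with a low QoS class, which (as far as I understand, and this is an assumption, not a documented guarantee) only *biases* the scheduler toward the efficiency cores. A minimal sketch:

```c
// Sketch: run the benchmark on a QOS_CLASS_BACKGROUND thread, assuming a low
// QoS class biases scheduling toward the efficiency cores (no hard guarantee).
#include <pthread.h>
#include <pthread/qos.h>

static void *benchmark_thread(void *ctx) {
    // ... run the synthetic audio workload here and time it ...
    return NULL;
}

int main(void) {
    pthread_attr_t attr;
    pthread_attr_init(&attr);
    // Strongest hint toward the efficiency cores.
    pthread_attr_set_qos_class_np(&attr, QOS_CLASS_BACKGROUND, 0);

    pthread_t t;
    pthread_create(&t, &attr, benchmark_thread, NULL);
    pthread_join(t, NULL);
    pthread_attr_destroy(&attr);
    return 0;
}
```

If that assumption holds, the measured time would be a worst-case bound regardless of where the IOProc itself lands, at the cost of leaving some latency on the table.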
Jul ’20
Reply to DTK Forums
The tag for the private DTK forum shows up in the menu that appears when you click on your profile avatar in the top right-hand corner (when logged into an account that was granted access). At least, that seems to be the case for me... :)
Jul ’20
Reply to macOS AudioUnit APIs called from within Python extension fail to see AudioUnits
I can think of two potential problems you may be running into:
1. By using /usr/bin/python (which is part of the system), your code runs under SIP (System Integrity Protection) and therefore in a more restrictive environment (certain dyld environment variables are filtered out). Maybe Python refuses to load/run non-system .dylibs, since otherwise they would run with the rights of a signed system process?
2. I seem to recall Python loading its extensions fairly restrictively (RTLD_LOCAL among other dlopen flags, but check the source); maybe that causes havoc with proper enumeration?
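One way to narrow it down: enumerate the AudioUnits from a stand-alone C program and compare with what your extension sees. This sketch is mine (file name and build command are made up), and just counts and lists the visible effect AudioUnits; if it sees them but the Python extension doesn't, the problem is in how Python hosts the extension rather than in the AudioUnit APIs themselves:

```c
// Sketch: count and list the registered effect AudioUnits from plain C.
// Build with: clang au_count.c -framework AudioToolbox -o au_count
#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>

int main(void) {
    // Wildcard description: zeroed subtype/manufacturer match every effect unit.
    AudioComponentDescription desc = {
        .componentType = kAudioUnitType_Effect,
        .componentSubType = 0,
        .componentManufacturer = 0,
        .componentFlags = 0,
        .componentFlagsMask = 0
    };
    UInt32 count = AudioComponentCount(&desc);
    printf("visible effect AudioUnits: %u\n", (unsigned)count);

    // Walk the matching components and print their names.
    AudioComponent comp = NULL;
    while ((comp = AudioComponentFindNext(comp, &desc)) != NULL) {
        CFStringRef name = NULL;
        if (AudioComponentCopyName(comp, &name) == noErr && name != NULL) {
            char buf[256];
            if (CFStringGetCString(name, buf, sizeof(buf), kCFStringEncodingUTF8))
                printf("  %s\n", buf);
            CFRelease(name);
        }
    }
    return 0;
}
```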
Jul ’20
Reply to How to measure kAudioDevicePropertyIOCycleUsage with asymmetric cores?
I'll try to explain again: My app forwards audio from a (virtual) input device to a (real) output device. This process is driven by the IOProc of the output device. As I want to incur as little latency as possible (i.e. use the most up-to-date data from the input device), I want my IOProc to be scheduled as late as possible while still hitting the deadline (see the documentation for kAudioDevicePropertyIOCycleUsage - https://developer.apple.com/documentation/coreaudio/kaudiodevicepropertyiocycleusage - the header docs are more helpful, the web documentation seems useless). But I do have a bit of work to do on the data before it's ready, so I need to figure out what I can set kAudioDevicePropertyIOCycleUsage to and still hit the deadline (per the documentation, with the default IOCycleUsage the IOProc gets scheduled the entire duration of the audio cycle).

So far, I've been benchmarking how long my audio fiddling needs (on some synthetic input data) before starting the IOProc, then multiplying that by a safety factor and adjusting kAudioDevicePropertyIOCycleUsage accordingly. There are no other threads involved in this work (so the AudioWorkGroup APIs shouldn't be needed), but I'm wondering how to best estimate the IOCycleUsage now that cores of different speeds are in play. Either I benchmark on the performance cores (and then the IOProc should always run on the performance cores, or I need a safety factor for how much slower the efficiency cores are), or I force the benchmarking onto the efficiency cores (that way I'll never be late, but I might leave some latency on the table). Does that make some sense?
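To make the "benchmark, multiply by a safety factor, set the property" part concrete, here's a sketch of what I'm doing (the function name, the measured time, the safety factor and the device ID are all placeholders of mine):

```c
// Sketch: derive kAudioDevicePropertyIOCycleUsage from a measured worst case.
// cycleDurationSeconds is typically buffer frame size / sample rate.
#include <CoreAudio/CoreAudio.h>

static OSStatus set_io_cycle_usage(AudioObjectID outputDevice,
                                   double worstCaseWorkSeconds,
                                   double safetyFactor,
                                   double cycleDurationSeconds) {
    // Fraction of the IO cycle reserved for our work; 1.0 (the default) means
    // the IOProc is called at the start of the cycle, smaller values mean it
    // is called later, reducing latency to the freshest input data.
    double reserved = (worstCaseWorkSeconds * safetyFactor) / cycleDurationSeconds;
    if (reserved > 1.0) reserved = 1.0;   // can't reserve more than the cycle
    Float32 usage = (Float32)reserved;

    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyIOCycleUsage,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    return AudioObjectSetPropertyData(outputDevice, &addr, 0, NULL,
                                      sizeof(usage), &usage);
}
```

The open question is only which cores worstCaseWorkSeconds should be measured on.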
Jul ’20
Reply to What kind of Apple Developer Certificate is required to create a custom virtual audio device using Core Audio?
From what I can tell (as I'm doing the same thing), you don't need a special entitlement, because an AudioServerPlugIn is not an app extension but a completely stand-alone plug-in. Whether that plug-in needs to be signed with a Developer ID (or notarized) I'm actually not sure about (because AudioServerPlugIns are already sandboxed). FWIW, I'm doing that (i.e. signing it) with a normal (paid) Developer ID and it seems to work for other people, although setting up the whole process is fraught with potential for errors. I'm not distributing via the App Store, but if you were, you'd have to install your plug-in using the normal mechanism (Apple Installer, or maybe manually copying it to /Library/Audio/Plug-Ins/HAL), and I'm not sure that's encouraged (allowed?) for App Store apps.
Dec ’20
Reply to kAudioChannelLayoutTag_TMH_10_2_full has HI and VI channels
I did some googling and my current guess is that these channels contain content for Hearing Impaired listeners (HI, a mix with emphasis on dialog) and for Visually Impaired listeners (VI, audio description). The mention I found was https://www.isdcf.com/papers/ISDCF-Doc4-Audio-channel-recommendations.pdf, which doesn't seem directly related to the TMH format (documentation for that seems to be rather sparse), but it is at least related to multi-channel audio.
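If you want to see what CoreAudio itself reports for that layout, here's a small sketch of mine (build command made up) that expands kAudioChannelLayoutTag_TMH_10_2_full into per-channel descriptions; whether the VI channel shows up as kAudioChannelLabel_Narration is my assumption, the HI channel should be kAudioChannelLabel_HearingImpaired:

```c
// Sketch: dump the channel labels CoreAudio assigns to TMH 10.2 (full).
// Build with: clang tmh_labels.c -framework AudioToolbox -o tmh_labels
#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    AudioChannelLayoutTag tag = kAudioChannelLayoutTag_TMH_10_2_full;

    // Ask how big the expanded AudioChannelLayout for this tag is...
    UInt32 size = 0;
    OSStatus err = AudioFormatGetPropertyInfo(kAudioFormatProperty_ChannelLayoutForTag,
                                              sizeof(tag), &tag, &size);
    if (err != noErr) { fprintf(stderr, "info err %d\n", (int)err); return 1; }

    // ...then fetch the per-channel descriptions.
    AudioChannelLayout *layout = calloc(1, size);
    err = AudioFormatGetProperty(kAudioFormatProperty_ChannelLayoutForTag,
                                 sizeof(tag), &tag, &size, layout);
    if (err != noErr) { fprintf(stderr, "get err %d\n", (int)err); free(layout); return 1; }

    for (UInt32 i = 0; i < layout->mNumberChannelDescriptions; i++) {
        AudioChannelLabel label = layout->mChannelDescriptions[i].mChannelLabel;
        const char *note = "";
        if (label == kAudioChannelLabel_HearingImpaired) note = " (HI)";
        if (label == kAudioChannelLabel_Narration)       note = " (VI / narration?)";
        printf("channel %2u: label %u%s\n", (unsigned)i, (unsigned)label, note);
    }
    free(layout);
    return 0;
}
```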
Dec ’20