
Audio Unit hosting: Mysterious error thrown when key presses are dispatched (Is anyone from Apple out there?) ...
In our audio unit host, we're seeing the following error thrown when key presses are dispatched to an audio unit view. Does anyone know what it means and, more importantly, what might be causing it? I can only guess that perhaps an object has been released or not retained properly somewhere, but I cannot find a solution.

This is the error message as caught in the debugger:

Thread 1: "assertion failed: '<AUAudioUnitRemoteViewController: 0x600002b00c60> does not conform to AUAudioUnitViewControllerHostInterface' in -[NSRemoteView ensureClientExportedObject] on line 7080 of file /AppleInternal/Library/BuildRoots/66382bca-8bca-11ec-aade-6613bcf0e2ee/Library/Caches/com.apple.xbs/Sources/ViewBridge/NSRemoteView.m"
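For context, the way we obtain and embed the unit's view is roughly as follows. This is a minimal sketch rather than our actual code; the component description and hostViewController are placeholders, and the remote-proxy comment reflects my understanding, not anything documented:

    import AVFoundation
    import AppKit

    // Minimal sketch (the component description and hostViewController are
    // placeholders). For an out-of-process unit, requestViewController seems
    // to hand back a remote proxy — which is where the
    // AUAudioUnitRemoteViewController named in the assertion appears to
    // come from.
    func embedAUView(for desc: AudioComponentDescription, in hostViewController: NSViewController) {
        AVAudioUnit.instantiate(with: desc, options: .loadOutOfProcess) { avUnit, _ in
            guard let auAudioUnit = avUnit?.auAudioUnit else { return }
            auAudioUnit.requestViewController { viewController in
                guard let vc = viewController else { return }
                // Key presses are dispatched to vc.view once it is embedded.
                hostViewController.addChild(vc)
                hostViewController.view.addSubview(vc.view)
            }
        }
    }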
Replies: 2 · Boosts: 0 · Views: 1.4k · Apr ’22
MIDI 2 (UMP) equivalent for AUScheduleMIDIEventBlock?
We have an audio app that utilises a custom internal audio unit attached to an AVAudioEngine to do DSP processing. Currently, MIDI arrives at the app's input port (created with MIDIDestinationCreateWithProtocol), and for MIDI 1 we use AUScheduleMIDIEventBlock to pass the events from the MIDI input to the audio unit (sketched below). All works well for MIDI 1.

So while we ponder it ourselves, we have some questions to throw into the ether:

a) For MIDI 2 there appears to be no equivalent of AUScheduleMIDIEventBlock for sending UMP to an audio unit?

b) We initially chose the audio unit approach because MIDI and audio processing are all handled neatly, but is this approach essentially redundant? Would it be better to put a tap somewhere on the AVAudioEngine and pass MIDI 2 events directly from the input to the tap? I fear that synchronising MIDI to audio nicely would then be a pain.

c) Perhaps we should wait until Apple implements a UMP version of AUScheduleMIDIEventBlock?
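For reference, the MIDI 1 path mentioned above looks roughly like this. A minimal sketch, with dspUnit standing in for the custom AVAudioUnit attached to our engine:

    import AVFoundation
    import AudioToolbox

    // Minimal sketch of the MIDI 1 path (`dspUnit` is a placeholder for the
    // custom AVAudioUnit attached to the AVAudioEngine). Incoming MIDI 1
    // bytes are handed to the unit's scheduleMIDIEventBlock;
    // AUEventSampleTimeImmediate schedules them as soon as possible.
    func forward(midi1Bytes bytes: [UInt8], to dspUnit: AVAudioUnit) {
        guard let scheduleMIDIEvent = dspUnit.auAudioUnit.scheduleMIDIEventBlock else { return }
        bytes.withUnsafeBufferPointer { buffer in
            scheduleMIDIEvent(AUEventSampleTimeImmediate, 0, buffer.count, buffer.baseAddress!)
        }
    }

    // e.g. a note-on: status 0x90, middle C, velocity 100
    // forward(midi1Bytes: [0x90, 60, 100], to: dspUnit)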
Replies: 2 · Boosts: 0 · Views: 1.4k · Aug ’21
Compiler Settings: -fno-fast-math vs "Enable Fast Math"
In Xcode's Metal Compiler Build Options there is a built-in field for Enable Fast Math, which is NO by default.

I'm working on a shader for an iOS app that exhibited drastically different behaviour on different iPads. The error looked like a math precision issue, so, as the shader doesn't need to be super fast, as a test I added -fno-fast-math to Xcode's Other Metal Compiler Flags, and sure enough the issue appears to be cured.

My question is: is this a bug in Xcode, or is Enable Fast Math a different option from -fno-fast-math (I don't imagine it is)? Or could my issue be a result of some other difference between devices that I'm not aware of? However, like I say, the issue appears to be cured by explicitly declaring -fno-fast-math in the build settings.

Xcode Version 11.4 (11E146). Various iPads, running iOS 11, 12 and 13.
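For anyone trying to reproduce this with runtime shader compilation, the counterpart of the build setting looks like this. A sketch, where shaderSource is a placeholder; as far as I understand, fastMathEnabled = false corresponds to -fno-fast-math:

    import Metal

    // Sketch: compile a shader at runtime with fast math disabled
    // (`shaderSource` is a placeholder for the MSL source string).
    func makeLibrary(from shaderSource: String) throws -> MTLLibrary? {
        guard let device = MTLCreateSystemDefaultDevice() else { return nil }
        let options = MTLCompileOptions()
        options.fastMathEnabled = false   // runtime equivalent of -fno-fast-math
        return try device.makeLibrary(source: shaderSource, options: options)
    }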
Replies: 2 · Boosts: 0 · Views: 2.4k · Apr ’20
How to specify bit rate when writing with AVAudioFile
I can currently write, using AVAudioFile, to any of the file formats specified by Core Audio. It can create files in all formats (except one, see below) that can be read into iTunes, QuickTime and other apps and played back. However, some formats appear to be ignoring values in the AVAudioFile settings dictionary. For example:

• An MP4 or AAC file will save and write successfully at any sample rate, but any bit rates I attempt to specify are ignored.
• Wave files saved with floating-point data are always converted to Int32 even though I specify float, and even though the PCM buffers I'm using as input and output for sample rate conversion are float on both sides. So the AVAudioFile is taking float input but converting it to int for some reason I can't fathom.
• The only crash/exception/failure I see is if I attempt to create an AVAudioFile as WAV/64-bit float... bang, AVAudioFile isn't having that one!

The technique I'm using is (sketched in code below):

• Create an AVAudioFile for writing with a settings dictionary.
• Get the processing and file formats from the AVAudioFile. The client format is always 32-bit float; the AVAudioFile generally reports its processing format as some other word-sized float format at the sample rate and size I've specified in the fileFormat.
• Create a converter to convert from the client format to the processing format.
• Process input data through the converter to the file using converter.convert(to:error:withInputFrom:).

So this works... sort of. The files (be they WAV, AIFF, FLAC, MP3, AAC, MP4 etc.) are written out and will play back just fine... but if the processing word format is float, in a PCM file like WAV, the AVAudioFile will always report its fileFormat as Int32. And if the file is a compressed format such as MP4/AAC, any bit rates I attempt to specify are just ignored, although the sample rate appears to be respected, as if the converters/encoders just choose a bit rate based on the sample rate.

So after all that waffle, I've probably missed something that's meant to be obvious. My questions are:

• For LPCM float formats, why is Int32 data written even though the AVAudioFile settings dictionary has AVLinearPCMIsFloatKey set to true?
• How do I arrange the setup so that I can specify the bit rate for compressed audio?

The only buffers I actually create are both PCM: the client output buffer, and the AVAudioConverter/AVAudioFile processing buffer. I've attempted using AVAudioCompressedBuffer but haven't had any luck. I hope someone has some clues, because I've spent more hours on this than anyone should ever need to! For my Christmas present I'd like Core Audio to be fully and comprehensively documented, please!
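To make the setup concrete, here's a minimal sketch of what I'm doing. The URL and formats are placeholders, and AVEncoderBitRateKey is the value that appears to be ignored:

    import AVFoundation

    // Sketch of the setup described above (URL and formats are placeholders).
    func makeAACFile(at url: URL) throws -> (AVAudioFile, AVAudioConverter)? {
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100.0,
            AVNumberOfChannelsKey: 2,
            AVEncoderBitRateKey: 128_000    // requested, but apparently ignored
        ]
        let file = try AVAudioFile(forWriting: url, settings: settings)

        // Client format is always 32-bit float; convert to the file's
        // processing format, then write via convert(to:error:withInputFrom:).
        guard let clientFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2),
              let converter = AVAudioConverter(from: clientFormat, to: file.processingFormat)
        else { return nil }
        return (file, converter)
    }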
Replies: 3 · Boosts: 0 · Views: 4.5k · Dec ’18
Recreating AUParameterTree without destroying the tree object? Is it possible?
I have a situation where at any one time my AudioUnit can be represented by only a few parameters or literally many hundreds. This dynamic situation is under the control of the user, and the maximum number of parameters and their hierarchy cannot be predicted in advance (at least not accurately).

When the parameters are to change, I set the parameterTree property by creating a new tree with the child nodes and posting KVO notifications:

    // ... create childGroups ...
    [self willChangeValueForKey:@"parameterTree"];
    self.parameterTree = [AUParameterTree createTreeWithChildren:childGroups];
    [self didChangeValueForKey:@"parameterTree"];

Most of the app's user interface and AudioUnit are coded in Swift; the engine is coded in C/C++ with an Objective-C AUAudioUnit class that acts as the go-between, hence the above.

However, there is a popular host app that crashes when I do this, and it looks like the host is hanging on to the parameterTree object that it's passed originally when the AU first launches, but never updates it, even after the KVO notifications are posted.

So after that long explanation... Am I doing this correctly? Or is there a solution that can create and rebuild a tree without making a new AUParameterTree object? If I can do that, the host in question may not crash (although it might anyway, because all the parameters have changed).

I have posted a code example to the host developer but sadly got a response which gave me the impression he was not prepared to work on a fix. Thanks!
Replies: 11 · Boosts: 0 · Views: 2.9k · Feb ’17