I'm trying to create a simple pure-MIDI Audio Unit (AUv3) that could act as a pipe between, for example, an AVMusicTrack (played by an AVAudioSequencer) and an AVAudioUnitSampler.
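For context, here is roughly how I wire everything up (the component subtype and manufacturer codes below are placeholders for my own AU):

```swift
import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()
engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)

// Description of my MIDI-processor extension; subtype/manufacturer are placeholders.
let desc = AudioComponentDescription(
    componentType: kAudioUnitType_MIDIProcessor,  // 'aumi'
    componentSubType: 0x64656D6F,                 // placeholder
    componentManufacturer: 0x44656D6F,            // placeholder
    componentFlags: 0,
    componentFlagsMask: 0
)

AVAudioUnit.instantiate(with: desc, options: []) { avAudioUnit, _ in
    guard let midiProcessor = avAudioUnit else { return }
    engine.attach(midiProcessor)
    // This is the connection that raises IsMIDIProcessorNode() for non-'aumi' units.
    engine.connectMIDI(midiProcessor, to: sampler, format: nil, block: nil)
}
```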
I used the default Audio Unit Extension template generated by Xcode 13.2.1 and modified just a few things:
- My audio unit has type kAudioUnitType_MIDIProcessor (aumi); from what I read it's the right candidate, and the only one I can connect to a sampler. With any other type, the app crashes with the message Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: graphNodeSrc->IsMIDIProcessorNode()'. Maybe I'm missing something here, any suggestion?
- I overrode the handleMIDIEvent() function in my AU's DSP kernel just to print something when I receive MIDI events:
```cpp
void handleMIDIEvent(const AUMIDIEvent& midiEvent) override {
    // Debug only: logging from the audio thread is not real-time safe. Needs <iostream>.
    std::cout << "MIDI Event" << std::endl;
}
```
- I declared MIDIOutputNames properly (a minimal sketch of the override follows this list).
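For reference, the override looks roughly like this in the AUAudioUnit subclass (the output name is arbitrary):

```swift
// In the AUAudioUnit subclass generated by the template.
public override var midiOutputNames: [String] {
    ["MIDI Out"]  // one name per MIDI output the AU exposes
}
```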
My goal is to follow the MIDI playing context from inside this AU and edit some note messages depending on that context.
My host provides AUMIDIOutputEventBlock, AUHostTransportStateBlock and AUHostMusicalContextBlock, so the AU can read the transport state and musical context and output MIDI to the host.
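On the AU side I cache those blocks once render resources are allocated; a minimal sketch, with cached-property names of my own choosing:

```swift
// Inside the AUAudioUnit subclass; the cached-property names are mine.
private var musicalContext: AUHostMusicalContextBlock?
private var transportState: AUHostTransportStateBlock?
private var midiOutput: AUMIDIOutputEventBlock?

public override func allocateRenderResources() throws {
    try super.allocateRenderResources()
    // The host installs these blocks before rendering starts; they are nil at init.
    musicalContext = musicalContextBlock
    transportState = transportStateBlock
    midiOutput = midiOutputEventBlock
}

func currentTempoAndBeat() -> (Double, Double)? {
    var tempo = 0.0
    var beat = 0.0
    // Query the host's musical context (tempo in BPM, position in beats).
    guard let context = musicalContext,
          context(&tempo, nil, nil, &beat, nil, nil) else { return nil }
    return (tempo, beat)
}
```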
If I make my AU a kAudioUnitType_MusicDevice (aumu), I do receive note events from the music track (even though I then cannot connect my AU to the sampler, since it's not a MIDI processor). But as a MIDI processor, I receive nothing. Any clue why this is?
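In both configurations I route the sequencer's track to the AU the same way (midiFileURL and myAudioUnit stand in for my actual file and the instantiated unit):

```swift
let sequencer = AVAudioSequencer(audioEngine: engine)
try sequencer.load(from: midiFileURL, options: [])
for track in sequencer.tracks {
    // With 'aumu' the AU receives the note events; with 'aumi' it receives nothing.
    track.destinationAudioUnit = myAudioUnit
}
sequencer.prepareToPlay()
try sequencer.start()
```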