AVAudioSequencer.tempoTrack not sending SetTempo messages
I'm trying to get tempo-related messages from AVAudioSequencer.tempoTrack.

```swift
import AVFoundation
import CoreMIDI

class TempoHandler {
    private var midiEndpoint = MIDIEndpointRef()
    private var midiClient = MIDIClientRef()
    private let sequencer: AVAudioSequencer

    public init(sequencer: AVAudioSequencer) {
        self.sequencer = sequencer
    }

    func setup() throws {
        // Create a virtual MIDI destination and point the tempo track at it.
        // checkStatus and SequencerMIDIContextError are my own helpers.
        var status = MIDIClientCreateWithBlock("TempoHandler.Client" as CFString, &midiClient, nil)
        try checkStatus(status: status, error: SequencerMIDIContextError.createMidiClient)
        status = MIDIDestinationCreateWithBlock(midiClient, "TempoHandler.In" as CFString, &midiEndpoint, midiIO)
        try checkStatus(status: status, error: SequencerMIDIContextError.createMidiDestination)
        sequencer.tempoTrack.destinationMIDIEndpoint = midiEndpoint
    }

    private func midiIO(_ packetList: UnsafePointer<MIDIPacketList>, _ src: UnsafeMutableRawPointer?) {
        // Print the status byte of every packet that arrives.
        let packets = packetList.pointee
        var packet = packets.packet
        for _ in 0 ..< packets.numPackets {
            print("TempoHandler received " + String(format: "%02X", packet.data.0))
            packet = MIDIPacketNext(&packet).pointee
        }
    }
}
```

My sequencer is fully loaded with a MIDI file before I call setup(), followed by sequencer.prepareToPlay(), if that matters. The MIDI file contains multiple SetTempo meta messages, which the sequencer audibly handles well. When I instead set my midiEndpoint as the destination of another track (sequencer.tracks[0]), my handler does print note messages. But with the tempo track, no MIDI message is ever received. Is there something I'm missing?
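For reference, here's a workaround sketch I'm considering while the endpoint stays silent (assuming I keep a URL to the same MIDI file around; printTempoEvents is just a hypothetical helper): load the file into a raw AudioToolbox MusicSequence and walk its tempo track with a MusicEventIterator, since SetTempo metas are imported as kMusicEventType_ExtendedTempo events.

```swift
import AudioToolbox
import Foundation

// Hypothetical helper: loads the same MIDI file into a raw MusicSequence
// and prints every ExtendedTempoEvent on the tempo track.
func printTempoEvents(fileURL: URL) {
    var sequence: MusicSequence?
    NewMusicSequence(&sequence)
    guard let sequence = sequence else { return }
    defer { DisposeMusicSequence(sequence) }
    MusicSequenceFileLoad(sequence, fileURL as CFURL, .midiType, [])

    var tempoTrack: MusicTrack?
    MusicSequenceGetTempoTrack(sequence, &tempoTrack)
    guard let tempoTrack = tempoTrack else { return }

    var iterator: MusicEventIterator?
    NewMusicEventIterator(tempoTrack, &iterator)
    guard let iterator = iterator else { return }
    defer { DisposeMusicEventIterator(iterator) }

    // Walk every event on the tempo track, keeping only tempo changes.
    var hasEvent = DarwinBoolean(false)
    MusicEventIteratorHasCurrentEvent(iterator, &hasEvent)
    while hasEvent.boolValue {
        var beat = MusicTimeStamp(0)
        var type = MusicEventType(0)
        var data: UnsafeRawPointer?
        var size = UInt32(0)
        MusicEventIteratorGetEventInfo(iterator, &beat, &type, &data, &size)
        if type == kMusicEventType_ExtendedTempo, let data = data {
            let bpm = data.assumingMemoryBound(to: ExtendedTempoEvent.self).pointee.bpm
            print("Tempo change at beat \(beat): \(bpm) BPM")
        }
        MusicEventIteratorNextEvent(iterator)
        MusicEventIteratorHasCurrentEvent(iterator, &hasEvent)
    }
}
```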
3 replies · 0 boosts · 1.4k views · Mar ’22
MIDI processor passthru AudioUnit doesn't receive MIDI events
I'm trying to create a simple pure-MIDI Audio Unit (AUv3) that could act as a pipe between, for example, an AVMusicTrack (played by an AVAudioSequencer) and an AVAudioUnitSampler. I used the default audio extension template generated by Xcode 13.2.1 and modified just a few things:

My audio unit has type kAudioUnitType_MIDIProcessor (aumi). From what I read, it's the right candidate and the only type I can connect to a sampler; anything else crashes the app with:

Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: graphNodeSrc->IsMIDIProcessorNode()'

Maybe I'm missing something here; any suggestion?

I overrode the handleMIDIEvent() function in my AU's DSP kernel just to print something when I receive MIDI events:

```cpp
void handleMIDIEvent(const AUMIDIEvent &midiEvent) override {
    // Log every incoming event; this never fires when the AU is an aumi.
    cout << "MIDI Event" << endl;
}
```

I declared MIDIOutputNames properly.

My goal is to follow the MIDI playing context from this AU and edit some note messages depending on that context. My host provides AUMIDIOutputEventBlock, AUHostTransportStateBlock, and AUHostMusicalContextBlock, so the AU can read the transport state and musical context and output MIDI back to the host.

If I make my AU a kAudioUnitType_MusicDevice (aumu), I do receive note events from the music track (even though I then cannot connect it to the sampler, since it's not a MIDI processor). But as a MIDI processor, I receive nothing. Any clue why this is?
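For completeness, this is roughly how my host side is wired; a simplified sketch (wire(...) is a hypothetical helper, and I'm assuming the sequencer was created with the same engine). The track's destinationAudioUnit points at the processor, and connectMIDI is meant to carry the processor's MIDI output on to the sampler:

```swift
import AVFoundation

// Simplified host-side wiring: sequencer track -> MIDI processor -> sampler.
func wire(engine: AVAudioEngine,
          sequencer: AVAudioSequencer,
          midiProcessor: AVAudioUnit,
          sampler: AVAudioUnitSampler) {
    engine.attach(midiProcessor)
    engine.attach(sampler)
    // The sampler produces audio, so it joins the audio graph as usual.
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)

    // Send the music track's MIDI into the processor...
    sequencer.tracks[0].destinationAudioUnit = midiProcessor
    // ...and forward the processor's MIDI output to the sampler.
    engine.connectMIDI(midiProcessor, to: sampler, format: nil, block: nil)
}
```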
0 replies · 0 boosts · 1.2k views · Mar ’22