Using GPTK beta 3, when launching Steam from a Sonoma b5 VM (created with the latest UTM, 4.3.5), it says:
D3DM: D3DMetal requires Apple silicon and macOS 14.0 Sonoma or higher
Command used to launch Steam:
gameportingtoolkit ~/my-game-prefix 'C:\Program Files (x86)\Steam\steam.exe'
GPTK was compiled/installed fine using x86 Homebrew and the Xcode 15 beta 6 command-line tools.
gameportingtoolkit was copied to /usr/local/bin so the GPTK disk image could be unmounted.
This is on an M2 Pro Mac mini (12-core CPU / 19-core GPU, 32 GB RAM), with 8 performance cores and 20 GB of RAM allocated to the VM.
I created a multi-timbral instrument application based on multiple AVAudioUnitSampler instances (one per MIDI channel), wrapped in a custom AVSampler class.
I also want to expose it as an AUv3. Following some articles and samples, I put the view controller and other classes in a framework target and created an AudioUnit extension target (with a dummy/empty class file, as I have no implementation to provide).
In the extension's Info.plist (under NSExtensionAttributes) I added AudioComponentBundle (pointing to the AU framework) and an AudioComponents item with a factoryFunction (pointing to $(PRODUCT_MODULE_NAME).MultiSamplerViewController) and the aumu type. I also added NSExtensionPrincipalClass, pointing to AUFramework.MultiSamplerViewController.
In the shared MultiSamplerViewController I implemented:
- (AUAudioUnit *)createAudioUnitWithComponentDescription:(AudioComponentDescription)desc error:(NSError **)error {
    return [[[multiSampler engine] outputNode] AUAudioUnit];
}
It also contains an - (id)initWithCoder:(NSCoder *)decoder method that instantiates the wrapping MultiSampler and starts an enclosed MidiManager.
The host application target runs fine; however, the AU extension plug-in isn't listed in GarageBand (even after running the host application once). The target platform is iPad.
I added code to load the appex plug-in bundle, but that doesn't seem to be enough to register the plug-in. I also cannot use +[AUAudioUnit registerSubclass:] since I have no concrete AU implementation class (could I pass [[[multiSampler engine] outputNode] AUAudioUnit]?).
I'm in the same configuration as an application built on the AudioKit framework (which originally wrapped AVAudioUnitSampler and now uses a custom implementation).
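For what it's worth, registering via +[AUAudioUnit registerSubclass:] does require a concrete AUAudioUnit subclass. A minimal sketch of that route follows; the MultiSamplerAU class and the four-character component codes are hypothetical, not from the original post:

```objc
#import <AudioToolbox/AudioToolbox.h>

// Hypothetical thin subclass: it exists only so +registerSubclass: has a
// concrete class to instantiate. A real implementation would override
// internalRenderBlock (or wrap the engine's output unit).
@interface MultiSamplerAU : AUAudioUnit
@end

@implementation MultiSamplerAU
@end

static void RegisterMultiSamplerAU(void) {
    AudioComponentDescription desc = {
        .componentType = kAudioUnitType_MusicDevice,  // aumu
        .componentSubType = 'msmp',                   // hypothetical code
        .componentManufacturer = 'Demo',              // hypothetical code
        .componentFlags = 0,
        .componentFlagsMask = 0
    };
    [AUAudioUnit registerSubclass:[MultiSamplerAU class]
           asComponentDescription:desc
                             name:@"Demo: MultiSampler"
                          version:1];
}
```

Note that this only registers the unit in-process (visible to the host app itself); out-of-process visibility in hosts like GarageBand still goes through the appex's Info.plist.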
I found information about creating a network session in the Audio MIDI Setup utility and clicking Connect once the iOS Simulator appears in the Directory. I left the port unchanged (5004).
In my app code I created an input port and connected it to the default network session:
MIDIClientRef client = 0;
MIDIClientCreate(CFSTR("MIDI client"), NULL, NULL, &client); // the client must exist before creating ports
MIDIPortRef inPort = 0;
MIDIInputPortCreate(client, CFSTR("Input port"), MyMIDIReadProc, (__bridge void * _Nullable)player, &inPort);
MIDINetworkSession *session = [MIDINetworkSession defaultSession];
session.enabled = YES;
session.connectionPolicy = MIDINetworkConnectionPolicy_Anyone;
MIDIPortConnectSource(inPort, session.sourceEndpoint, (__bridge void * _Nullable)player);
When I send MIDI messages (from Logic, using Network Session 1 as the port), the iOS app's MIDI handler is not called.
Some say we also have to create an output port, but I don't think that is related.
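For reference, a minimal skeleton of the MyMIDIReadProc referenced above (a sketch, assuming the refCon wiring shown in the setup code; the parsing is simplified and ignores SysEx and running status):

```objc
#import <CoreMIDI/CoreMIDI.h>

static void MyMIDIReadProc(const MIDIPacketList *pktList,
                           void *readProcRefCon, void *srcConnRefCon) {
    // Runs on a CoreMIDI thread, not the main thread.
    const MIDIPacket *packet = &pktList->packet[0];
    for (UInt32 i = 0; i < pktList->numPackets; i++) {
        if (packet->length >= 3) {
            UInt8 status = packet->data[0];
            UInt8 data1  = packet->data[1];
            UInt8 data2  = packet->data[2];
            // Forward to the player object passed as refCon; the handler
            // method name here is hypothetical.
            // [(__bridge Player *)readProcRefCon handleStatus:status
            //                                           data1:data1
            //                                           data2:data2];
            (void)status; (void)data1; (void)data2;
        }
        packet = MIDIPacketNext(packet);
    }
}
```

If this proc is never entered at all, the usual suspects are the session not being enabled before MIDIPortConnectSource, or the Mac side not actually connected to the session in Audio MIDI Setup.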
I created a multitimbral sampler based on 16 instances of AVAudioUnitSampler (one per MIDI channel/part).
It plays fine when receiving MIDI messages (or using the application's virtual keyboard) on a single part.
However, I tried to play a MIDI file using AVAudioSequencer (assigning each AVMusicTrack destination to the corresponding AVAudioUnitSampler instance). The file uses 3 parts: one with a pad (samples up to 30 s, a 100 MB aupreset, many simultaneous notes (up to 6) with many sustain messages), one with a bass sound (a single note at a time, 10 MB aupreset), and one with a lead sound (also one note at a time).
Some notes are cut off before their end or do not play at all (mainly in the third part), as if no resources were left. I'm using the Simulator and can't test on my real iPad any more (it won't boot and will need repair or replacement).
The Xcode monitoring tab shows only 2 to 3 percent CPU used (and 60 MB of memory). However, the Simulator runs on an old Mac (mid-2010 Mac mini, Core 2 Duo 2.4 GHz).
Is AVAudioUnitSampler suited to being used this way, or should I subclass AVAudioUnitMIDIInstrument (creating an audio unit with the kAudioUnitSubType_MIDISynth subtype as detailed in Gene De Lisa's blog post, and loading a SoundFont bank using kMusicDevicePropertySoundBankURL)? Then the only way to change a part's instrument would be to send a program change to the AVAudioUnitMIDIInstrument subclass? I don't know how.
Or should I use kAudioUnitSubType_DLSSynth?
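As a sketch of the MIDISynth route mentioned above (following Gene De Lisa's approach; the "gs.sf2" resource name and the program/channel numbers are placeholder values):

```objc
#import <AVFoundation/AVFoundation.h>

// kAudioUnitSubType_MIDISynth responds to bank-select/program-change
// messages, so one instance can cover several parts/channels.
AudioComponentDescription desc = {
    .componentType = kAudioUnitType_MusicDevice,
    .componentSubType = kAudioUnitSubType_MIDISynth,
    .componentManufacturer = kAudioUnitManufacturer_Apple
};
AVAudioUnitMIDIInstrument *synth =
    [[AVAudioUnitMIDIInstrument alloc] initWithAudioComponentDescription:desc];

// Load a SoundFont bank (placeholder resource name).
NSURL *bankURL = [[NSBundle mainBundle] URLForResource:@"gs" withExtension:@"sf2"];
AudioUnitSetProperty(synth.audioUnit,
                     kMusicDevicePropertySoundBankURL,
                     kAudioUnitScope_Global, 0,
                     &bankURL, sizeof(bankURL));

// Changing a part's instrument is then just a program change on its channel:
[synth sendProgramChange:38 onChannel:2]; // example GM program/channel values
```

Gene De Lisa's post also covers preloading patches (kAUMIDISynthProperty_EnablePreload) before starting a sequence, which may matter with large banks.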
I want to save a configuration file (a property list / NSDictionary) to the application's local Documents folder (allowing the user to choose a new file name):
NSURL *temporaryDirectoryURL = [NSURL fileURLWithPath:NSTemporaryDirectory() isDirectory:YES];
NSURL *tempUrl = [temporaryDirectoryURL URLByAppendingPathComponent:@"tempPerf.plist"];
[self.partsConfigs writeToURL:tempUrl atomically:YES];
UIDocumentInteractionController *interactionController = [UIDocumentInteractionController interactionControllerWithURL:tempUrl];
[interactionController setDelegate:self];
[interactionController setUTI:@"com.apple.property-list"];
[interactionController setName:tempUrl.lastPathComponent];
[interactionController presentOptionsMenuFromRect:self.view.bounds inView:self.view animated:YES]; // bounds, not frame, for a rect in self.view's coordinate space
However, the controller's view doesn't show, and the log mentions some private path:
[MC] System group container for systemgroup.com.apple.configurationprofiles path is /Users/.../Library/Developer/CoreSimulator/Devices/.../data/Containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
[MC] Reading from private effective user settings.
I also tried using another public temp path (and got the same logs):
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"tempPerf2.plist"];
NSURL *tempUrl = [NSURL fileURLWithPath:path];
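One common cause of the options menu never appearing is that the interaction controller is held only in a local variable, so ARC releases it before presentation. A sketch of the fix, assuming a property on the presenting view controller:

```objc
// Keep a strong reference for the lifetime of the menu.
@property (nonatomic, strong) UIDocumentInteractionController *interactionController;

// ...

self.interactionController =
    [UIDocumentInteractionController interactionControllerWithURL:tempUrl];
self.interactionController.delegate = self;
self.interactionController.UTI = @"com.apple.property-list";
self.interactionController.name = tempUrl.lastPathComponent;
[self.interactionController presentOptionsMenuFromRect:self.view.bounds
                                                inView:self.view
                                              animated:YES];
```

The delegate can nil out the property when the interaction ends.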
I'm trying to save a configuration file (a property list / NSDictionary) to the application's local Documents folder.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"perf.plist"];
[self.partsConfigs writeToFile:path atomically:YES]; // partsConfigs is an NSDictionary
NSURL *url = [NSURL fileURLWithPath:path];
UIActivityViewController *activityViewController = [[UIActivityViewController alloc] initWithActivityItems:[NSArray arrayWithObjects:url, nil] applicationActivities:nil];
activityViewController.popoverPresentationController.sourceView = self.view;
[self presentViewController:activityViewController animated:YES completion:nil];
I see the popover activity view to select a destination; however, I don't see the local application's Documents folder.
And if I tap the "Save to Files" option I get this error (even though the file at the passed URL exists):
[ShareSheet] cancelled request - error: The operation couldn’t be completed. Invalid argument
I would also like the user to be able to select/change the file name.
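If the goal is specifically "save to a user-chosen place in Files", an export-mode document picker may be a better fit than the share sheet. A sketch (on iOS 14+, the initForExportingURLs: initializer replaces this one):

```objc
#import <UIKit/UIKit.h>

// Export mode copies the file to whatever folder the user picks in Files.
UIDocumentPickerViewController *picker =
    [[UIDocumentPickerViewController alloc] initWithURL:url
                                                 inMode:UIDocumentPickerModeExportToService];
picker.delegate = self;
[self presentViewController:picker animated:YES completion:nil];
```

The picker itself doesn't offer renaming; one option is to ask for the name first (e.g. in a UIAlertController text field) and write the temporary file under that name before presenting.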
My application writes a .plist config file to the Documents folder (in Library/Developer/CoreSimulator/Devices/<ID>/data/Containers/Data/Application/<ID>).
The file then doesn't show when using UIDocumentPickerViewController:
UIDocumentPickerViewController *documentPicker = [[UIDocumentPickerViewController alloc] initWithDocumentTypes:@[@"public.text"] inMode:UIDocumentPickerModeOpen];
documentPicker.delegate = self;
documentPicker.modalPresentationStyle = UIModalPresentationFormSheet;
[self presentViewController:documentPicker animated:YES completion:nil];
I also tested using "com.apple.property-list" as the document type filter (and UIDocumentPickerModeImport).
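Also worth checking: an app's own Documents folder only appears in Files (and in pickers browsing "On My iPad") when the app declares it. A sketch of the two Info.plist keys involved, assuming they aren't already set in this project:

```xml
<key>UIFileSharingEnabled</key>
<true/>
<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
```

Without these, UIDocumentPickerModeOpen only surfaces documents from other file providers, which would match the behavior described.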
Hi,
I'm developing an AUSampler-based application for iOS 12.4.x.
It manages incoming MIDI messages through a MIDIReadProc handler. Initially I sent notifications from this handler so it would respond faster and stay decoupled from the AU rendering thread.
However, I noticed stuck notes (as if some note-off messages weren't handled) when playing many notes repeatedly at the same time.
I then called MusicDeviceMIDIEvent directly from the MIDIReadProc handler, but it didn't fix the problem.
Can the AUSampler really not manage such a load?
Is there a new AUv3 sampler unit available that could help ?
I'm on High Sierra so I can't develop for iOS 13+.
I handle both note-off (0x8) and zero-velocity note-on (0x9) messages to stop notes (calling MusicDeviceMIDIEvent with a velocity of 0).
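For reference, one way to normalize the two "stop" forms before dispatch (a sketch; status, note, and velocity are assumed to be already parsed in the read proc, and samplerUnit to be the AUSampler's AudioUnit):

```objc
// Treat a velocity-0 note-on exactly like a note-off, and send an explicit
// note-off status so the sampler never has to interpret velocity 0 itself.
UInt8 type    = status >> 4;
UInt8 channel = status & 0x0F;
if (type == 0x8 || (type == 0x9 && velocity == 0)) {
    MusicDeviceMIDIEvent(samplerUnit, 0x80 | channel, note, 0, 0);
} else if (type == 0x9) {
    MusicDeviceMIDIEvent(samplerUnit, 0x90 | channel, note, velocity, 0);
}
```

This also guards against a source that mixes the two conventions within one stream, which is a classic cause of stuck notes.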