Post

Replies

Boosts

Views

Activity

After updating to Xcode 15.3, can't run app on My Mac (Designed for iPad) - error: attach by pid 'xxxxx' failed -- attach failed
I can run it on the simulators, and directly on physical devices, but not on My Mac (Designed for iPad). The error is:

error: attach by pid '33190' failed -- attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)

Checked the log and this is what I found:

error 12:44:36.527139-0500 mdbulkimport could not make proxies from uuids in optimized path! Error Domain=NSOSStatusErrorDomain Code=-10814 "Unable to find this application extension record in the Launch Services database." UserInfo={_LSFunction=, _LSLine=679, NSDebugDescription=Unable to find this application extension record in the Launch Services database., SK=, IS=0}
error 12:44:36.527174-0500 mdbulkimport Using expensive fallback path for obtaining plugin proxies from install notifications. This process should be entitled to use the LS database.
default 12:44:36.529748-0500 debugserver [LaunchAttach] (32518) about to task_for_pid(32517)
default 12:44:36.529778-0500 kernel macOSTaskPolicy: (com.apple.debugserver) may not get the task control port of (Piano Motifs) (pid: 32517): (Piano Motifs) is hardened, (Piano Motifs) doesn't have get-task-allow, (com.apple.debugserver) is a declared debugger(com.apple.debugserver) is not a declared read-only debugger
error 12:44:36.529795-0500 debugserver error: [LaunchAttach] MachTask::TaskPortForProcessID task_for_pid(32517) failed: ::task_for_pid ( target_tport = 0x0203, pid = 32517, &task ) => err = 0x00000005 ((os/kern) failure)

So the issue seems to be not finding the application's extension record in the Launch Services database. How is this problem solved?
3
0
1.6k
May ’24
MIDI file generated by using MusicSequenceFileCreate has Sysex MIDI message (iOS 16.0.2)
I am using the MusicSequenceFileCreate method to generate a MIDI file from a beat-based MusicSequence. On iOS 16.0.2 devices, the file that is created has a Sysex MIDI message added (not by me) at time 0:

f0 2a 11 67 40 40 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ff ff ff ff 00 00 00 00 00 00 00 00 00 00 f7

Sysex messages are manufacturer dependent; a file containing this Sysex message can't be read by apps like Nanostudio, Ableton, or Zenbeats, although GarageBand can read it. My app's deployment target is iOS 13.0. Has anybody else run into this issue? Thanks
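To narrow down where the Sysex comes from, a diagnostic sketch like the one below (assuming the beat-based sequence is available as a MusicSequence named sequence) walks every track and logs raw-data events before the file is written. If nothing is reported, the message is being injected by MusicSequenceFileCreate itself rather than being present in the sequence.

import AudioToolbox

// Diagnostic sketch: report any raw-data (Sysex) events already present in the sequence.
func logRawDataEvents(in sequence: MusicSequence) {
    var trackCount: UInt32 = 0
    MusicSequenceGetTrackCount(sequence, &trackCount)

    for index in 0..<trackCount {
        var track: MusicTrack?
        MusicSequenceGetIndTrack(sequence, index, &track)
        guard let track = track else { continue }

        var iterator: MusicEventIterator?
        NewMusicEventIterator(track, &iterator)
        guard let iterator = iterator else { continue }
        defer { DisposeMusicEventIterator(iterator) }

        var hasCurrent = DarwinBoolean(false)
        MusicEventIteratorHasCurrentEvent(iterator, &hasCurrent)
        while hasCurrent.boolValue {
            var timeStamp: MusicTimeStamp = 0
            var eventType: MusicEventType = 0
            var data: UnsafeRawPointer?
            var dataSize: UInt32 = 0
            MusicEventIteratorGetEventInfo(iterator, &timeStamp, &eventType, &data, &dataSize)

            if eventType == kMusicEventType_MIDIRawData {
                print("Raw-data (Sysex) event on track \(index) at beat \(timeStamp), \(dataSize) bytes")
            }

            MusicEventIteratorNextEvent(iterator)
            MusicEventIteratorHasCurrentEvent(iterator, &hasCurrent)
        }
    }
}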
1
0
1k
Oct ’22
Interface Builder doesn't honor Safe Area proportional alignment constraints
In Interface Builder, the layout of a UI element (let's say a button) doesn't change whether I make its alignment proportional to the Safe Area or proportional to the Superview. I have one button whose horizontal alignment is set proportionally to the Safe Area, and another button whose horizontal alignment is set proportionally to the Superview (constraint screenshots omitted). Both buttons end up aligned at the same horizontal position. I would have expected the button aligned to the Safe Area to be shifted to the right, since the Safe Area's leading edge is shifted to the right relative to the Superview's. I'm probably missing something but can't quite understand what is going on here. What makes this a problem is that heights and widths proportional to the Safe Area are honored, so the size of UI elements does change depending on whether they are proportional to the Safe Area or to the Superview. So when you lay out something with Safe Area proportional heights and widths, and also use Safe Area proportional horizontal and vertical placements, UI elements don't line up on iPhones with a notch. They roughly line up on devices like iPads and iPhones without a notch, where the Safe Area is very close to the Superview area.
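For comparison, here is a hedged programmatic sketch of a proportional horizontal placement expressed against the safe area layout guide (the button and view controller are hypothetical); built this way, the multiplier is applied to the safe area layout guide's trailing position rather than the superview's:

import UIKit

class ProportionalLayoutViewController: UIViewController {
    // Hypothetical button used only for illustration.
    let safeAreaButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        safeAreaButton.setTitle("Safe Area", for: .normal)
        safeAreaButton.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(safeAreaButton)

        // button.centerX = 0.5 x (x-position of the safe area's trailing edge),
        // so the placement tracks the safe area, not the superview.
        let proportionalX = NSLayoutConstraint(item: safeAreaButton,
                                               attribute: .centerX,
                                               relatedBy: .equal,
                                               toItem: view.safeAreaLayoutGuide,
                                               attribute: .trailing,
                                               multiplier: 0.5,
                                               constant: 0)
        NSLayoutConstraint.activate([
            proportionalX,
            safeAreaButton.centerYAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerYAnchor)
        ])
    }
}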
2
0
1.4k
Mar ’22
Programmatic use of Light Probes
When rendering a scene using environment lighting and the physically based lighting model, I need one object to reflect another object. As I understand it, with this type of rendering, reflections are based only on the environment lighting and nothing else. As a solution I was intending to use a light probe placed between the object to be reflected and the reflecting object. My scene has been developed programmatically, not through an Xcode scene file. From Apple's WWDC 2016 presentation on SceneKit I gathered that light probes can be updated programmatically through the updateProbes method of the SCNRenderer class. I have the following code, where I am trying to initialize a light probe using the updateProbes method:

let sceneView = SCNView(frame: self.view.frame)
self.view.addSubview(sceneView)
let scene = SCNScene()
sceneView.scene = scene

let lightProbeNode = SCNNode()
let lightProbe = SCNLight()
lightProbeNode.light = lightProbe
lightProbe.type = .probe
scene.rootNode.addChildNode(lightProbeNode)

var initLightProbe = true

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    if initLightProbe {
        initLightProbe = false
        let scnRenderer = SCNRenderer(device: sceneView.device, options: nil)
        scnRenderer.scene = scene
        scnRenderer.updateProbes([lightProbeNode], atTime: time)
        print("Initializing light probe")
    }
}

I don't seem to get any light coming from this light probe. My question is simple: can the updateProbes method be used to initialize a light probe? If not, how can you initialize a light probe programmatically?
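As a side note, here is a minimal, hedged material sketch (the sphere node is hypothetical, reusing the scene from the code above) of the kind of physically based surface a probe's contribution would feed into; without a material set up along these lines, the probe's effect may not be visible at all:

import SceneKit

// Hypothetical reflective object: a chrome-like sphere using the physically based
// lighting model, which is the shading path the post is already targeting.
let reflectiveSphere = SCNNode(geometry: SCNSphere(radius: 0.5))
let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.metalness.contents = 1.0   // fully metallic surfaces show reflections most clearly
material.roughness.contents = 0.1   // low roughness keeps reflections sharp
reflectiveSphere.geometry?.firstMaterial = material
scene.rootNode.addChildNode(reflectiveSphere)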
1
0
1.1k
Mar ’18
Timestamping in AUv3 MIDI plug-in
I am trying to understand how timestamping works for an AUv3 MIDI plug-in of type "aumi", where the plug-in sends MIDI events to a host. I cache the MIDIOutputEventBlock and the transportStateBlock properties into _outputEventBlock and _transportStateBlock in the allocateRenderResourcesAndReturnError method and use them in the internalRenderBlock method:

- (AUInternalRenderBlock)internalRenderBlock {
    // Capture in locals to avoid Obj-C member lookups. If "self" is captured in render, we're doing it wrong. See sample code.
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags, const AudioTimeStamp *timestamp, AVAudioFrameCount frameCount, NSInteger outputBusNumber, AudioBufferList *outputData, const AURenderEvent *realtimeEventListHead, AURenderPullInputBlock pullInputBlock) {
        // Transport state
        if (_transportStateBlock) {
            AUHostTransportStateFlags transportStateFlags;
            _transportStateBlock(&transportStateFlags, nil, nil, nil);

            if (transportStateFlags & AUHostTransportStateMoving) {
                if (!playedOnce) {
                    // Note on!
                    unsigned char dataOn[] = {0x90, 69, 96};
                    _outputEventBlock(timestamp->mSampleTime, 0, 3, dataOn);
                    playedOnce = YES;

                    // Note off, two seconds (96000 samples at 48 kHz) later
                    unsigned char dataOff[] = {0x80, 69, 0};
                    _outputEventBlock(timestamp->mSampleTime + 96000, 0, 3, dataOff);
                }
            }
            else {
                playedOnce = NO;
            }
        }

        return noErr;
    };
}

What this code is meant to do is play the A4 note on a synthesizer at the host for 2 seconds (the sampling rate is 48 kHz). What I get instead is a click sound. Experimenting some, I have tried delaying the start of the note-on MIDI event by offsetting the AUEventSampleTime passed to _outputEventBlock, but the click still sounds as soon as the play button is pressed on the host. Now, if I instead change the code to generate the note-off MIDI event when the transport state flags indicate the state is "not moving", then the note plays as soon as the play button is pressed and stops when the pause button is pressed, which is the correct behavior. This tells me that my understanding of the AUEventSampleTime parameter of MIDIOutputEventBlock is flawed and that it cannot be used to schedule MIDI events for the host simply by adding offsets to it. I see that there is another property, scheduleMIDIEventBlock, and I tried using that instead, but then no sound is played at all. Any clarification of how this all works would be greatly appreciated.
0
0
883
Jan ’21
Access to host MIDIOutputEventBlock in allocateRenderResources method of AUv3 app
I am using the AUv3 template that gets created by Xcode to implement a MIDI AUv3 plugin of type "aumi". For the plugin to be able to send MIDI to a host, it needs access to the MIDIOutputEventBlock provided by the host. I have done some research and found that this is done by caching the MIDIOutputEventBlock in the allocateRenderResourcesAndReturnError method:

_midiOut = self.MIDIOutputEventBlock;

and then using _midiOut in the internalRenderBlock method. The first thing is that the template that gets created doesn't have an allocateRenderResourcesAndReturnError method; there is only an allocateRenderResources method. When I put that code in this method I get a compile error that basically says the property is not found in an object of type xxxDSPKernelAdapter. I've seen in other examples (like Gene de Lisa's "Audio Units (AUv3) MIDI extension", a wonderful tutorial, by the way!) that the initial template from a couple of years ago was very different from what I have now, and that MIDIOutputEventBlock is actually defined in the AUAudioUnit.h header file, but in that case self is also a different class. I am very new at working with Objective-C, C++, and Swift in the same project, so I know my understanding of how this all works is minimal and very shallow. Any insight anybody could provide on this would be greatly appreciated.
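For reference, a hedged Swift sketch of the AUAudioUnit-subclass route (not the template's DSPKernelAdapter route), showing only the pieces relevant to getting at the block; MIDIOutputEventBlock is a property of AUAudioUnit, so a subclass can read it directly once render resources are allocated. The class name is hypothetical.

import AVFoundation
import AudioToolbox

class MIDIOutAudioUnit: AUAudioUnit {   // hypothetical subclass name
    private var cachedMIDIOutput: AUMIDIOutputEventBlock?

    // Declaring at least one MIDI output is what prompts a host to install a MIDIOutputEventBlock.
    override var midiOutputNames: [String] { ["MIDI Out"] }

    override func allocateRenderResources() throws {
        try super.allocateRenderResources()
        // Cache the host-provided block here, then hand it to the render block;
        // never read self from inside the render block itself.
        cachedMIDIOutput = self.midiOutputEventBlock
    }

    override func deallocateRenderResources() {
        cachedMIDIOutput = nil
        super.deallocateRenderResources()
    }
}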
1
0
1k
Jan ’21
Audio file generated from an AVAudioEngine bus tap plays at a different speed than the audio on the device output
I am generating audio with an AVAudioEngine. I then install a tap on the mainMixerNode output of the engine, which provides an AVAudioPCMBuffer that is written into an MPEG-4 AAC AVAudioFile. The input audio nodes to the engine are AVAudioUnitSampler nodes. The issue I have is that the audio in the resulting .m4a file plays more slowly than what you hear on the device output itself (speakers, headphones). This is the code I am implementing:

// Audio Format
let audioFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)

// Engine
var engine = AVAudioEngine()

// Two AVAudioNodes are hooked up to the AVAudioEngine
engine.connect(myAVAudioNode0, to: engine.mainMixerNode, format: audioFormat)
engine.connect(myAVAudioNode1, to: engine.mainMixerNode, format: audioFormat)

// Function to Write Audio to a File
func writeAudioToFile() {

    // File to write
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    let audioURL = documentsDirectory.appendingPathComponent("share.m4a")

    // Format parameters
    let sampleRate = Int(audioFormat!.sampleRate)
    let channels = Int(audioFormat!.channelCount)

    // Audio File settings
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: Int(audioFormat!.sampleRate),
        AVNumberOfChannelsKey: Int(audioFormat!.channelCount),
        AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue
    ]

    // Audio File
    var audioFile = AVAudioFile()
    do {
        audioFile = try AVAudioFile(forWriting: audioURL, settings: settings, commonFormat: .pcmFormatFloat32, interleaved: false)
    } catch {
        print("Failed to open Audio File For Writing: \(error.localizedDescription)")
    }

    // Install Tap on mainMixer
    // Write into buffer and then write buffer into AAC file
    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 8192, format: nil, block: { (pcmBuffer, when) in
        do {
            try audioFile.write(from: pcmBuffer)
        } catch {
            print("Failed to write Audio File: \(error.localizedDescription)")
        }
    })
}
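A common cause of this kind of speed difference is a sample-rate mismatch: the file is opened with the hard-coded 44.1 kHz settings while the mixer may actually be running at the hardware rate (48 kHz on many devices), so the buffers delivered by the tap are written under the wrong rate. A hedged sketch that derives the file settings from the tap point's actual format instead (function name is illustrative):

import AVFoundation

// Sketch: open the AAC file using the mainMixerNode's actual output sample rate and
// channel count, so the rate recorded in the file matches the buffers from the tap.
func writeAudioToFileMatchingTapFormat(engine: AVAudioEngine, to audioURL: URL) throws {
    let tapFormat = engine.mainMixerNode.outputFormat(forBus: 0)

    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: tapFormat.sampleRate,
        AVNumberOfChannelsKey: Int(tapFormat.channelCount),
        AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue
    ]

    let audioFile = try AVAudioFile(forWriting: audioURL,
                                    settings: settings,
                                    commonFormat: .pcmFormatFloat32,
                                    interleaved: false)

    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 8192, format: tapFormat) { pcmBuffer, _ in
        do {
            try audioFile.write(from: pcmBuffer)
        } catch {
            print("Failed to write audio file: \(error.localizedDescription)")
        }
    }
}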
2
0
3.1k
Mar ’20
Deprecated functions in CoreMIDI framework - Swift
I am trying to implement MIDI Out for an app. The app will generate a MIDI sequence that can be directed to be played in another app (a synthesizer, or as an input to apps like AUM). I noticed that certain functions in the CoreMIDI framework, such as MIDIReceived and MIDISend, have been deprecated, and I couldn't find the new functions that replace them. The CoreMIDI documentation is very sparse and lacking. Does anybody know what the replacement functions are?
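For what it's worth, a hedged sketch of the MIDIEventList-based calls that appear to be the intended replacements on iOS 14/macOS 11 and later: MIDIReceivedEventList in place of MIDIReceived for a virtual source, and MIDISendEventList in place of MIDISend for an output port. The endpoint and port parameters are assumed to have been created elsewhere.

import CoreMIDI

// Sketch: send one note-on through the newer MIDIEventList API (MIDI 1.0 protocol packets).
func sendNoteOn(virtualSource: MIDIEndpointRef,
                outputPort: MIDIPortRef,
                destination: MIDIEndpointRef) {
    // Universal MIDI Packet word: MIDI 1.0 channel voice, group 0,
    // note on, channel 0, note 69 (A4), velocity 100.
    let noteOnWord: UInt32 = 0x2090_4564

    var eventList = MIDIEventList()
    let packet = MIDIEventListInit(&eventList, ._1_0)
    _ = MIDIEventListAdd(&eventList, MemoryLayout<MIDIEventList>.size, packet, 0, 1, [noteOnWord])

    // Publish through a virtual source (replacement for MIDIReceived)...
    MIDIReceivedEventList(virtualSource, &eventList)

    // ...or send to a specific destination through an output port (replacement for MIDISend).
    MIDISendEventList(outputPort, destination, &eventList)
}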
2
0
955
Jun ’20
Connecting a MIDI Endpoint Destination Ref to a MIDI Output port in iOS using Swift
I am setting up an app to send MIDI data to another app using the CoreMIDI framework. I will not be using the AudioKit framework in this app.

var destRef = MIDIEndpointRef()
destRef = MIDIGetDestination(destIndex)

// Create Client
var midiClientRef = MIDIClientRef()
MIDIClientCreate("Source App" as CFString, nil, nil, &midiClientRef)

// Create MIDI Source Endpoint Ref
var virtualSrcEndpointRef = MIDIEndpointRef()
MIDISourceCreate(midiClientRef, "Source App Endpoint" as CFString, &virtualSrcEndpointRef)

// Create MIDI Output port
var outputPortRef = MIDIPortRef()
MIDIOutputPortCreate(midiClientRef, "Source App Output Port" as CFString, &outputPortRef)

After that I use the MIDIReceived function to send MIDI packets to the source endpoint. This works, but the issue is that if several destination apps are open, the MIDI gets played in all of them. This makes sense because there isn't an explicit connection between the client's output port and the destination endpoint. In the opposite direction, when you create a destination endpoint and you are receiving MIDI, there is a function called MIDIPortConnectSource which establishes a connection from a source to a client's input port. I cannot find an equivalent MIDIPortConnectDestination in the CoreMIDI MIDI Services API. How does one make that direct connection? Again, I will not be using AudioKit in this app.
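Since CoreMIDI has no MIDIPortConnectDestination, the addressed counterpart is to skip the virtual source for the targeted case and send through the output port directly to the one destination you want, choosing the destination per call. A hedged sketch using the refs created above (the helper name and note bytes are illustrative; MIDISend is the pre-iOS 14 call already used in this post):

import CoreMIDI

// Sketch: deliver a short MIDI message to a single, explicitly chosen destination.
func send(bytes: [UInt8], via outputPortRef: MIDIPortRef, to destRef: MIDIEndpointRef) {
    var packetList = MIDIPacketList()
    let packet = MIDIPacketListInit(&packetList)
    _ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size, packet, 0, bytes.count, bytes)
    MIDISend(outputPortRef, destRef, &packetList)
}

// Usage: note on, channel 0, note 69 (A4), velocity 100, sent only to destRef.
// send(bytes: [0x90, 69, 100], via: outputPortRef, to: destRef)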
1
0
1.3k
Jul ’20
Using AVAudioRecorder to record the output of the mainMixerNode of an AVAudioEngine instance
I am trying to use the AVAudioRecorder class to record the output of the mainMixerNode of an AVAudioEngine instance and save it to an MPEG-4 AAC file. From what I have been reading, the default input to AVAudioRecorder is the microphone. I have everything set up so I can record to a file, but how can I change the AVAudioRecorder input to be the mainMixerNode output?
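A hedged aside: AVAudioRecorder records from the audio session's input and doesn't take an audio-engine node as a source, so the usual pattern for capturing mainMixerNode output is a tap that writes buffers into an AVAudioFile. A minimal uncompressed sketch (the engine is assumed to be running, and a .caf destination keeps the linear PCM settings simple):

import AVFoundation

// Sketch: capture the main mixer's output by writing tapped buffers to a file.
func recordMainMixerOutput(of engine: AVAudioEngine, to url: URL) throws -> AVAudioFile {
    let format = engine.mainMixerNode.outputFormat(forBus: 0)
    let file = try AVAudioFile(forWriting: url, settings: format.settings)   // linear PCM, e.g. a .caf file

    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        try? file.write(from: buffer)
    }
    return file
}

// Stop capturing with: engine.mainMixerNode.removeTap(onBus: 0)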
1
0
1k
Mar ’20
Acceptance of Game Center Challenges
I have been successful at issuing score challenges in Game Center by using the challengeComposeController method of GKScore. This GKScore object has a context property which holds a specific seed to be used to start a specific game when the challenged player accepts the challenge. My question is, when the challenged player presses the Play Now button in the Challenges screen of the Game Center View Controller, how can the game view controller know that the player accepted the challenge and which challenge he/she accepted when the GKGameCenterViewController is dismissed? I know there is a GKLocalPlayerListener protocol with different methods to manage challenges but it isn't very well documented when these methods fire or should be used.
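For what it's worth, a hedged sketch of the listener route (class and helper names are hypothetical): GKChallengeListener, which GKLocalPlayerListener includes, declares player(_:wantsToPlay:), and that callback appears to be the hook Game Center uses when the player chooses to play a challenge (e.g. taps Play Now), so the challenge's context seed can be read there.

import GameKit
import UIKit

class GameViewController: UIViewController, GKLocalPlayerListener {

    override func viewDidLoad() {
        super.viewDidLoad()
        GKLocalPlayer.local.authenticateHandler = { [weak self] viewController, error in
            guard let self = self, error == nil else { return }
            if let viewController = viewController {
                self.present(viewController, animated: true)
            }
            // Register so listener callbacks (including challenge events) reach this object.
            GKLocalPlayer.local.register(self)
        }
    }

    // Called when the local player chooses to play a challenge.
    func player(_ player: GKPlayer, wantsToPlay challenge: GKChallenge) {
        if let scoreChallenge = challenge as? GKScoreChallenge,
           let seed = scoreChallenge.score?.context {
            startGame(withSeed: seed)   // hypothetical helper that starts the seeded game
        }
    }

    private func startGame(withSeed seed: UInt64) {
        // Game-specific setup for the challenged seed goes here.
    }
}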
0
0
724
Jan ’20