AVAudioEngine connectMIDI with eventListBlock always sends MIDI 2.0 events
I connect two AVAudioNodes using

    - (void)connectMIDI:(AVAudioNode *)sourceNode to:(AVAudioNode *)destinationNode format:(AVAudioFormat * __nullable)format eventListBlock:(AUMIDIEventListBlock __nullable)tapBlock

and add an AUMIDIEventListBlock tap block to the connection to capture the MIDI events. Both AUAudioUnits of the AVAudioNodes involved in this connection are set to use MIDI 1.0 UMP events:

    [[avAudioUnit AUAudioUnit] setHostMIDIProtocol:kMIDIProtocol_1_0];

But all the MIDI channel voice events received are automatically converted to UMP MIDI 2.0 format. Is there something else I need to set so that the tap receives MIDI 1.0 UMPs?

(Note: my app can handle MIDI 2.0, so this is not really a problem. This question is mainly to find out whether I forgot to set the protocol somewhere...) Thanks!!
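For concreteness, here is a minimal Swift sketch of the setup (the function and node names are placeholders; the real units come from wherever the app instantiates its AVAudioUnits). Both units ask the host for MIDI 1.0, and the tap simply logs which UMP protocol each event list actually carries:

    import AVFoundation
    import CoreMIDI

    func tapMIDIConnection(in engine: AVAudioEngine,
                           from source: AVAudioUnit,
                           to destination: AVAudioUnit) {
        // Request MIDI 1.0 UMP events from both AUAudioUnits.
        source.auAudioUnit.hostMIDIProtocol = ._1_0
        destination.auAudioUnit.hostMIDIProtocol = ._1_0

        // Tap the MIDI connection and log the protocol of each event list.
        engine.connectMIDI(source, to: destination, format: nil, eventListBlock: { _, _, eventList in
            // As described above, this reports ._2_0 for channel voice
            // events even though both units were set to MIDI 1.0.
            print("tap received protocol:", eventList.pointee.`protocol`)
            return noErr
        })
    }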
Replies: 0 · Boosts: 0 · Views: 564 · May ’24
Map ARSkeleton3D of ARBodyAnchor to normalized camera image
I am trying to map the 3D skeleton joint positions of an ARBodyAnchor to the real body in the camera image. I know I could simply use the detectedBody of the ARFrame, which already delivers the normalized 2D position of each joint, but what I am mostly interested in is the z-axis (the distance of each joint to the camera).

I am starting an ARBodyTrackingConfiguration, setting the world alignment to ARWorldAlignmentCamera (in which case the camera transform is an identity matrix), and multiplying each joint transform in model space (via modelTransformForJointName:) with the transform of the ARBodyAnchor. I have then tried many different ways to get the joints to line up with the image, for example by multiplying the transforms with the projectionMatrix of the ARCamera. But whatever I do, it never lines up correctly. For example, there doesn't seem to be a scale factor in the projectionMatrix or the ARBodyAnchor transform: no matter the distance of the camera to the detected body, the scale of the body is always the same. Which means I am missing something important, and I haven't figured out what.

So does anyone have an example of how I can get the body to align with the camera image? (Or get the distance to each joint in any other way?) Thanks!
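For what it's worth, here is a Swift sketch of one approach that is often suggested (untested against the setup above): compute each joint's position in world space from the anchor transform, and let ARCamera.projectPoint(_:orientation:viewportSize:) do the projection instead of multiplying by projectionMatrix by hand. The depth falls out of transforming the joint into camera space:

    import ARKit
    import simd

    func jointScreenPositions(bodyAnchor: ARBodyAnchor,
                              camera: ARCamera,
                              viewportSize: CGSize) -> [(joint: String, point: CGPoint, depth: Float)] {
        let skeleton = bodyAnchor.skeleton
        return skeleton.definition.jointNames.compactMap { name in
            guard let model = skeleton.modelTransform(for: ARSkeleton.JointName(rawValue: name)) else {
                return nil
            }
            // Joint transform in world space: anchor transform * model-space transform.
            let world = bodyAnchor.transform * model
            // Project the world-space position into viewport coordinates.
            let point = camera.projectPoint(simd_make_float3(world.columns.3),
                                            orientation: .portrait,
                                            viewportSize: viewportSize)
            // Depth: transform into camera space; z is negative in front of
            // the camera, so negate it for the distance along the view axis.
            let inCameraSpace = camera.transform.inverse * world.columns.3
            return (name, point, -inCameraSpace.z)
        }
    }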
Replies: 1 · Boosts: 0 · Views: 597 · May ’24
How to import a file with a custom file type into an app from the web?
I have created a custom file type for my app that conforms to XML. Now I am trying to find a way for users to share the file online and import it from there into the app.

I have added UTExportedTypeDeclarations and CFBundleDocumentTypes entries for my custom type to the Info.plist, and also added LSSupportsOpeningDocumentsInPlace and UIFileSharingEnabled.

What works so far: I can share a file via UIActivityViewController and send it by mail, and when I tap the file in the email, the app opens and imports the file. What I haven't managed to get working is importing a file from a website. Safari automatically detects that it is an XML file, ignores my own file extension, and shows the file in Safari. As XML. The only way to get the file to open in the app is to long-press a link to the file, then tap "Download Linked File". This sends the file to the Downloads folder and appends ".xml" to the downloaded file. If I also add "public.xml" to LSItemContentTypes, the app will then open it. But this is of course not very user friendly, and it would open any XML file in my app, which is not what I want.

Is there any other way on iOS for users to share app files online and directly import them into an app? Or am I missing a setting somewhere?

Here is an example of how my CFBundleDocumentTypes is set up:

    <key>CFBundleDocumentTypes</key>
    <array>
      <dict>
        <key>CFBundleTypeName</key>
        <string>My own project file</string>
        <key>LSHandlerRank</key>
        <string>Owner</string>
        <key>LSItemContentTypes</key>
        <array>
          <string>com.mycompany.myfiletype</string>
          <string>public.xml</string>
        </array>
      </dict>
    </array>

And the exported UTExportedTypeDeclarations:

    <key>UTExportedTypeDeclarations</key>
    <array>
      <dict>
        <key>UTTypeConformsTo</key>
        <array>
          <string>public.xml</string>
        </array>
        <key>UTTypeDescription</key>
        <string>My own project file</string>
        <key>UTTypeIconFiles</key>
        <array />
        <key>UTTypeIdentifier</key>
        <string>com.mycompany.myfiletype</string>
        <key>UTTypeTagSpecification</key>
        <dict>
          <key>public.filename-extension</key>
          <array>
            <string>myfiletype</string>
          </array>
        </dict>
      </dict>
    </array>
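For reference, UTTypeTagSpecification can also declare a MIME type tag alongside the filename extension. This is only a sketch, not something verified to change Safari's behaviour, and the MIME value below is hypothetical; the idea would be that a server serving the file with this Content-Type identifies it as the custom type rather than as generic XML:

    <key>UTTypeTagSpecification</key>
    <dict>
      <key>public.filename-extension</key>
      <array>
        <string>myfiletype</string>
      </array>
      <key>public.mime-type</key>
      <array>
        <string>application/x-myfiletype</string>
      </array>
    </dict>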
Replies: 0 · Boosts: 0 · Views: 1.5k · Jun ’21
MIDINetworkSession: Is it possible to send MIDI events to specific connections?
Is there a way to send MIDI events to only one specific connection when using MIDINetworkSession on iOS, or will all MIDI events be sent to all connections?

The documentation for MIDINetworkSession seems to be kind of nonexistent (or is there something better than the framework documentation somewhere?). What I see is that MIDINetworkSession has only one default session, which can have multiple connections that each have different ports, but only one destinationEndpoint...

The reason I am asking is that I created my own RTP-MIDI implementation, but want to allow the user to use the official Apple framework as well. From the RTP-MIDI specification I know that it is possible to send MIDI events to only one destination port. But I haven't found out how to do that with MIDINetworkSession, or whether it is even possible... Thanks!
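To make the question concrete, here is a minimal CoreMIDI sketch of the framework side (client/port names and the host address/port are placeholders). The connection is added to the one default session, and sending goes through that session's single destinationEndpoint; there is no per-connection endpoint in sight:

    import CoreMIDI

    let session = MIDINetworkSession.default()
    session.isEnabled = true
    session.connectionPolicy = .anyone

    // Add one connection to the default session (placeholder address/port).
    let host = MIDINetworkHost(name: "Peer", address: "192.168.0.10", port: 5004)
    session.addConnection(MIDINetworkConnection(host: host))

    // A client and output port for sending.
    var client = MIDIClientRef()
    MIDIClientCreate("NetClient" as CFString, nil, nil, &client)
    var outPort = MIDIPortRef()
    MIDIOutputPortCreate(client, "NetOut" as CFString, &outPort)

    // Send a note-on. The only available target is the session-wide
    // destination endpoint, which appears to fan out to all connections.
    var packetList = MIDIPacketList()
    let packet = MIDIPacketListInit(&packetList)
    MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                      packet, 0, 3, [0x90, 60, 100])
    MIDISend(outPort, session.destinationEndpoint(), &packetList)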
Replies: 0 · Boosts: 1 · Views: 635 · Sep ’20
How to simulate app termination?
While implementing the background behaviour of an app, I learned that there is a difference between the following app states:

- being suspended
- being terminated by the system
- being terminated by the user (force quit)

I can simulate app suspension by pushing the home button and moving the app to the background. I can force quit the app by double-tapping the home button and flicking the app out of the screen in the app switcher.

But how can I reliably simulate the app being terminated by the system? I need this to test a few background features that seem to behave differently when the app is terminated by the system or while it is still suspended (and do not do anything while the app is force quit). Thanks!
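One workaround that is sometimes suggested (an assumption, not an officially documented technique): kill the process from a debug-only code path while testing. The process then dies without the user force-quit flag, which should be closer to a system termination than flicking the app away in the app switcher:

    import Darwin

    // Debug-only helper: end the process the hard way, without the
    // "terminated by user" path. No delegate callbacks will run.
    func debugSimulateSystemTermination() {
        kill(getpid(), SIGKILL)
    }

Alternatively, running the app from Xcode, backgrounding it, and then pressing Stop in Xcode is often said to have a similar effect.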
Replies: 13 · Boosts: 0 · Views: 12k · Nov ’17
Return a BOOL value and throw an error, and convert this nicely to Swift
We have the (usual?) problem that we need a method that returns a BOOL and can throw an error, inside a framework that can be used from Objective-C and from Swift. Is there a nice way to do that which is understandable for the user?

More detail: if I just needed a method that returns a BOOL, it would look like this:

    - (BOOL)isDone

Now (as you, who will hopefully answer this question, will know) if this should return an error at the same time, I can't do it like this:

    - (BOOL)isDone:(NSError **)error

because here the return value tells the user whether there was an error, but doesn't return any value.

So the usual ways I have found so far are these: send the return value as a pointer and fill it:

    - (BOOL)getIsDone:(BOOL *)isDone error:(NSError **)error

or return an NSNumber that is nil if there is an error and contains a BOOL if not:

    - (NSNumber *)isDone:(NSError **)error

In Objective-C, we have so far used the first solution. But now the framework needs to map nicely to a Swift function. In Swift, the first solution maps to:

    func getIsDone(isDone: UnsafeMutablePointer<ObjCBool>) throws

and the second one to:

    func isDone() throws -> NSNumber

where it is really unclear to the user why this is an NSNumber and why they have to use .boolValue to actually use it...

As you can see, both variants are really bad. What I would prefer is something like:

    func isDone() throws -> Bool

or maybe:

    func getIsDone(isDone: inout Bool) throws

Is there a way to tell the clang compiler to convert it to something like this? Pleeease? Thanks!
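For illustration, here is a sketch of one bridging pattern that would produce roughly the preferred signature (all names are hypothetical; MyThing stands for the Obj-C class in the framework): mark the Obj-C method NS_REFINED_FOR_SWIFT so it imports as __getIsDone(_:), and wrap it in a Swift extension:

    import Foundation

    extension MyThing {
        /// Swift-friendly wrapper around the pointer-based Obj-C API.
        func isDone() throws -> Bool {
            var value = ObjCBool(false)
            try __getIsDone(&value)   // imported name after NS_REFINED_FOR_SWIFT
            return value.boolValue
        }
    }

Clang's swift_error attribute is another route that reportedly lets a BOOL-returning method with an NSError** parameter keep its return value when imported as a throwing Swift function, though that is something to verify rather than a confirmed answer.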
Replies: 5 · Boosts: 0 · Views: 5.7k · Nov ’17