I've managed to get an engine set up to play a sine wave at a given frequency and amplitude. Now I need to play a sequence of notes using that tone. The sequence will always be the same, so I could read it from a MIDI file. I found some sample code to load and play a MIDI file, but it's not working: when I call the AVAudioSequencer's prepareToPlay method, it throws an exception saying there's no AU connected to the track. I can't figure out how to make that connection.
Perhaps the MIDI file won't work for this. I quickly tapped some notes into a track in GarageBand, saved it as a loop, then converted the loop to MIDI with a tool I found online.
Any hints or good sample code for doing this? Is there a not-too-complicated way to punch in the MIDI events programmatically instead of loading them from a file?
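For context, here is a minimal Objective-C sketch of the connection the exception is complaining about: each AVMusicTrack in the sequencer needs its destinationAudioUnit pointed at an AVAudioUnit attached to the engine. The AVAudioUnitSampler and midiFileURL below are placeholders standing in for your tone generator and your file:

#import <AVFoundation/AVFoundation.h>

// Attach a destination AU to the engine (a sampler here; substitute your tone generator).
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioUnitSampler *sampler = [[AVAudioUnitSampler alloc] init];
[engine attachNode:sampler];
[engine connect:sampler to:engine.mainMixerNode format:nil];

// Load the MIDI file into a sequencer created from the same engine.
NSError *error = nil;
AVAudioSequencer *sequencer = [[AVAudioSequencer alloc] initWithAudioEngine:engine];
[sequencer loadFromURL:midiFileURL options:AVMusicSequenceLoadSMF_PreserveTracks error:&error];

// The step the exception is about: give every track a destination AU.
for (AVMusicTrack *track in sequencer.tracks) {
    track.destinationAudioUnit = sampler;
}

[engine startAndReturnError:&error];
[sequencer prepareToPlay];
[sequencer startAndReturnError:&error];

On recent OS releases, AVAudioSequencer also has createAndAppendTrack and AVMusicTrack has addEvent:atBeat: (with AVMIDINoteEvent), which may let you punch the events in programmatically instead of loading a file, if you can require those versions.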
The newest version of my Core Data model requires a somewhat lengthy migration. If documents are large enough, a progress window should be shown. That's fine when the app is running and the user opens an old document. But when the app's state is being restored and any of the restored documents need to be migrated, the app's window state appears to be frozen: no amount of window manipulation I've tried during this time ever shows up onscreen. The result is a long migration with no user-visible activity in the app, which is unacceptable.
Any ideas about how to show some UI during state restoration? NSUserNotification only works if the app is in the background, right? And it's not in the background after the user launches it, so the notification never appears.
I'm trying to get an NSMetadataQuery to work using an NSPredicate built with:
[NSPredicate predicateFromMetadataQueryString:@"InRange(kMDItemFSCreationDate,$time.today,$time.today(+1))"]
That string is the one I copied from Finder's Get Info window after saving a search of "Created is Today". It at least parses and runs without exceptions, but the results are totally wrong. It only returns files created on 12/31/2000.
I've tried dozens of different attempts at building a predicate that uses these $time functions, but they all fail parsing and throw an exception.
What I really need to be able to do is build the predicate with a format string, because the attribute is stored in a property:
[NSPredicate predicateWithFormat:@"InRange(%K,$time.today,$time.today(+1))", self.attr];
That throws an exception.
[NSPredicate predicateFromMetadataQueryString:[NSString stringWithFormat:@"InRange(%@,$time.today,$time.today(+1))", self.attr]];
That does not throw an exception, but it gives the same 12/31/2000 results.
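One detail worth noting: 12/31/2000 is one day shy of the Cocoa reference date (1/1/2001 GMT), which suggests the $time variables are evaluating to zero rather than to "today". A sketch of a workaround that sidesteps $time entirely, assuming self.attr holds the attribute name (e.g. kMDItemFSCreationDate): build the range from explicit NSDate values, which the metadata query format does accept.

// Build "created today" as explicit date bounds instead of $time variables.
NSCalendar *calendar = [NSCalendar currentCalendar];
NSDate *startOfToday = [calendar startOfDayForDate:[NSDate date]];
NSDate *startOfTomorrow = [calendar dateByAddingUnit:NSCalendarUnitDay
                                               value:1
                                              toDate:startOfToday
                                             options:0];
NSPredicate *predicate =
    [NSPredicate predicateWithFormat:@"%K >= %@ AND %K < %@",
                                     self.attr, startOfToday,
                                     self.attr, startOfTomorrow];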
I have a camera app. The camera controller is based on Apple's AVCam sample project, which uses AVFoundation. My first release requested the front AVCaptureDeviceTypeBuiltInWideAngleCamera as the preferred device type. This worked fine.
Now I want to leverage the SemanticSegmentationMatteTypes available with the TrueDepth camera, so I changed the preferred device type to AVCaptureDeviceTypeBuiltInTrueDepthCamera. It finds one of those on my iPad. Early in development it was working fine, delivering the image for the semantic matte for skin, hair, and teeth. I pared this down to only the skin image. It still worked fine.
Then later in the day, using the same iPad as the test device, it started failing with:
Capture session runtime error: Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" UserInfo={NSLocalizedDescription=Cannot Complete Action, NSLocalizedRecoverySuggestion=Try again later.}
What's this cryptic error supposed to mean? It prevents the camera from working.
I'm also using JPSVolumeButtonHandler to trigger the camera with the volume buttons. After switching to the TrueDepth camera, I started getting these errors in the log:
[avas] AVAudioSessionUtilities.h:106 AudioSessionGetProperty ('rcid') failed with error: '!ini'
I wondered if somehow AVFoundation thought the TrueDepth camera required audio and was fighting with the JPSVolumeButtonHandler, so I commented out the JPSVolumeButtonHandler. The AudioSessionGetProperty error goes away, but the other error remains, repeating every 5 seconds.
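For what it's worth, -11819 appears to be AVErrorMediaServicesWereReset, meaning the media services daemon was torn down underneath the session, and "Try again later" really means "rebuild or restart the session". A sketch of the recovery pattern from Apple's AVCam sample, assuming the session and sessionQueue properties from that project:

// Restart the capture session if media services were reset out from under it.
// (A sketch: a real implementation would keep the observer token and remove it later.)
[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification
                                                  object:self.session
                                                   queue:nil
                                              usingBlock:^(NSNotification *note) {
    NSError *error = note.userInfo[AVCaptureSessionErrorKey];
    if (error.code == AVErrorMediaServicesWereReset) {
        dispatch_async(self.sessionQueue, ^{
            if (!self.session.isRunning) {
                [self.session startRunning];
            }
        });
    }
}];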
After picking Automatic code signing, it will crash while packaging the app.
Anybody else?
Pertinent system prefs:
[X] Ask to keep changes when closing documents
[ ] Close windows when quitting an app
App setup:
autosavesInPlace returns true
I've run my app many times with one test doc that is automatically reopened on each run. Recently I created a couple of new documents, closed them before quitting, and used the Delete button to dispose of them. Yet each time I run, all the supposedly deleted docs reappear. They're not being deleted from "/Users//Library/Containers//Data/Library/Autosave Information/". Why?
In the iOS system prefs->Accessibility->Keyboards, there's the switch for "Show Lowercase Keys", which allows keyboards to show upper- and lower-case key caps. I can't find any mention of an accessor for that value. It doesn't show in NSUserDefaults when I dump them, nor does any header mention it. I'd like to be able to use it in my custom keyboard extension.
I have an iOS app which contains a custom keyboard that can be used by any app. They share a few prefs by using an app group and a named suite of NSUserDefaults. Each target (app, keyboard) keeps its own instance of the shared prefs by storing it in a property:
self.sharedPrefs = [[NSUserDefaults alloc] initWithSuiteName:kPrefsSuiteName];
Each then adds observers for a couple of different prefs:
[self.sharedPrefs addObserver:self forKeyPath:kPrefRecents options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld context:nil];
When running in the Simulator, both the app and keyboard can write to the shared prefs and both target's observers correctly respond to changes. But when running on a device (iPhone X, iPad Pro 11"), the app can write to the prefs, but the keyboard fails with:
2023-04-17 14:20:20.095600-0500 projectname[561:23591] [User Defaults] Couldn't write values for keys (
Recents
) in CFPrefsPlistSource<0x2807dc500> (Domain: group.com.my.key, User: kCFPreferencesCurrentUser, ByHost: No, Container: (null), Contents Need Refresh: No): setting preferences outside an application's container requires user-preference-write or file-write-data sandbox access
I also noticed this when the app adds its first key value observer to the shared prefs:
2023-04-17 14:57:36.610366-0500 projectname[820:45796] [User Defaults] Couldn't read values in CFPrefsPlistSource<0x280bc1c80> (Domain: group.com.my.key, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd
This is supposed to work. What's wrong here?
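Two usual suspects here, offered as guesses: the App Group entitlement has to be present on both the app and the keyboard targets, and a keyboard extension cannot touch its shared container at all until RequestsOpenAccess is YES in the extension's Info.plist and the user enables Allow Full Access in Settings. A sketch of the guard inside the keyboard's UIInputViewController subclass, reusing kPrefsSuiteName and kPrefRecents from the code above (recentsValue is a placeholder):

// Only touch the shared suite when the user has granted Full Access;
// without it the sandbox denies writes outside the extension's own
// container, which matches the error above.
if (self.hasFullAccess) {
    self.sharedPrefs = [[NSUserDefaults alloc] initWithSuiteName:kPrefsSuiteName];
    [self.sharedPrefs setObject:recentsValue forKey:kPrefRecents];
}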
The list of categories on the left (which can be customized with the … button at the top-left): is there any API to get that list of category names and the code points they encompass? They seem to be different from the standard Unicode code point blocks. If such an API exists, is it available on iOS?
These categories don't appear to be available from any of the unicode.org documents I've seen, but then they host a ton of data, and most of it is beyond my need.
I have an iOS/iPadOS app that can open files via the Share action in the Files app. The app receives the application:openURL:options: call, but when it tries to access the given URL, it gets this error:
Error Domain=NSCocoaErrorDomain Code=257 "The file “Zippy.cbz” couldn’t be opened because you don’t have permission to view it." UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/B1C0C77A-2E86-4BD7-9344-845D6DA6FA74/File%20Provider%20Storage/Zippy.cbz, NSFilePath=/private/var/mobile/Containers/Shared/AppGroup/B1C0C77A-2E86-4BD7-9344-845D6DA6FA74/File Provider Storage/Zippy.cbz, NSUnderlyingError=0x283c6d860 {Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"}}
This used to work in earlier versions of iPadOS. It's currently running 16.5.1 (c). I see "AppGroup" in the path. Does that mean that something thinks my app is using a shared app group? It's not. Does it need to now in order to receive a shared file?
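In case it's useful: a URL handed over from the Files app or a file provider is typically security-scoped, so reads have to be bracketed with the start/stop calls. A minimal sketch of the open handler, with error handling elided:

// Files-app URLs are security-scoped; bracket the read with start/stop.
- (BOOL)application:(UIApplication *)app
            openURL:(NSURL *)url
            options:(NSDictionary<UIApplicationOpenURLOptionsKey, id> *)options {
    BOOL accessing = [url startAccessingSecurityScopedResource];
    NSError *error = nil;
    NSData *data = [NSData dataWithContentsOfURL:url options:0 error:&error];
    if (accessing) {
        [url stopAccessingSecurityScopedResource];
    }
    return (data != nil);
}

Behavior here can also depend on the UIApplicationOpenURLOptionsOpenInPlaceKey option and on whether LSSupportsOpeningDocumentsInPlace is set in the Info.plist.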
Is it possible to add ImageAnalysis to a UIImageView from my Objective-C app? I went through the work of adding VNRecognizeTextRequest code to analyze the image and highlight the found text rects, but was unsure how to add user interaction with the found text ranges. Then I noticed a web page describing ImageAnalysisInteraction, but it's Swift-only, and I don't know how to get to that from Objective-C.
I'd love to have actual text selection ranges like you get from this, along with the text actions popover, rather than just highlighted rects.
I'm opening a new window and setting its contentView to an NSHostingView, to which I give my SwiftUI view. This view can edit certain things, and I want to be able to ask the user if they want to save their changes when they close the window. How can this be accomplished?
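One approach, sketched in Objective-C under the assumption that some controller owns the window and can see the model's dirty state: make that controller the window's delegate and intercept the close in windowShouldClose:. hasUnsavedChanges and performSave are placeholders for whatever your SwiftUI view exposes:

// NSWindowDelegate: ask about unsaved edits before letting the window close.
- (BOOL)windowShouldClose:(NSWindow *)sender {
    if (!self.hasUnsavedChanges) {
        return YES;
    }
    NSAlert *alert = [[NSAlert alloc] init];
    alert.messageText = @"Save changes before closing?";
    [alert addButtonWithTitle:@"Save"];
    [alert addButtonWithTitle:@"Don't Save"];
    [alert addButtonWithTitle:@"Cancel"];
    NSModalResponse response = [alert runModal];
    if (response == NSAlertFirstButtonReturn) {
        [self performSave];  // placeholder for your save path
        return YES;
    }
    return response == NSAlertSecondButtonReturn;  // Don't Save closes; Cancel keeps it open
}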
To me, the documentation for this method is not at all clear and needs to provide sample code. I tried searching the dev site for sample projects, but they're no longer where they lived for decades before. And where's the TextEdit sample project?!
I need to override shouldCloseWindowController so I can make sure some sub-windows can be closed before closing the document's one and only window. I've done that, but then have no idea what I'm supposed to do to allow the document to close. I tried this, which is what I think the NSDocument header says to do:
if self.windowControllers.first?.shouldCloseDocument == true {
    self.canClose(withDelegate: delegate as Any, shouldClose: shouldCloseSelector, contextInfo: contextInfo)
} else {
    super.shouldCloseWindowController(windowController, delegate: delegate, shouldClose: shouldCloseSelector, contextInfo: contextInfo)
}
But the document never gets the close() method called - only the window goes away.
I have a document class that makes a main window for showing the data. Pieces of that data can be opened in separate subwindows for editing. When the user closes the main window and the document is dirty and the subwindows are dirty, I would like to present UI that asks the user if they want to save the changes in the subwindows (and possibly the main window if I decide to turn off autoSavesInPlace for the document).
I've tried a number of possible methods of doing this, but always run into some roadblock:
-Overriding shouldCloseWindowController:delegate:shouldCloseSelector:contextInfo:, where I would go through the open subwindows and tell each one to present its own UI asking whether it should be saved. This is no good because everything returns to the run loop and the doc closes, leaving the subwindows open with their Save? sheets up.
-Making the subwindows conform to NSEditor and registering them with the document. This looked like it would work, but it caused an infinite loop in my override of commitEditingWithDelegate:didCommitSelector:contextInfo: that I don't understand. I'm probably calling the didCommitSelector wrong because the docs aren't clear and provide no example (more on this in the sketch below).
-Adding each subwindow's NSWindowController to the document's windowControllers list. I don't recall the problems this caused.
Any sage advice about this? Possible examples? The Document is Swift, the subwindows are Cocoa, so examples in either language are fine.
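On the NSEditor attempt specifically, a guess about the infinite loop: if the commitEditingWithDelegate:didCommitSelector:contextInfo: override ends up calling itself (or something that calls back into it) rather than performing the commit and then notifying the delegate, it will recurse. The delegate callback's documented signature is editor:didCommit:contextInfo:, and because it takes a BOOL and a void * it can't go through performSelector:. A sketch of invoking it via a function pointer:

// Sketch of an NSEditor-style commit that invokes the didCommitSelector, whose
// signature is - (void)editor:(id)editor didCommit:(BOOL)didCommit contextInfo:(void *)contextInfo
- (void)commitEditingWithDelegate:(id)delegate
                didCommitSelector:(SEL)didCommitSelector
                      contextInfo:(void *)contextInfo {
    BOOL didCommit = [self commitEditing];  // do the real commit here; don't re-enter this method
    if (delegate != nil && didCommitSelector != NULL) {
        void (*callback)(id, SEL, id, BOOL, void *) =
            (void (*)(id, SEL, id, BOOL, void *))[delegate methodForSelector:didCommitSelector];
        callback(delegate, didCommitSelector, self, didCommit, contextInfo);
    }
}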
It fails in a sandboxed app. I found a couple of suggestions. One was to add an NSAppleEventsUsageDescription entry to the Info.plist so the user would be asked to grant permission for Apple events. But that prompt never appears for showSearchResultsForQueryString.
The next was to add the com.apple.security.temporary-exception.apple-events entitlement and provide com.apple.finder as the app. This DOES work, but Apple is rejecting my app because of it, even though I've said it's the only way to make showSearchResultsForQueryString work. I'm still waiting for them to tell me how to do it in a more correct way. This is obviously a bug, because an app should be able to use any NSWorkspace method without jumping through security hoops.
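For reference, the entitlement described above looks like this in the .entitlements file; it is exactly the temporary-exception key that App Review flags:

<key>com.apple.security.temporary-exception.apple-events</key>
<array>
    <string>com.apple.finder</string>
</array>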
Has anyone else found a way to make it work and get their app on the App Store?