I'm struggling to create a simple document app on macOS that can open a directory (and display content found there).
I created a new project in Xcode 15 using the "Document App" template and then just changed the readableContentTypes var like this:
```
struct DocumentDemoDocument: FileDocument {
    //...
    static var readableContentTypes: [UTType] { [.directory, .folder] }
    //...
}
```
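For context, the rest of the struct is essentially the unmodified template code. The sketch below is reproduced from memory of the Xcode 15 "Document App" template, so the exact bodies may differ:

```
import SwiftUI
import UniformTypeIdentifiers

struct DocumentDemoDocument: FileDocument {
    var text: String

    init(text: String = "Hello, world!") {
        self.text = text
    }

    static var readableContentTypes: [UTType] { [.directory, .folder] }

    // Reads the document from the file wrapper handed over by the system.
    init(configuration: ReadConfiguration) throws {
        guard let data = configuration.file.regularFileContents,
              let string = String(data: data, encoding: .utf8)
        else {
            throw CocoaError(.fileReadCorruptFile)
        }
        text = string
    }

    // Writes the document back out as a regular file.
    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        let data = text.data(using: .utf8)!
        return .init(regularFileWithContents: data)
    }
}
```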
But when I run the app and choose File > Open…, the Open button is still grayed out when a directory is selected. What did I miss here?
I want the audio session to always use the built-in microphone. However, when I use the setPreferredInput() method as in this example
```
import AVFoundation

private func enableBuiltInMic() {
    // Get the shared audio session.
    let session = AVAudioSession.sharedInstance()

    // Find the built-in microphone input.
    guard let availableInputs = session.availableInputs,
          let builtInMicInput = availableInputs.first(where: { $0.portType == .builtInMic }) else {
        print("The device must have a built-in microphone.")
        return
    }

    // Make the built-in microphone input the preferred input.
    do {
        try session.setPreferredInput(builtInMicInput)
    } catch {
        print("Unable to set the built-in mic as the preferred input.")
    }
}
```
and call that function once in the initializer, the audio session still switches to the external microphone as soon as one is plugged in. At that point the session's preferredInput is nil again, even though the built-in microphone is still listed in availableInputs.
So:
- Why is the preferredInput suddenly reset?
- When would be the appropriate time to set the preferredInput again?
Observing the session's availableInputs did not work, and setting the preferredInput again in the routeChangeNotification handler seems like a bad choice, as it's already a bit too late by then.
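For reference, the route-change workaround I'm dismissing would look roughly like this (just a sketch; enableBuiltInMic() is the function from above):

```
// Sketch: re-assert the preferred input whenever a new device shows up.
// By the time this fires, the session has already switched routes.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { notification in
    guard let reasonValue = notification.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
          let reason = AVAudioSession.RouteChangeReason(rawValue: reasonValue),
          reason == .newDeviceAvailable
    else { return }
    enableBuiltInMic()
}
```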
From an app that reads audio from the built-in microphone, I'm receiving many crash logs where the AVAudioEngine fails to start again after the app was suspended.
Basically, I'm calling these two methods in the app delegate's applicationDidBecomeActive and applicationDidEnterBackground methods, respectively:
```
// Properties of the containing class:
let audioSession = AVAudioSession.sharedInstance()
var audioEngine: AVAudioEngine?

func startAudio() throws {
    self.audioEngine = AVAudioEngine()
    // Configure and activate the shared session for recording.
    try self.audioSession.setCategory(.record, mode: .measurement)
    try self.audioSession.setActive(true)
    // Tap the input and start the engine.
    self.audioEngine!.inputNode.installTap(onBus: 0, bufferSize: 4096, format: nil, block: { ... })
    self.audioEngine!.prepare()
    try self.audioEngine!.start()
}

func stopAudio() throws {
    self.audioEngine?.stop()
    self.audioEngine?.inputNode.removeTap(onBus: 0)
    self.audioEngine = nil
    try self.audioSession.setActive(false, options: [.notifyOthersOnDeactivation])
}
```
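The wiring in the app delegate is nothing special; roughly this (a UIKit app delegate is assumed):

```
func applicationDidBecomeActive(_ application: UIApplication) {
    try? startAudio()
}

func applicationDidEnterBackground(_ application: UIApplication) {
    try? stopAudio()
}
```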
In the crash logs (iOS 16.6) I'm seeing that this works fine several times as the app is opened and closed, but suddenly the audioEngine.start() call fails with the error
Error Domain=com.apple.coreaudio.avfaudio Code=-10851 "(null)" UserInfo={failed call=err = AUGraphParser::InitializeActiveNodesInInputChain(ThisGraph, *GetInputNode())}
and audioEngine!.inputNode.outputFormat(forBus: 0) reports something like <AVAudioFormat 0x282301c70: 2 ch, 0 Hz, Float32, deinterleaved>. Also, right before installing the tap, audioSession.availableInputs contains an entry of type MicrophoneBuiltIn, but audioSession.currentRoute lists no inputs at all.
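For illustration, a guard along these lines might catch the broken state before start() throws; this assumes the 0 Hz input format is a reliable indicator, which I haven't verified:

```
enum AudioError: Error { case noInputAvailable }  // hypothetical error type

func startAudioIfInputAvailable() throws {
    let format = audioEngine!.inputNode.outputFormat(forBus: 0)
    // The crash logs show a "2 ch, 0 Hz" format in the failing state,
    // so bail out early instead of letting start() fail with -10851.
    guard format.sampleRate > 0 else {
        throw AudioError.noInputAvailable
    }
    try audioEngine!.start()
}
```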
I was not able to reproduce this situation on my own devices yet.
Does anyone have an idea why this is happening?
What are the SwiftUI equivalents of TVLockupView, TVCaptionButtonView, and TVPosterView?
Has anybody figured out how to interact with the SwiftUI preview for tvOS in Xcode 14 when Live mode is enabled? The arrow keys do not seem to work, nor does any other key or mouse click. Live mode is pretty useless without user input.
An app that I'm working on uses CloudKit with various CKSubscriptions, which had started to work well again once the 503 errors disappeared. However, after upgrading my iPhone 7 to iOS 15.3, my app no longer receives background push notifications. None of the CloudKit subscriptions that use CKSubscription.NotificationInfo(shouldSendContentAvailable: true) work anymore; i.e., the AppDelegate's application(_:didReceiveRemoteNotification:fetchCompletionHandler:) method is no longer called.
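The subscriptions in question are set up roughly like this (record type, subscription ID, and predicate below are placeholders, not the real ones):

```
import CloudKit

// Sketch of one of the affected subscriptions; names are placeholders.
let subscription = CKQuerySubscription(
    recordType: "Item",
    predicate: NSPredicate(value: true),
    subscriptionID: "item-changes",
    options: [.firesOnRecordCreation, .firesOnRecordUpdate, .firesOnRecordDeletion]
)

// Silent, content-available pushes with no user-visible alert.
subscription.notificationInfo = CKSubscription.NotificationInfo(shouldSendContentAvailable: true)

CKContainer.default().privateCloudDatabase.save(subscription) { _, error in
    if let error = error {
        print("Failed to save subscription: \(error)")
    }
}
```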
Actually, after rebooting the device, my app receives a single notification, the first one that arrives, but no more.
The app still runs fine on macOS 12.1 and tvOS 15.2, and it was running fine on iOS 15.2.1 before the upgrade.
Is anyone seeing similar issues, or does anyone have ideas that could help? I filed a report in Feedback Assistant, but I don't expect much help from there.
wwdc20-10673 briefly shows how to visualize the optical flow generated by VNGenerateOpticalFlowRequest, and sample code is available through the Developer app. But how can we build the OpticalFlowVisualizer.ci.metallib file from the CI kernel code provided as OpticalFlowVisualizer.cikernel?
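For what it's worth, the string-based API can load the .cikernel source directly at runtime, skipping the metallib entirely, though that initializer is deprecated. A sketch:

```
import CoreImage

// Load the CI Kernel Language source at runtime instead of a compiled metallib.
// CIKernel(source:) is deprecated but still available; it returns nil if the
// source fails to compile.
let url = Bundle.main.url(forResource: "OpticalFlowVisualizer", withExtension: "cikernel")!
let source = try String(contentsOf: url, encoding: .utf8)
let kernel = CIKernel(source: source)
```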
What is the correct way to draw an anti-aliased line in SwiftUI?
Let's say a straight, white line on a black background, 2 points wide and slightly rotated off the horizontal.
I've tried a Rectangle() with .fill(style: FillStyle(antialiased: true)) and .frame(width: 2, height: 100), but setting antialiased to true or false did not make any difference. I could not find any anti-aliasing options for Path.
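Here is a minimal reconstruction of the attempt, in case it helps; the rotation angle is arbitrary:

```
import SwiftUI

// A 2-point white line, slightly rotated off horizontal, on black.
// Toggling antialiased made no visible difference for me.
struct LineView: View {
    var body: some View {
        ZStack {
            Color.black
            Rectangle()
                .fill(style: FillStyle(antialiased: true))
                .foregroundColor(.white)
                .frame(width: 2, height: 100)
                .rotationEffect(.degrees(80))
        }
    }
}
```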