I have an app that plays audio and the behaviour of it has changed in watchOS 11. I can no longer figure out how to play the audio through the headphones.
To play audio I do the following:
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playback,
                        mode: .default,
                        policy: .longFormAudio,
                        options: [])
let activated = try await session.activate()
if activated {
    // play audio
}
In previous versions, `try await session.activate()` would bring up a route picker where the user could select their headphones. Now on watchOS 11 it just plays the audio out of the speaker.
Maybe that's what some people want, but if they do want the audio to play through their headphones I can't see how to give them that option now. There's no AVRoutePickerView available on watchOS for selecting an output.
I've tried setting the category to .multiRoute instead of .playback, and that does bring up the picker, but selecting the speaker results in an error code, and selecting the headphones results in a message saying it cannot find my headphones (which shouldn't be the case, since Apple Music on watchOS finds them).
I've also tried overriding the output with try session.overrideOutputAudioPort(.speaker), but the compiler complains that .speaker isn't available on watchOS, which is strange because, if I understand correctly, it's now possible to play through the speaker on at least some Apple Watch models.
So is this a bug or is there some way I've not found of playing audio through the headphones?
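For reference, here is a small diagnostic sketch (not a fix, and the helper name is just illustrative) that inspects AVAudioSession.currentRoute after activation, to confirm whether the Bluetooth headphones were picked up at all or the session routed to the built-in speaker:

import AVFoundation

// Log the session's current outputs: .bluetoothA2DP indicates most
// Bluetooth headphones, .builtInSpeaker the watch's own speaker.
func logCurrentRoute() {
    let session = AVAudioSession.sharedInstance()
    for output in session.currentRoute.outputs {
        print("Output: \(output.portName) (\(output.portType.rawValue))")
    }
}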
After building my app with Xcode 16 beta 6 I'm getting this warning in my AppIntents.
Encountered a non-optional type for parameter: computer. Conformance to the following AppIntent protocols requires all parameter types to be optional: AppIntents.WidgetConfigurationIntent, AppIntents.ControlConfigurationIntent
The intent looks something like this
struct WakeUp: AppIntent, WidgetConfigurationIntent, PredictableIntent {
    @Parameter(title: "intent.param.computer", requestValueDialog: "intent.param.request_dialog.computer")
    var computer: ComputerEntity

    init(computer: ComputerEntity) {
        self.computer = computer
    }

    init() {}

    public static var parameterSummary: some ParameterSummary {
        Summary("Wake Up \(\.$computer)")
    }

    static var predictionConfiguration: some IntentPredictionConfiguration {
        IntentPrediction(parameters: (\.$computer)) { computer in
            DisplayRepresentation(
                title: "Wake Up \(computer)"
            )
        }
    }

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Wake-on-LAN logic elided; the dialog is just a placeholder.
        return .result(dialog: "Sent wake-up request")
    }
}
According to the docs, though, making a parameter optional or non-optional is how we specify whether its value is required. https://developer.apple.com/documentation/appintents/adding-parameters-to-an-app-intent#Make-a-parameter-optional-or-required
So is this warning accurate? If so, how do I specify that a parameter is required by the intent now?
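For illustration, a minimal sketch of the optional-parameter shape the warning appears to ask for; the needsValueError call, the dialog strings, and the simplified conformances (PredictableIntent and the parameter summary are dropped) are assumptions, not a confirmed fix:

import AppIntents

struct WakeUpSketch: AppIntent, WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "Wake Up"

    // Optional, as the warning demands for widget/control configuration intents.
    @Parameter(title: "intent.param.computer")
    var computer: ComputerEntity?

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Enforce "required" at run time: if the configuration didn't
        // supply a computer, throw so the system prompts for one.
        guard let computer else {
            throw $computer.needsValueError("Which computer?")
        }
        // Wake-on-LAN logic for `computer` elided.
        return .result(dialog: "Sent wake-up request")
    }
}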
I'm trying to resize NSImages on macOS. I'm doing so with an extension like this.
extension NSImage {
// MARK: Resizing
/// Resize the image to the given size.
///
/// - Parameter size: The size to resize the image to.
/// - Returns: The resized image.
func resized(toSize targetSize: NSSize) -> NSImage? {
let frame = NSRect(x: 0, y: 0, width: targetSize.width, height: targetSize.height)
guard let representation = self.bestRepresentation(for: frame, context: nil, hints: nil) else {
return nil
}
let image = NSImage(size: targetSize, flipped: false, drawingHandler: { (_) -> Bool in
return representation.draw(in: frame)
})
return image
}
}
The problem is that, as far as I can tell, the image that comes out of the drawing handler has lost the color profile of the original image rep. I'm testing it with a wide color gamut image (attached), which becomes pure red when examining the resized result.
If this were UIKit, I guess I'd use UIGraphicsImageRenderer and select the right UIGraphicsImageRendererFormat.Range. So I suspect I need to use NSGraphicsContext here to do the rendering, but I can't see what I would set on it to make it use wide color, or how I'd use it.
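As a point of comparison, here is a hedged sketch of resizing by drawing into an NSBitmapImageRep retagged with a wide-gamut color space; the choice of displayP3 and the 16-bit depth are assumptions, and reusing the source rep's own color space may be more appropriate:

import AppKit

extension NSImage {
    // A sketch: draw into a bitmap rep tagged with a wide-gamut color space
    // instead of relying on NSImage's drawing-handler initializer.
    func resizedWideGamut(to targetSize: NSSize) -> NSImage? {
        let frame = NSRect(origin: .zero, size: targetSize)
        guard let representation = bestRepresentation(for: frame, context: nil, hints: nil),
              let bitmap = NSBitmapImageRep(
                  bitmapDataPlanes: nil,
                  pixelsWide: Int(targetSize.width),
                  pixelsHigh: Int(targetSize.height),
                  bitsPerSample: 16,
                  samplesPerPixel: 4,
                  hasAlpha: true,
                  isPlanar: false,
                  colorSpaceName: .calibratedRGB,
                  bytesPerRow: 0,
                  bitsPerPixel: 0
              )?.retagging(with: .displayP3) else {
            return nil
        }
        bitmap.size = targetSize

        // Draw the best representation into a context backed by that rep.
        NSGraphicsContext.saveGraphicsState()
        defer { NSGraphicsContext.restoreGraphicsState() }
        guard let context = NSGraphicsContext(bitmapImageRep: bitmap) else { return nil }
        NSGraphicsContext.current = context
        representation.draw(in: frame)
        context.flushGraphics()

        let image = NSImage(size: targetSize)
        image.addRepresentation(bitmap)
        return image
    }
}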
I've got a UIAutomation test that tests an app with a Share Extension. I want to pass in some launchArguments and launchEnvironment properties to the Share Extension in my test.
I can't figure out a way of doing it.
When I debug the ProcessInfo in the extension, my values aren't there. I've tried setting them like this:
XCUIApplication(bundleIdentifier:"com.my.app").launchArguments = ["TEST"]
XCUIApplication(bundleIdentifier:"com.my.app").launchEnvironment = ["TEST":"TEST"]
and this:
XCUIApplication().launchArguments = ["TEST"]
XCUIApplication().launchEnvironment = ["TEST":"TEST"]
Note that I am getting the correct process when finding it by its bundle identifier.
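For comparison, this is the usual pattern for the host app as I understand it, configuring and launching the same XCUIApplication instance; whether the Share Extension's separate process ever sees these values is exactly the open question here:

import XCTest

final class ShareExtensionUITests: XCTestCase {
    func testShareFlow() {
        // Configure and launch one instance rather than separate proxies.
        let app = XCUIApplication()
        app.launchArguments = ["TEST"]
        app.launchEnvironment = ["TEST": "TEST"]
        app.launch()
        // ... drive the UI from here to invoke the Share Extension
    }
}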
I've been watching the WWDC videos on the new App Intents framework in iOS.
It definitely looks like a nicer API than the Siri Intents framework. However, what's not clear to me is whether there are any user-facing improvements or increased discoverability over the existing Siri Intents framework. I'm already indexing my Shortcuts manually whenever the app is opened, which seems like one of the headline features of App Intents.
I've got an app that uses Siri Intents quite extensively and I don't really want to have two implementations side by side if there's no tangible benefit. I'd rather just leave the code as is until I can drop iOS 15.
Is it possible to use PNGs for Watch Series 7 complication sizes?
The Human Interface Guidelines (https://developer.apple.com/design/human-interface-guidelines/watchos/overview/complications/) specify sizes for the new 41mm and 45mm devices. However, in the latest Xcode (13.2.1) I can't see any way to bring up a slot to drop in images at these sizes; all you get is a slot for a vector image and slots for some of the older device sizes. Also, it seems that if you don't provide a vector, your existing PNG complication icons appear slightly cropped on a 45mm device.
Is using a vector the only way to support Watch Series 7 complications, or is there some hidden setting or tick box somewhere to get PNGs to work?
I want to create a sort of soundscape in surround sound. Imagine something along the lines of: the user can place the sound of a waterfall to their front right, the sound of frogs croaking to their left, and so on.
I have an AVAudioEngine playing a number of AVAudioPlayerNodes. I'm using AVAudioEnvironmentNode to simulate the positioning of these. The position seems to work correctly. However, I'd like these to work with head tracking so if the user moves their head the sounds from the players move accordingly.
I can't figure out how to do it or find any docs on the subject. Is it possible to make AVAudioEngine output surround sound, and if so, would the head tracking just work automagically the same way it does when playing surround sound content using AVPlayerItem? If not, is the only way to achieve this effect to use CMHeadphoneMotionManager and manually move the AVAudioEnvironmentNode's listener around?
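For reference, a rough sketch of that manual approach, feeding CMHeadphoneMotionManager attitude updates into the AVAudioEnvironmentNode's listener orientation; the node wiring, positions, and angle conversions here are assumptions rather than a confirmed recipe:

import AVFoundation
import CoreMotion

// A sketch: position a mono player around an AVAudioEnvironmentNode and
// rotate the listener from headphone motion data.
final class Soundscape {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    private let player = AVAudioPlayerNode()
    private let motion = CMHeadphoneMotionManager()

    func start(waterfall file: AVAudioFile) throws {
        engine.attach(environment)
        engine.attach(player)

        // Mono sources are required for 3D positioning.
        let mono = AVAudioFormat(standardFormatWithSampleRate: file.processingFormat.sampleRate, channels: 1)
        engine.connect(player, to: environment, format: mono)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)

        // Place the waterfall ahead and to the listener's right (units are arbitrary).
        player.position = AVAudio3DPoint(x: 2, y: 0, z: -2)

        try engine.start()
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()

        // Rotate the listener as the head moves (requires motion permission;
        // sign conventions may need adjusting).
        guard motion.isDeviceMotionAvailable else { return }
        motion.startDeviceMotionUpdates(to: .main) { [weak self] deviceMotion, _ in
            guard let attitude = deviceMotion?.attitude else { return }
            self?.environment.listenerAngularOrientation = AVAudio3DAngularOrientation(
                yaw: Float(attitude.yaw * 180 / .pi),
                pitch: Float(attitude.pitch * 180 / .pi),
                roll: Float(attitude.roll * 180 / .pi)
            )
        }
    }
}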
I have a Siri Intent that does a UDP broadcast to invoke WOL.
In iOS 14 I had to request the com.apple.developer.networking.multicast entitlement to enable my app to do this. With this entitlement my app is able to perform this function. However, my Siri Intent fails to do so. I'm guessing that's because the Siri Intent target doesn't itself include this entitlement.
I can't figure out how to set up this entitlement for my Siri Intent, or whether I need to separately request permission for it. If I add it to my intent's entitlements file the same way as I did to my main target's entitlements file, automatic signing fails.