Posts

Post not yet marked as solved
2 Replies
633 Views
I'm trying to resize NSImages on macOS. I'm doing so with an extension like this:

extension NSImage {

    // MARK: Resizing

    /// Resize the image to the given size.
    ///
    /// - Parameter targetSize: The size to resize the image to.
    /// - Returns: The resized image.
    func resized(toSize targetSize: NSSize) -> NSImage? {
        let frame = NSRect(x: 0, y: 0, width: targetSize.width, height: targetSize.height)
        guard let representation = self.bestRepresentation(for: frame, context: nil, hints: nil) else {
            return nil
        }
        let image = NSImage(size: targetSize, flipped: false, drawingHandler: { (_) -> Bool in
            return representation.draw(in: frame)
        })
        return image
    }
}

The problem is that, as far as I can tell, the image that comes out of the drawing handler has lost the color profile of the original image rep. I'm testing it with a wide color gamut image, attached; it becomes pure red when examining the resulting image. If this were UIKit, I'd use UIGraphicsImageRenderer and select the right UIGraphicsImageRendererFormat.Range, so I'm suspecting I need to use NSGraphicsContext here to do the rendering, but I can't see what I would set on it to make it use wide color, or how I'd use it.
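For reference, this is the direction I've been experimenting with: drawing into an explicit NSBitmapImageRep through NSGraphicsContext(bitmapImageRep:) and retagging it with a wide-gamut color space. This is only a sketch, and the choice of Display P3 is my guess rather than something I've confirmed matches the source profile:

import AppKit

extension NSImage {
    /// A sketch, not a confirmed fix: draw into an explicit bitmap rep and
    /// retag it with a wide-gamut color space (Display P3 is a guess here).
    func resizedWide(toSize targetSize: NSSize) -> NSImage? {
        let frame = NSRect(origin: .zero, size: targetSize)
        guard let representation = bestRepresentation(for: frame, context: nil, hints: nil),
              let bitmap = NSBitmapImageRep(bitmapDataPlanes: nil,
                                            pixelsWide: Int(targetSize.width),
                                            pixelsHigh: Int(targetSize.height),
                                            bitsPerSample: 16,
                                            samplesPerPixel: 4,
                                            hasAlpha: true,
                                            isPlanar: false,
                                            colorSpaceName: .calibratedRGB,
                                            bytesPerRow: 0,
                                            bitsPerPixel: 0),
              // Retag the pixels so they're interpreted as wide gamut
              // rather than sRGB when drawn into.
              let wideBitmap = bitmap.retagging(with: .displayP3),
              let context = NSGraphicsContext(bitmapImageRep: wideBitmap) else {
            return nil
        }
        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.current = context
        representation.draw(in: frame)
        NSGraphicsContext.restoreGraphicsState()

        let image = NSImage(size: targetSize)
        image.addRepresentation(wideBitmap)
        return image
    }
}

I don't know whether retagging is the right move or whether the context needs configuring some other way, which is essentially my question.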
Post not yet marked as solved
1 Reply
1.1k Views
I've got a UI automation test that tests an app with a Share Extension. I want to pass some launchArguments and launchEnvironment properties to the Share Extension in my test, but I can't figure out a way of doing it. When I debug the ProcessInfo in the extension, my values aren't there. I've tried setting them like this:

XCUIApplication(bundleIdentifier: "com.my.app").launchArguments = ["TEST"]
XCUIApplication(bundleIdentifier: "com.my.app").launchEnvironment = ["TEST": "TEST"]

and this:

XCUIApplication().launchArguments = ["TEST"]
XCUIApplication().launchEnvironment = ["TEST": "TEST"]

Note, I am getting the correct process when I find it by bundle identifier.
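The only fallback I can think of is to sidestep launch arguments entirely and pass test flags through UserDefaults in a shared app group, on the theory that the extension runs in a process the test runner never spawns. A sketch, assuming the test runner can actually write to the group (which I haven't verified) and where "group.com.my.app" is a placeholder identifier:

import Foundation

// In the UI test, before invoking the share sheet. "group.com.my.app" is a
// placeholder app group identifier shared by the app and the extension.
let shared = UserDefaults(suiteName: "group.com.my.app")
shared?.set(true, forKey: "isUITest")

// In the Share Extension, read the flag back instead of checking ProcessInfo:
let isUITest = UserDefaults(suiteName: "group.com.my.app")?.bool(forKey: "isUITest") ?? false

I'd rather use the launch environment properly if there's a supported way.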
Post not yet marked as solved
1 Reply
1.5k Views
I've been watching the WWDC videos on the new App Intents framework in iOS. It definitely looks like a nicer API than the Siri Intents framework. However, what's not clear to me is whether there are any user-facing improvements or increased discoverability over the existing Siri Intents framework. I'm already indexing my Shortcuts manually whenever the app is opened, which seems to be one of the headline features of App Intents. I've got an app that uses Siri Intents quite extensively, and I don't really want to have two implementations side by side if there's no tangible benefit. I'd rather just leave the code as is until I can drop iOS 15.
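For context, the manual indexing I'm doing today is roughly this (simplified; MyIntent is a stand-in for my real custom intent type):

import Intents

// Roughly what I run on app launch today with SiriKit to surface my
// shortcuts; MyIntent is a placeholder for my real custom intent.
func updateShortcutSuggestions() {
    let intent = MyIntent()
    intent.suggestedInvocationPhrase = "Run my thing"
    if let shortcut = INShortcut(intent: intent) {
        INVoiceShortcutCenter.shared.setShortcutSuggestions([shortcut])
    }
}

If App Intents only replaces that kind of plumbing without making anything more discoverable to users, I'm struggling to see the case for migrating early.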
Post not yet marked as solved
0 Replies
1.3k Views
Is it possible to use PNGs for the Apple Watch Series 7 complication sizes? The Human Interface Guidelines here https://developer.apple.com/design/human-interface-guidelines/watchos/overview/complications/ specify sizes for the new 41mm and 45mm devices. However, in the latest Xcode (13.2.1) I can't see any way to bring up a slot to drop in images at these sizes; all you can get is a slot for a vector image and some of the older device sizes. Also, it seems that if you don't provide a vector, your existing PNG complication icons appear slightly cropped when used on a 45mm device. Is the only way to support Series 7 complications to use a vector, or is there some hidden way or tick box somewhere to get PNGs to work?
Post not yet marked as solved
1 Reply
1.5k Views
I want to create a sort of soundscape in surround sound. Imagine something along the lines of: the user can place the sound of a waterfall to their front right, the sound of frogs croaking to their left, and so on. I have an AVAudioEngine playing a number of AVAudioPlayerNodes, and I'm using an AVAudioEnvironmentNode to simulate the positioning of these. The positioning seems to work correctly. However, I'd like this to work with head tracking, so that if the user moves their head the sounds from the players move accordingly. I can't figure out how to do it or find any docs on the subject. Is it possible to make AVAudioEngine output surround sound, and if so, would the tracking just work automagically the same way it does when playing surround sound content using AVPlayerItem? If not, is the only way to achieve this effect to use CMHeadphoneMotionManager and manually move the AVAudioEnvironmentNode's listener around?
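In case it clarifies what I mean by the manual route, this is the kind of thing I'm imagining with CMHeadphoneMotionManager, though the mapping from headphone attitude to listener orientation (including the sign conventions) is a guess on my part:

import AVFoundation
import CoreMotion

let headphoneMotion = CMHeadphoneMotionManager()

/// A sketch of the manual fallback: drive the environment node's listener
/// orientation from headphone motion updates. Axis/sign mapping is a guess.
func startHeadTracking(for environment: AVAudioEnvironmentNode) {
    guard headphoneMotion.isDeviceMotionAvailable else { return }
    headphoneMotion.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Attitude is in radians; listenerAngularOrientation takes degrees.
        environment.listenerAngularOrientation = AVAudio3DAngularOrientation(
            yaw: -Float(attitude.yaw) * 180 / .pi,
            pitch: Float(attitude.pitch) * 180 / .pi,
            roll: Float(attitude.roll) * 180 / .pi
        )
    }
}

If the engine can be made to output a tracked surround mix natively, I'd much rather do that than hand-roll this.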
Post not yet marked as solved
1 Reply
1.4k Views
I have a Siri Intent that does a UDP broadcast to invoke Wake-on-LAN. In iOS 14 I had to request the com.apple.developer.networking.multicast entitlement to enable my app to do this, and with that entitlement my app is able to perform this function. However, my Siri Intent fails to. I'm guessing that's because the Siri Intent extension target doesn't itself include the entitlement. I can't figure out how to set up this entitlement for my Siri Intent extension, or whether I need to separately request permission for it. If I add it to my intent extension's entitlements file the same way as I did to my main target's entitlements file, automatic signing fails.
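For reference, this is the entry I'm adding to the intent extension's .entitlements file, mirroring what's in the main app's (and which then breaks automatic signing):

<key>com.apple.developer.networking.multicast</key>
<true/>

Do I need to request the entitlement again for the extension's App ID, or is there some other way an extension is supposed to inherit it from the host app?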