I have followed the SoupChef example in migrating Custom Intents from SiriKit to AppIntents. However, we only require one iOS release back, so we can require iOS 17. Thus, I eliminated everything that was strictly for backwards compatibility, most notably the SiriKit Extension, which required enormous amounts of code to coordinate with the real app and worked poorly anyway.
I tested for example that an NFC tag Automation created in Shortcuts works to execute an AppIntent while the app is backgrounded.
I am now receiving a beta report indicating that executing one of our migrated AppIntents from a HomePod is not working; the tester says it used to work sometimes (not all the time). I'm sure most such cases required the SiriKit Extension in the old SiriKit world. I am terrified that I may need to rebuild that monster once again, when the (new to me) AppIntents API seemed so beautiful without it and seemed to work without it. The AppIntents documentation seems to indicate that SiriKit Extensions are no longer related or required. What is the truth here? Do I need to re-implement everything twice in a SiriKit Extension like a barbarian, or can we live in the new world with AppIntents?
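For reference, the migrated intents follow the standard AppIntents shape. This is a minimal sketch with hypothetical names (StartThingIntent, thingName are ours, not from the SoupChef sample); the key point is that nothing here references an extension, and openAppWhenRun = false is what allowed the NFC-tag Automation to run while backgrounded:

```swift
import AppIntents

// Hypothetical intent mirroring one of our migrated SiriKit intents.
struct StartThingIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Thing"

    // false lets the intent run without foregrounding the app, which is
    // the behavior we verified with the NFC-tag Automation.
    static var openAppWhenRun: Bool = false

    @Parameter(title: "Thing Name")
    var thingName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would go here.
        return .result(dialog: "Started \(thingName)")
    }
}
```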
Thank you.
Need our app to launch in Full Screen mode. I have reviewed many years of prior solutions for this, and all of them seem to be non-functional on modern Catalyst. In some years, users suggested twiddling methods to call toggleFullScreen in Cocoa; that seems to no longer work in my tests. Then, most recently, Apple posts from last Summer announced a new key, UILaunchToFullScreenByDefaultOnMac, but after reading lots of fine print it has a very strange scenario in which it does not work: Catalyst apps! So only unmodified iPad apps can use it. Very frustrating.
So anyway, I have spent many days since last Summer trying again and again to figure out how to launch in Full Screen mode. I've even tried AppleScript, but it turns out that doesn't work either, because for some reason you need to constantly reset the privacy permissions.
Another feature from last Summer, in the 16.1 SDK, was the new allowsFullScreen property (set during willConnectToSession). Unfortunately, it has no apparent effect. Allowing is different from engaging, and the "Enter Full Screen" menu item works fine, so full screen was always allowed. The point is to launch in Full Screen all the time, or at least whenever the user hasn't overridden something.
I feel like there are so many proposed solutions to this, so many people who must need this, that I am surely missing something. What is the correct way to launch in Full Screen for a full-fledged Catalyst app available on the Mac App Store (and thus subject to all relevant restrictions thereof)?
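For completeness, here is the kind of workaround I've seen circulated and tried variants of. It reaches the underlying AppKit window through the Objective-C runtime and sends the public AppKit toggleFullScreen: action. This is an unsupported sketch, not a documented path, and I can't vouch for its App Store review safety:

```swift
import Foundation

#if targetEnvironment(macCatalyst)
func enterFullScreenAtLaunch() {
    // NSApplication isn't directly visible to Catalyst code, so go through
    // the runtime. All names here are public AppKit symbols, accessed
    // dynamically because AppKit can't be imported in a Catalyst target.
    guard let appClass = NSClassFromString("NSApplication") as? NSObject.Type,
          let sharedApp = appClass.value(forKey: "sharedApplication") as? NSObject,
          let windows = sharedApp.value(forKey: "windows") as? [NSObject],
          let window = windows.first else { return }
    // toggleFullScreen: flips the current state, so real code should check
    // the window's styleMask first to avoid toggling out of full screen.
    window.perform(NSSelectorFromString("toggleFullScreen:"), with: nil)
}
#endif
```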
The com.apple.developer.device-information.user-assigned-device-name entitlement works fine on iOS.
Added the same entitlement to the exact same app, same Profile, on Catalyst. It builds fine and installs fine from Xcode. Upload to App Store Connect: the upload fails because the Provisioning Profile supposedly does not include com.apple.developer.device-information.user-assigned-device-name. Yet I manually verified that the Provisioning Profile does indeed include that entitlement, and so does the App ID.
This feels like an App Store Connect bug. I also verified that Ventura does indeed fail to return the name without this entitlement, so this is a critical blocker.
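To be concrete about what "fail to return the name" means: this is the call whose behavior the entitlement gates. Without the entitlement, recent OS releases return a generic model name rather than the user-assigned one:

```swift
import UIKit

// With com.apple.developer.device-information.user-assigned-device-name
// granted, this returns the user-assigned name (e.g. "Alice's MacBook Pro");
// without it, recent OS releases return a generic name like "iPad" instead.
let deviceName = UIDevice.current.name
```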
Our app iterates over all HomeKit accessories. Sometimes HomeKit video cameras appear correctly as an HMService with a Camera type; for instance, Logitech's and Elgato's cameras appear as one would expect this way.
Lately, though, I am often encountering HMAccessory objects with no HMAccessoryCategory. It is literally nil. Cameras from Arlo and Eufy both present this way, and the behavior appears in the user base as well.
When I encountered the first of these HomeKit-certified devices identifying with no category, I simply special-cased Arlo and called it a video camera so that we could load it as a camera.
Now yet another manufacturer is doing the same thing, and I'm starting to question whether I'm doing something incorrectly. Why would Eufy and Arlo both have no category? It seems like an insanely huge bug. I note, though, that the Home app identifies them as cameras, so there may be some twisted way to decide they must be cameras. Are we supposed to assume no category means Video Camera for some reason? What am I missing here?
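One category-independent check worth considering (a sketch, not a confirmed answer): HMAccessory exposes cameraProfiles, which should be non-empty for anything that can stream video regardless of what the category reports. Something like this would sidestep the nil-category problem:

```swift
import HomeKit

// Treat an accessory as a camera if either its category says so or it
// exposes camera profiles. The category constants below are the standard
// HomeKit camera-ish categories; cameraProfiles is the fallback for
// accessories (like the Arlo and Eufy units) that report no category.
func isCamera(_ accessory: HMAccessory) -> Bool {
    let type = accessory.category.categoryType
    if type == HMAccessoryCategoryTypeIPCamera ||
       type == HMAccessoryCategoryTypeVideoDoorbell {
        return true
    }
    return !(accessory.cameraProfiles ?? []).isEmpty
}
```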
We have the Multicast entitlement; it is crucial for basically every aspect of our apps. It works when applied to one of our iOS apps. Once granted, it is possible to apply it to any Profile for any of our team's apps. The following are follow-up questions I'm not sure about.
All of our apps require Multicast. One is not published yet, and applying for the entitlement requires an App Store URL. Do we need to apply again for every single app, even from the same team? Since there is no URL for the new app, but the entitlement is allowed and App Store build submission succeeds, it's not clear whether the team has the entitlement or the bundle ID does.
The official documentation for this entitlement says it also applies to macOS 11 and tvOS 14:
https://developer.apple.com/documentation/bundleresources/entitlements/com_apple_developer_networking_multicast?language=objc
However, so far I have been unable to get a build signed and submitted for either Catalyst or tvOS if I include that entitlement, due to provisioning errors. Can I safely assume this entitlement is not needed for either platform in this release cycle (tvOS 14/macOS 11)? I have not yet observed actual functional loss on either of those platforms, as we have on new iOS 14 builds.
Similarly, we have a Widget (pre-iOS 14, ObjC style) and a SiriKit Extension. Both have entitlements files. Adding this entitlement to either of them also causes errors, yet both of them need multicast. Can I assume this entitlement will apply to all extensions within our app even though it is declared only in the top-level entitlements file? Or do we need to apply for every single bundle ID of every single extension inside every single app separately?
It is possible/likely that some of my tests above failed due to this morning's developer site maintenance, which began right after the iOS app worked in my tests. So before I go too far down this path, I wanted to clarify the scope needed here for ObjC Widgets, SiriKit, Catalyst, and tvOS. We were quite surprised when socket multicast suddenly stopped working, so we are concerned that the same surprise functionality change might occur in future minor releases of macOS/tvOS, given the documentation.
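For reference, this is the fragment in question, as it appears in our app-level entitlements file (per the documentation page linked above); the open question is whether this same fragment belongs in each extension's entitlements file too:

```xml
<key>com.apple.developer.networking.multicast</key>
<true/>
```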
Thanks.
We donate our custom intents to SiriKit. Our intents have a couple of parameters, like:
Start Foo in Boo
The generic intent title is "Start Things". We need to show the specific title with params and not the generic title in Shortcuts.
The problem is that the Shortcuts app does not distinguish our intents, so we get tons and tons of "Start Things" suggestions, one for each different Foo, but each line in Shortcuts' Add Action and Suggestions lists is simply "Start Things", for many pages over time as the user performs actions in our app.
I have noticed that some other apps are able to customize that title. Find My shows a title like "Show the location of SpecificiPadName". OpenTable shows "Get details for 'specific reservation name'".
I have not been able to do the same kind of customization. In the Xcode intent definition, the Shortcuts app and Suggestions section for the Intent does not seem to be related to the string used in the actual Shortcuts app. The generic title of the intent is always used in Shortcuts until you open the suggestion, at which point you can customize the params.
Either we need to disable suggestions altogether to avoid the user having pages of undistinguished titles, or we need to customize the title to show its parameters. I note that even though we have two parameters, the "Key Parameter" menu in the Shortcuts app section of the Intent is disabled; I cannot select a parameter there. Also, the "Summary" in the Shortcuts app section is simply not what is shown in Shortcuts.
Anyway, obviously there is some trick to this, but I have not been able to find it. Anyone have any insight into this?
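For context, this is how we donate today, sketched with hypothetical names (StartThingsIntent, foo, boo stand in for the classes Xcode generates from our .intentdefinition). My understanding, which I'd love confirmed, is that Shortcuts can only show parameter-specific rows if the donated intent has all its parameters populated at donation time:

```swift
import Intents

// Donate the intent with every parameter filled in; an incompletely
// configured donation can only ever surface under the generic title.
let intent = StartThingsIntent()
intent.foo = "Foo"
intent.boo = "Boo"
intent.suggestedInvocationPhrase = "Start Foo in Boo"

let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Donation failed: \(error)")
    }
}
```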
I saw this issue last Summer but ignored it because it didn't interfere with development. Whenever I submit our Archived app or try to export for the dev team, code signing fails with "Code signing "A" failed." In the associated logs, this is what I see:
2020-11-11 18:38:44 +0000	Running /usr/bin/codesign '-vvv' '--force' '--sign' 'ZZZZZZZ' '--entitlements' '/var/folders/vg/32z4zj4949x0r2tr6xtg1rg00000gn/T/XcodeDistPipeline.~~~1xAU4n/entitlements~~~xhX9bY' '--preserve-metadata=identifier,flags,runtime' '/var/folders/vg/32z4zj4949x0r2tr6xtg1rg00000gn/T/XcodeDistPipeline.~~~1xAU4n/Root/Applications/MyApp.app/Contents/Frameworks/MyKit.framework/Versions/A'
2020-11-11 18:38:44 +0000	/var/folders/vg/32z4zj4949x0r2tr6xtg1rg00000gn/T/XcodeDistPipeline.~~~1xAU4n/Root/Applications/MyApp.app/Contents/Frameworks/MyKit.framework/Versions/A: replacing existing signature
2020-11-11 18:38:44 +0000	/var/folders/vg/32z4zj4949x0r2tr6xtg1rg00000gn/T/XcodeDistPipeline.~~~1xAU4n/Root/Applications/MyApp.app/Contents/Frameworks/MyKit.framework/Versions/A: code object is not signed at all
2020-11-11 18:38:44 +0000	/usr/bin/codesign exited with 1
The structure is: MyApp.app contains MyKit.framework. I would explain the exact settings here, but I have flipped a ton of them back and forth trying to resolve this: which cert I use, auto or manual signing, adding "--deep" to the signing flags, code sign on copy, embed, and on and on. It's been an endless series of re-archives for days now without success. Obviously, one little bit of magic somewhere is being missed.
The iOS build works great. The Catalyst build with the same framework used to work great and is on the App Store. Once we migrated to any version of Xcode that supported Apple Silicon, this error began. I had assumed it must be an Xcode bug, but here we are at production time and it remains. I am urgently trying to work through this but feel I've tried every path. Is anyone else seeing this, or does anyone have clues to a solution? Thanks.
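In case it helps anyone diagnose, these are the inspection commands I've been using against the built product (paths follow the failing log above; adjust to your archive location). The "code object is not signed at all" line suggests the nested Versions/A bundle is what codesign is choking on:

```shell
# Show the current signature state of the embedded framework.
codesign -dvvv MyApp.app/Contents/Frameworks/MyKit.framework/Versions/A

# Deep, strict verification to pinpoint which nested component is rejected.
codesign --verify --deep --strict --verbose=4 MyApp.app
```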
We use a custom video player that gets frames out of an RTSP stream and renders them into a Metal view. It works well.
I'm trying to work out how to translate this to support Picture in Picture. AVPictureInPictureController takes an AVPlayerLayer, which takes an AVPlayer, which is formed either from a URL or a media list, neither of which is applicable here.
Can I render the Metal video into the AVPlayerLayer directly? That seems like the right path, but some of the documentation implies our app won't be called to render frames in that case, even if the PiP background mode is turned on.
Basically, I'm trying to get a clear answer to "can I render video with PiP using Metal?" As there is no URL I can pass for this, that seems like the only available path. Thanks.
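One possible path, assuming a recent enough SDK (iOS 15+): AVPictureInPictureController can be driven by an AVSampleBufferDisplayLayer instead of an AVPlayerLayer, so the decoded RTSP frames that currently feed the Metal view could be enqueued as CMSampleBuffers. A sketch (self is assumed to adopt AVPictureInPictureSampleBufferPlaybackDelegate):

```swift
import AVKit

// Feed decoded frames to a sample-buffer layer instead of (or alongside)
// the Metal view; PiP then sources its video from this layer.
let sampleBufferLayer = AVSampleBufferDisplayLayer()

// For each decoded RTSP frame wrapped in a CMSampleBuffer:
// sampleBufferLayer.enqueue(sampleBuffer)

let contentSource = AVPictureInPictureController.ContentSource(
    sampleBufferDisplayLayer: sampleBufferLayer,
    playbackDelegate: self)
let pipController = AVPictureInPictureController(contentSource: contentSource)
```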
We have a widget extension that has worked for many years; these days it runs on iOS 12 and 13. It sizes itself based on the sizes reported by widgetMaximumSizeForDisplayMode and widgetActiveDisplayModeDidChange.
On iOS 14 betas 1, 2, and now 3, the system-reported sizes are incorrect. For instance, on the iPhone 11 Pro Max simulator it initially reports 398 x 110, which is way too wide. After I swipe out and back to reload all widget extensions, it then reports 398 x 102, also too wide. Finally, after enough forced reloading, it reports the correct size, 360 x 110. So basically, the widget is incorrectly sized most of the time.
Oddly, I added tens of widget extensions to my list to see who else is having this issue, and I did not see others suffering from it. Possibly this is because I'm building with the iOS 14 SDK, causing iOS 14 to treat my app as if I'm supposed to know some secret new sizing technique that I do not.
The new WidgetKit stuff is incompatible with the fairly interactive model for our widget, so I need to get the extension working by fixing this sizing issue while hoping WidgetKit matures for next year.
Perhaps this is a beta bug? If so, past experience dictates it would have been fixed by now, and I always get bitten thinking things are such obvious beta bugs that they will surely be fixed. The fact that other widget extensions generally work, though, suggests there may be some other way to get the correct size (it does seem very strange that I'm not aware of an auto-layout way to handle this and am still being passed raw sizes), or perhaps I can cause the system to re-lay out and then pass me the correct size.
Any ideas on this?
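For reference, this is the shape of our sizing code as I understand the NCWidgetProviding contract (class name is ours; the clamp height of 110 is our design value, not a system constant). If the maxSize the system hands us here is wrong, everything downstream is wrong:

```swift
import UIKit
import NotificationCenter

class TodayViewController: UIViewController, NCWidgetProviding {
    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        // We size strictly within what the system reports; when the report
        // itself is wrong (as on the iOS 14 betas), this is where it shows.
        preferredContentSize = CGSize(width: maxSize.width,
                                      height: min(maxSize.height, 110))
    }
}
```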
We have a production app built on Catalyst. We have received many user reports that a network listener for local network events related to our app is extremely slow when our app is not in the foreground. User reports indicate a delay of as much as 13 seconds can occur between the time a network event occurs and the time the app receives the packet.

This has been exceptionally hard to reproduce. I believe it has something to do with the system deciding to put the app into a less active state when it is not foreground. Our background capabilities are just Background Fetch and Remote Notifications. Simply putting the app into the background is not sufficient to reproduce; there are other unknown factors. We presume they are system related, such as the amount of CPU load the system is under, the number of other apps, etc. Basically, the system at some point decides to demote us, and then we become unusable with massive 13-second response delays.

On iOS, that is completely reasonable because our app is not visible. On macOS, it is completely absurd because our app is plainly visible and effectively broken.

Is this a known issue with Catalyst? What would be the best workaround? At the moment, I've only been coming up with what are essentially hacks, like telling macOS that we are playing audio or doing something similar to ensure consistent background time. But it seems like a huge OS bug if there isn't a better way to handle this.
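One less hacky candidate I'm considering (a sketch under the assumption that the demotion is App Nap doing the throttling): taking a ProcessInfo activity assertion while the listener is active, and releasing it when listening stops:

```swift
import Foundation

// Ask the system not to throttle us while the local network listener runs.
// .latencyCritical signals that timer/IO latency matters to this work.
let activity = ProcessInfo.processInfo.beginActivity(
    options: [.userInitiated, .latencyCritical],
    reason: "Local network listener must respond promptly")

// ...later, when the listener is torn down:
ProcessInfo.processInfo.endActivity(activity)
```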
I'm trying to enable a use case allowing a user to add a Scripting step in Shortcuts that says:

IF <my app's custom state> is <ON>, do Action.

In Shortcuts, if I select <input>, the options are "Select a Home Accessory", "Clipboard", or "Current Date". I am hoping that list is extensible (it's otherwise comically short), but I cannot find how I would extend it as a developer.

Am I missing how to do this, or are these three elements truly the only conditionals possible?

Thanks.
We have a need for continuous background processing, as we are a network listener observing the state of many local network items that have no understanding of anything like push notifications.

On iOS this can be solved via Guided Access, thus forcing foreground access on a permanent basis. In our case, users are perfectly fine with that, as they purchase the Apple devices specifically for this software.

On Catalyst, there is no Guided Access. I had assumed this would be trivial because Cocoa doesn't stop execution for background apps, but Catalyst appears to simulate iOS by trying to stop any app not in the foreground, even when the machine does not have a battery (e.g. an iMac) and the battery-saving rationale of the background tasks framework makes no sense. It does so pretty quickly (30 seconds to 3 minutes).

In reviewing the new background APIs, all of them seem to ignore these use cases. My real question, as per the title, is "how do I get continuous background processing time on Catalyst?" As I have asked that fruitlessly for a decade on iOS, in this case I will modify the need to be more realistic: "given the reality that Apple does not (yet) recognize network listeners as a valid background activity, what background behavior can I use on Catalyst to ensure as close to permanent background execution as possible?"

Note that, while we require continuous background execution, our actual CPU time is essentially zero. We just listen for specific network activity and take actions based on it. So I'm trying to figure out the best heuristic to get the longest-lasting execution time, specifically on Catalyst. Thanks for any advice!
Starting in early 2018, everything announced as added to HomeKit, such as HomePod, Apple TVs, AirPlay 2 devices, and now TVs, has not in fact been added to the actual HomeKit APIs. HMAccessoryCategoryTypes.h hasn't changed a bit, and iterating over all accessories of all types does not include any such accessories. So all of those things are not actually part of HomeKit; they seem to exist in the Home app only and are not available via the HomeKit API at all.

Needless to say, this is, to put it gently, inconsistent, and users of apps that rely on HomeKit do not understand why HomeKit apps cannot access what otherwise seems obviously to be a HomeKit accessory in the Home app, not knowing that such accessories have never actually been made part of HomeKit.

Is anyone aware of some kind of statement, or can someone please make a statement, regarding what the expectation is here? Given the HomeKit silence at WWDC '18, I feel it could be better to implement HAP in our app rather than use the HomeKit API at this point, if we can't get a clear statement that the HomeKit API will be kept up to date. It is now at least a year visibly out of date, with devices that users are incorrectly led to believe are part of HomeKit.