This is a standalone SwiftUI app on watchOS 11.
To keep things simple, I'm using the new concurrency-based API, like this:
for try await update in CLLocationUpdate.liveUpdates() { … }
watchOS is supposed to show the authorization dialog, but it never appears, even though update.authorizationRequestInProgress is true.
What could be the reason?
(Note that I also tried the old procedural CLLocationManager API; it doesn't work that way either.)
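For reference, here is a minimal sketch of what I'm doing (the function name is illustrative, and NSLocationWhenInUseUsageDescription is present in the Info.plist):

import CoreLocation

// Minimal sketch: consume live updates and log the diagnostic flags.
// On watchOS 11 the first iteration should trigger the authorization prompt.
func monitorLocation() async {
    do {
        for try await update in CLLocationUpdate.liveUpdates() {
            if update.authorizationRequestInProgress {
                // This is true, yet the system dialog never appears.
                print("authorization request in progress")
                continue
            }
            if let location = update.location {
                print("location: \(location)")
            }
        }
    } catch {
        print("live updates failed: \(error)")
    }
}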
My team and I are working on an app for a private emergency helpline.
Now as far as I understand the (sparse) API documentation for fall detection, given the appropriate entitlement, the following happens upon a detected fall:
The standard UI will open with the options to a) call SOS, b) acknowledge the fall but state that you're fine, or c) deny the fall; implicitly, after 60 seconds of inactivity, it will call SOS because you didn't react.
All fall detection apps will then receive a bit of background time and have func fallDetectionManager(CMFallDetectionManager, didDetect: CMFallDetectionEvent, completionHandler: () -> Void) called with the appropriate event value.
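To make sure I have the mechanics right, here is a minimal sketch of the delegate side (the class name and logging are mine; the CoreMotion names are from the documentation):

import CoreMotion

final class FallWatcher: NSObject, CMFallDetectionDelegate {
    let manager = CMFallDetectionManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.requestAuthorization { status in
            print("fall detection authorization: \(status.rawValue)")
        }
    }

    // Called with a bit of background time after the system UI has resolved the fall.
    func fallDetectionManager(_ fallDetectionManager: CMFallDetectionManager,
                              didDetect event: CMFallDetectionEvent,
                              completionHandler handler: @escaping () -> Void) {
        print("fall at \(event.date), resolution: \(event.resolution)")
        handler() // signal that our background work is done
    }

    func fallDetectionManagerDidChangeAuthorization(_ fallDetectionManager: CMFallDetectionManager) {
        print("authorization changed: \(fallDetectionManager.authorizationStatus.rawValue)")
    }
}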
Now that all sounds good, and it suggests that custom fall detection is additive to the standard system.
But why, then, is there something like this in the entitlement request form:
For any emergency calling features that you do not provide, explain any mitigations you use to make sure the user receives emergency services support that’s as close as possible to what they’d receive had they placed an emergency call natively.
This sounds like our app would rather be a drop-in replacement for the standard SOS service – in contrast to being additive, and in contrast to what the API documentation implies.
Am I misunderstanding something?
I can't seem to upload my standalone watchOS app to TestFlight. Xcode never offers me the proper distribution scheme; I only get file-based distribution mechanisms.
The metadata in App Store Connect is present, the bundle IDs are correct, and this is a freshly created project without any custom content.
How is this supposed to work?
I'm working on hardware that communicates wirelessly and wired with mobile systems. On anything non-i[Pad]OS we can connect via USB and achieve great bandwidth in situations where this is necessary.
Since i[Pad]OS does not support FTDI class-compliant devices through USB (and also omits the IOUSB framework), I wonder whether there is a way to "work around" this, e.g. by (ab)using another protocol that i[Pad]OS allows.
Concretely, do you think it's possible to tunnel our serial data stream via USB HID?
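To make the idea concrete, here is a conceptual sketch of the framing we have in mind, independent of any particular HID API (the report layout is hypothetical):

// Frame an arbitrary serial byte stream into fixed-size HID reports:
// here 64 bytes per report, 1 length byte + up to 63 payload bytes + padding.
func makeHIDReports(from stream: [UInt8], reportSize: Int = 64) -> [[UInt8]] {
    let payloadSize = reportSize - 1
    var reports: [[UInt8]] = []
    var offset = 0
    while offset < stream.count {
        let chunk = stream[offset..<min(offset + payloadSize, stream.count)]
        var report: [UInt8] = [UInt8(chunk.count)] // length prefix
        report.append(contentsOf: chunk)
        report.append(contentsOf: repeatElement(0, count: payloadSize - chunk.count)) // pad
        reports.append(report)
        offset += payloadSize
    }
    return reports
}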
This is a regression since iOS 13. Is no one at Apple interested in fixing this?
FB9856371
For iOS, TestFlight beta review is only necessary when you update the version number. For macOS, I find that beta review is necessary every time the build number changes. Is this expected?
I'm wondering whether anyone at Apple is still working on the External Accessory framework. In particular, I find it quite sad that the EABluetoothAccessoryPicker has been broken since iOS 13, yet apparently no one cares.
For the record, it's also FB9856371.
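For context, this is the call in question; on affected systems the picker either never appears or fails immediately (the nil filter and the logging are just for illustration):

import ExternalAccessory

EAAccessoryManager.shared().showBluetoothAccessoryPicker(withNameFilter: nil) { error in
    if let error = error {
        print("picker failed: \(error)") // what I see since iOS 13
    } else {
        print("accessory picked")
    }
}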
Is there any proper documentation on how the thread properties qualityOfService and priority relate to each other?
Is one the coarse-grained knob and the other the fine-grained one? If so, which is which?
Background: I have an iOS app that communicates with hardware. For some operations the hardware has tight timing constraints, hence I need to prioritize the communication thread over the UI thread (which is rare, but in this case necessary to prevent damaging the hardware).
Can anyone shed light on this, or point me to where I can read up on the exact scheduler behavior?
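To illustrate, these are the two knobs I mean. My working assumption (which I'd like to have confirmed) is that qualityOfService selects the scheduling band while threadPriority orders threads within that band:

import Foundation

let worker = Thread {
    // time-critical hardware communication would run here
}
worker.qualityOfService = .userInteractive // coarse: scheduling class?
worker.threadPriority = 1.0                // fine: priority within the class?
worker.start()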
Is there any way to speed up the processing time for the multicast entitlement? I requested it at the same time as one of my colleagues. We both already have successful apps in the store, and we used more or less the same explanation.
My colleague got it after 3 days; I've been waiting for 2 weeks now. This is a bit sad, as I've worked several months on an app that uses UDP multicast to discover link-local peer devices, and this is a complete BLOCKER for me.
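For context, here is roughly the receive path that the entitlement gates, sketched with NWConnectionGroup (the group address and port are made up):

import Network

let group = try! NWMulticastGroup(for: [.hostPort(host: "224.0.0.123", port: 5555)])
let connectionGroup = NWConnectionGroup(with: group, using: .udp)
connectionGroup.setReceiveHandler(maximumMessageSize: 1500) { message, content, _ in
    print("received \(content?.count ?? 0) bytes from \(String(describing: message.remoteEndpoint))")
}
connectionGroup.stateUpdateHandler = { state in
    print("group state: \(state)") // fails without the granted entitlement
}
connectionGroup.start(queue: .main)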
My app uses UDP broadcast to discover a peer on a private (link-local) network, 169.254.0.0/16. With iOS 14 and iOS 15 everything works fine, but on the newly released iOS 16 release candidate I get 'Socket Error 65: No Route To Host'.
I don't have any special entitlements for this app. Did anything change with regard to UDP broadcast policies from iOS 15 to iOS 16?
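Here is a minimal sketch of the failing send path, using a plain BSD socket (the port is made up):

import Foundation

let fd = socket(AF_INET, SOCK_DGRAM, 0)
var yes: Int32 = 1
setsockopt(fd, SOL_SOCKET, SO_BROADCAST, &yes, socklen_t(MemoryLayout<Int32>.size))

var addr = sockaddr_in()
addr.sin_family = sa_family_t(AF_INET)
addr.sin_port = in_port_t(9999).bigEndian            // hypothetical port
addr.sin_addr.s_addr = inet_addr("169.254.255.255")  // link-local broadcast

let payload = [UInt8]("discover".utf8)
let sent = withUnsafePointer(to: &addr) {
    $0.withMemoryRebound(to: sockaddr.self, capacity: 1) { sa in
        sendto(fd, payload, payload.count, 0, sa, socklen_t(MemoryLayout<sockaddr_in>.size))
    }
}
if sent < 0 { perror("sendto") } // iOS 16 RC: "No route to host" (errno 65)
close(fd)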
…and nothing more on the page. Not only on my account, but also on other teams I have access to.
Is that expected? Will it be rolled out later for beta participants or is it just broken due to high demand?
It looks like CoreBluetooth is completely broken on macOS 12 – at least for command-line apps. I have a pretty simple scanning app, and I no longer get any delegate calls for discovered devices. I DO get the "power up" state change delegate call, though.
Any idea?
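Here is the essence of the scanner, stripped down (names are mine):

import CoreBluetooth

final class Scanner: NSObject, CBCentralManagerDelegate {
    var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        print("state: \(central.state.rawValue)") // .poweredOn does arrive
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil)
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print("discovered \(peripheral.identifier) RSSI \(RSSI)") // never called on macOS 12
    }
}

let scanner = Scanner()
RunLoop.main.run() // keep the command-line tool alive for delegate callbacks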
I'm writing a command-line tool for macOS that interfaces with BLE devices. I have a problem with regard to permissions:
If I launch my tool from the command line, it gets killed by the OS. Only if I launch it via the debugger do I get the alert asking me to grant the Bluetooth permission.
My plist, which contains the NSBluetoothAlwaysUsageDescription key, is embedded as __TEXT __info_plist in the binary. Is this no longer enough for a command-line tool to access security-guarded OS facilities these days?
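For completeness, the plist is embedded with the usual -sectcreate linker technique, roughly like this (tool and file names are mine):

swiftc main.swift -o bletool \
    -Xlinker -sectcreate -Xlinker __TEXT -Xlinker __info_plist -Xlinker Info.plist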
Why is the UIKeyboard implementation still holding a reference to this UITextField, thus keeping it from being deallocated?
The memory debugger shows:
UIKeyboardImpl -> UIKBAutofillController -> NSMutableDictionary -> NSMutable...(Storage) -> UITextField
Any idea what's going on there?
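For what it's worth, this is how I verify the leak (a sketch; I call this right after dismissing the view controller that owns the text field):

import UIKit

func checkForLeak(of textField: UITextField) {
    weak var weakField: UITextField? = textField
    DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
        // Expected: nil. Actual: still alive, retained via
        // UIKeyboardImpl -> UIKBAutofillController -> NSMutableDictionary.
        print("text field still alive? \(weakField != nil)")
    }
}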
I have a program that handles INPlayMediaIntent in an extension. This works fine as long as you are using Siri in English. Switching to another language makes Siri treat the query as a web search. To reproduce, please check out https://github.com/mickeyl/iOS-Bug-Example-Projects and have a look at the Siri-Localized-MediaIntents-Broken folder. Then:
– Switch Siri to English.
– Edit the extension run scheme to feed Siri with, e.g., "Play Running in SiriTest".
– The result is fine: Siri recognizes that you're talking about <appname> and runs the extension (the first time it actually asks whether it may access SiriTest data, but that alone means it's working fine).
Now:
– Switch Siri to German.
– Edit the extension run scheme to feed Siri the query "Spiele Running in SiriTest".
– The result is wrong: Siri does not recognize that SiriTest was referred to and treats this as a web search, reporting that it has not found anything meaningful.
Is this a bug (which is what I'm suspecting), or am I doing anything wrong here?
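For reference, the handler in the extension boils down to this (a sketch; the class name is illustrative, the full project is in the linked repo):

import Intents

final class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Hand playback off to the app; with Siri set to German this is never reached.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}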