This is a standalone SwiftUI app for watchOS 11.
To keep things simple, I'm using the new concurrency-based API, like this:
for try await update in CLLocationUpdate.liveUpdates() { … }
watchOS is supposed to show the authorization dialog, but it never appears, even though update.authorizationRequestInProgress is true.
What could be the reason?
(Note that I also tried the old procedural CLLocationManager API; it doesn't work that way either.)
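For completeness, here is the loop slightly expanded (a sketch; the print statements stand in for my actual handling):

import CoreLocation

func startUpdates() async {
    do {
        for try await update in CLLocationUpdate.liveUpdates() {
            if update.authorizationRequestInProgress {
                // This is true, yet no dialog ever appears on the watch.
                print("authorization request in progress")
            }
            if let location = update.location {
                print("got location: \(location)")
            }
        }
    } catch {
        print("liveUpdates failed: \(error)")
    }
}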
I can't seem to upload my standalone watchOS app to TestFlight. Xcode never offers me the proper distribution scheme. I only get file-based distribution mechanisms.
The metadata in App Store Connect is present, the bundle IDs are correct, and this is a freshly created project without any custom content.
How is this supposed to work?
My team and I are working on an app for a private emergency helpline.
Now, as far as I understand the (sparse) API documentation for fall detection, given the appropriate entitlement, the following happens upon a detected fall:
The standard UI opens with the options to a) call SOS, b) acknowledge the fall but state that you're fine, or c) deny the fall; if you don't react within 60 seconds of inactivity, SOS is called implicitly.
All fall detection apps then receive a bit of background time and get func fallDetectionManager(CMFallDetectionManager, didDetect: CMFallDetectionEvent, completionHandler: () -> Void) called with the appropriate event value.
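Translated into code, the delegate side would look roughly like this (a sketch; it assumes the fall detection entitlement has already been granted):

import CoreMotion

final class FallObserver: NSObject, CMFallDetectionDelegate {
    private let manager = CMFallDetectionManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.requestAuthorization { status in
            print("fall detection authorization: \(status.rawValue)")
        }
    }

    func fallDetectionManager(_ fallDetectionManager: CMFallDetectionManager,
                              didDetect event: CMFallDetectionEvent,
                              completionHandler handler: @escaping () -> Void) {
        // event.resolution reports how the user responded in the system UI.
        print("fall at \(event.date), resolution: \(event.resolution.rawValue)")
        handler() // call once our background work is done
    }

    func fallDetectionManagerDidChangeAuthorization(_ fallDetectionManager: CMFallDetectionManager) {
        print("authorization changed: \(fallDetectionManager.authorizationStatus.rawValue)")
    }
}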
Now that's all good and it sounds like the custom fall detection is additive to the standard system.
But why is there something like this in the entitlement request form:
For any emergency calling features that you do not provide, explain any mitigations you use to make sure the user receives emergency services support that’s as close as possible to what they’d receive had they placed an emergency call natively.
This sounds like our app would rather be a drop-in replacement for the standard SOS service – in contrast to being additive, and in contrast to what the API documentation implies.
Am I misunderstanding something?
This is a regression since iOS 13. Is no one at Apple interested in fixing this?
FB9856371
I'm working on hardware that communicates both wirelessly and over a wire with mobile systems. On anything non-i[Pad]OS we can connect via USB and achieve great bandwidth in situations where this is necessary.
Since i[Pad]OS does not support FTDI class-compliant devices through USB (and also omits the IOUSB framework), I wonder whether there is a way to "work around" this, e.g. by (ab)using another protocol that i[Pad]OS allows.
Concretely, do you think it's possible to tunnel our serial data stream via USB HID?
I'm writing a command-line tool for macOS which interfaces with BLE devices. I have a problem with permissions:
If I launch my tool from the command line, it gets killed by the OS. Only when I launch it via the debugger do I get the alert asking to allow Bluetooth access.
My plist containing the NSBluetoothAlwaysUsageDescription key is embedded as __TEXT __info_plist in the binary. Is this no longer enough for a command-line tool to access security-guarded OS facilities these days?
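For reference, the embedding uses the standard -sectcreate linker mechanism (invocation simplified here; in Xcode the same flags would go into Other Linker Flags):

swiftc main.swift -o bletool \
    -Xlinker -sectcreate -Xlinker __TEXT \
    -Xlinker __info_plist -Xlinker Info.plist

where Info.plist contains the NSBluetoothAlwaysUsageDescription key and its explanation string.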
My app uses UDP broadcast to discover a peer on a private network (169.254.x.x). With iOS 14 and iOS 15 everything works fine, but on the newly released iOS 16 release candidate I get 'Socket Error 65: No Route To Host'.
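The sending side boils down to this (a condensed sketch; the broadcast address, port, and payload here are placeholders, the real code derives them from the interface configuration):

import Network

let connection = NWConnection(host: "169.254.255.255", port: 4321, using: .udp)
connection.stateUpdateHandler = { state in
    // On the iOS 16 RC this ends in .failed with POSIX error 65.
    print("state: \(state)")
}
connection.start(queue: .main)
connection.send(content: "DISCOVER".data(using: .utf8),
                completion: .contentProcessed { error in
                    if let error { print("send error: \(error)") }
                })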
I don't have any special entitlements for this app. Did anything change with regard to UDP broadcast policies from iOS 15 to iOS 16?
For iOS, TestFlight beta review is only necessary when you update the version number. For macOS, I find beta review to be necessary every time the build number changes. Is this expected?
Why is the UIKeyboard implementation still holding a reference to this UITextField, thus keeping it from being deallocated?
The memory debugger shows:
UIKeyboardImpl -> UIKBAutofillController -> NSMutableDictionary -> NSMutable...(Storage) -> UITextField
Any idea what's going on there?
I'm wondering whether anyone at Apple is still working on the External Accessory framework. In particular, I find it quite sad that the EABluetoothAccessoryPicker has been broken since iOS 13, but apparently no one cares.
For the record, it's also FB9856371.
Is there any proper documentation on how the thread properties qualityOfService and threadPriority are related?
Is one the broad sword and the other the fine blade? If so, which is which?
Background: I have an iOS app that communicates with hardware. For some operations the hardware has tight timing constraints, so I need to prioritize the communication thread over the UI thread (which is rarely necessary, but in this case required to prevent damaging the hardware).
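For concreteness, this is the kind of setup I mean (a sketch; the concrete values are placeholders):

import Foundation

let worker = Thread {
    // time-critical hardware communication runs here
}
worker.qualityOfService = .userInteractive  // the coarse QoS class
worker.threadPriority = 0.9                 // the fine-grained knob, 0.0...1.0?
worker.start()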
Can anyone shed light on this, or point me to where I can read up on the exact scheduler behavior?
Is there any way to speed up the processing time for the multicast entitlement? I requested it at the same time as one of my colleagues. We both already have successful apps in the store, and we used more or less the same explanation.
My colleague got it after 3 days; I've been waiting for two weeks now. This is a bit sad, as I've worked several months on an app that uses UDP multicast to discover link-local device peers, and this is a complete BLOCKER for me.
…and nothing more on the page. Not only on my account, but also on other teams I have access to.
Is that expected? Will it be rolled out later for beta participants or is it just broken due to high demand?
It looks like CoreBluetooth is completely broken on macOS 12 – at least for command-line apps. I have a pretty simple scanning app and I no longer get any delegate calls for discovered devices. I DO get the "powered on" state change delegate call, though.
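For reference, the tool is essentially this, stripped down to the repro:

import CoreBluetooth

final class Scanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        print("state: \(central.state.rawValue)") // this fires as expected
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil)
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print("discovered: \(peripheral)") // never called on macOS 12
    }
}

let scanner = Scanner()
RunLoop.main.run()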
Any idea?
I'm trying to export a series of JPG images into an H.264 movie (ideally it would be a Motion JPEG movie, but unfortunately our AVAssetWriter does not support this codec). The images originate from a professional surveillance camera with VFR (variable frame rate), hence I compute the CMTime based on the time each frame was captured, which results in a non-constant frame rate. When I do this, the AVAssetWriter always fails to render the movie:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-16364), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x604000245a90 {Error Domain=NSOSStatusErrorDomain Code=-16364 "(null)"}}

When I change the CMTime instances to e.g.

CMTimeMake(value: frameIndex * 150, timescale: 600)

it works, albeit with slightly wrong output. Any idea how I could fix this? Is AVAssetWriter supposed to support this scenario?
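For context, my appending code is roughly the following (a condensed sketch; the function and parameter names are mine, not actual project code):

import AVFoundation

func appendFrames(pixelBuffers: [CVPixelBuffer],
                  captureDates: [Date],
                  writerInput: AVAssetWriterInput,
                  adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    guard let firstDate = captureDates.first else { return }
    for (index, buffer) in pixelBuffers.enumerated() {
        // Presentation time derived from the camera's capture timestamp,
        // hence the variable frame rate.
        let seconds = captureDates[index].timeIntervalSince(firstDate)
        let time = CMTime(seconds: seconds, preferredTimescale: 600)
        while !writerInput.isReadyForMoreMediaData {
            Thread.sleep(forTimeInterval: 0.01)
        }
        if !adaptor.append(buffer, withPresentationTime: time) {
            print("append failed at frame \(index)")
        }
    }
}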