I have a UIImage subclass that holds additional metadata (yes, I could resort to composition instead of inheritance, but that would require a major rewrite of my code). Instances of this subclass are not displayed in WKInterfaceImage. Any idea how to fix that?
I'm trying to export a series of JPG images into an H.264 movie (ideally it would be a Motion JPEG movie, but unfortunately our AVAssetWriter does not support this codec). The images originate from a professional surveillance camera with VFR (variable frame rate), hence I compute the CMTime based on the time they were captured, which results in a non-constant frame rate. When I do this, AVAssetWriter always fails to render the movie:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-16364), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x604000245a90 {Error Domain=NSOSStatusErrorDomain Code=-16364 "(null)"}}

When I change the CMTime instances to, e.g., CMTimeMake( frameIndex * 150, 600 ), it works, albeit resulting in slightly wrong output. Any idea how I could fix this? Is AVAssetWriter supposed to support that scenario?
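One thing worth ruling out: -11800 with an underlying -16364 is often triggered by presentation timestamps that are not strictly increasing (VFR cameras can deliver duplicate or out-of-order capture times). Below is a minimal, hedged sketch of how one might convert capture dates into strictly increasing (value, timescale: 600) pairs before wrapping them in CMTimeMake; the type and function names are my own, not from any Apple sample, and this is pure Foundation so the CoreMedia call itself is left out:

```swift
import Foundation

/// One (value, timescale) pair, ready to be handed to CMTimeMake(value, 600).
struct FrameTime: Equatable {
    let value: Int64
    let timescale: Int32 = 600
}

/// Converts absolute capture dates (VFR) into strictly increasing
/// presentation times with a timescale of 600, nudging duplicates forward
/// by one tick so AVAssetWriter never sees a non-increasing timestamp.
func presentationTimes(for captureDates: [Date]) -> [FrameTime] {
    guard let first = captureDates.first else { return [] }
    var result: [FrameTime] = []
    var lastValue: Int64 = -1
    for date in captureDates {
        // Offset from the first frame, rounded to 1/600 s ticks.
        var value = Int64((date.timeIntervalSince(first) * 600.0).rounded())
        // Duplicate or out-of-order capture time: push it one tick forward.
        if value <= lastValue { value = lastValue + 1 }
        result.append(FrameTime(value: value))
        lastValue = value
    }
    return result
}
```

The first frame lands at value 0, which matches a startSession(atSourceTime: .zero) setup; if your session starts elsewhere, add that offset to every value.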
I have a program that handles INPlayMediaIntent in an extension. This works fine as long as you are using Siri in English. Switching to another language makes Siri treat the query as a web search. To reproduce, please check out https://github.com/mickeyl/iOS-Bug-Example-Projects and have a look at the Siri-Localized-MediaIntents-Broken folder. Then:
– Switch Siri to English
– Edit the extension run scheme to feed Siri with, e.g., "Play Running in SiriTest"
– The result is fine: Siri recognizes that you're talking about <appname> and runs the extension (actually, the first time it asks you whether it can have access to SiriTest data, but that alone means it's working fine).
Now:
– Switch Siri to German
– Edit the extension run scheme to feed Siri with the query "Spiele Running in SiriTest"
– The result is wrong: Siri does not recognize that SiriTest has been referred to and treats this as a web search, thus reporting that it has not found anything meaningful.
Is this a bug (which is what I'm suspecting) or am I doing anything wrong here?
Why is the UIKeyboard implementation still holding a reference to this UITextField, thus keeping it from being deallocated?
The memory debugger shows:
UIKeyboardImpl -> UIKBAutofillController -> NSMutableDictionary -> NSMutable...(Storage) -> UITextField
Any idea what's going on there?
I'm writing a command-line tool for macOS which interfaces with BLE devices. I have a problem with regard to permissions:
If I launch my tool on the command line, it gets killed by the OS. Only if I launch it via the debugger do I get the alert asking to allow the Bluetooth permission.
My plist that contains the NSBluetoothAlwaysUsageDescription key is embedded as __TEXT,__info_plist in the binary. Is this no longer enough for a command-line tool to access security-guarded OS facilities these days?
It looks like CoreBluetooth is completely broken in macOS 12 – at least for command line apps. I have a pretty simple scanning app and I no longer get any delegate calls for discovered devices. I DO get the "power up" state change delegate call though.
Any idea?
…and nothing more on the page. Not only on my account, but also on other teams I have access to.
Is that expected? Will it be rolled out later for beta participants or is it just broken due to high demand?
My app uses UDP broadcast to discover a peer on a private network (169.254..). With iOS 14 and iOS 15 everything works fine, but on the newly released iOS 16 release candidate I get 'Socket Error 65: No Route To Host'.
I don't have any special entitlements for this app. Did anything change with regards to UDP broadcast policies from iOS 15 to iOS 16?
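For reference, this is how the socket is set up. A minimal sketch with plain POSIX sockets (not Network.framework) of creating a UDP socket with SO_BROADCAST enabled; the function name is mine, and whether iOS 16 additionally demands the multicast entitlement for broadcast sends on top of this flag is exactly the open question:

```swift
import Foundation
#if canImport(Glibc)
import Glibc
private let sockDgram = Int32(SOCK_DGRAM.rawValue)
#else
import Darwin
private let sockDgram = SOCK_DGRAM
#endif

/// Opens an IPv4 UDP socket and enables SO_BROADCAST on it.
/// Returns the file descriptor, or nil if either step fails.
func makeBroadcastSocket() -> Int32? {
    let fd = socket(AF_INET, sockDgram, 0)
    guard fd >= 0 else { return nil }
    var on: Int32 = 1
    // Without SO_BROADCAST, sendto() to a broadcast address fails with EACCES;
    // on iOS 16 the same send apparently now yields EHOSTUNREACH (65) instead.
    let rc = setsockopt(fd, SOL_SOCKET, SO_BROADCAST,
                        &on, socklen_t(MemoryLayout<Int32>.size))
    guard rc == 0 else { close(fd); return nil }
    return fd
}
```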
Is there any way to speed up the processing time for the multicast entitlement? I did request it at the same time as one of my colleagues. We already both have successful apps in the store and we more or less used the same explanation.
My colleague got it after 3 days; I'm still waiting after 2 weeks now. This is a bit sad, as I've worked several months on an app that uses UDP multicast to discover link-local device peers, and this is a complete BLOCKER for me.
Is there any proper documentation for how the thread properties qualityOfService and priority are related?
Is one the broadsword and the other the scalpel? If so, which is which?
Background: I have an iOS app that communicates with hardware. For some operations, the hardware has tight timelines, hence I would need to prioritize the communication thread over the UI thread (which is rare, but in this case necessary in order to prevent damaging the hardware).
Can anyone shed light on that, or is there somewhere I can read up on the exact scheduler behavior?
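For what it's worth, my current understanding (which I'd love to have confirmed) is that qualityOfService is the modern, coarse scheduler classification and threadPriority the legacy fine-grained knob within a class. One way to express the "hardware I/O beats UI work" intent with QoS alone is a dedicated high-QoS serial queue; the queue label and function below are made up for illustration, and on Apple platforms the same classification is available via Thread.qualityOfService:

```swift
import Dispatch
import Foundation

/// A dedicated serial queue for time-critical hardware I/O.
/// .userInteractive is the highest QoS class, scheduled ahead of the
/// .userInitiated / .default work that typically backs UI updates.
let hardwareQueue = DispatchQueue(label: "com.example.hardware-io",
                                  qos: .userInteractive)

/// Runs a (simulated) time-critical transfer on the high-QoS queue and
/// blocks the caller until it finishes.
func performCriticalTransfer(_ payload: [UInt8]) -> Int {
    var checksum = 0
    hardwareQueue.sync {
        // Simulated wire transfer: fold the payload into an 8-bit checksum.
        checksum = payload.reduce(0) { ($0 &+ Int($1)) & 0xFF }
    }
    return checksum
}
```

This sidesteps raw thread priorities entirely, which (as far as I can tell) is what Apple recommends these days; whether that is actually sufficient for hard hardware deadlines is the part I'm unsure about.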
I'm wondering whether anyone @ Apple is still working on the External Accessory framework. In particular I find it quite sad that the EABluetoothAccessoryPicker has been broken since iOS 13, but apparently no one cares.
For the record, it's also FB9856371.
For iOS, TestFlight beta review is only necessary when you update the version number. For macOS, I find beta review to be necessary every time the build number updates. Is this expected?
This is a regression since iOS 13. Is there no one at Apple interested in fixing this?
FB9856371
I'm working on hardware that communicates wirelessly and wired with mobile systems. On anything non-i[Pad]OS we can connect via USB and achieve great bandwidth in situations where this is necessary.
Since i[Pad]OS does not support FTDI class-compliant devices through USB (and also omits the IOUSB framework), I wonder whether we have a way to "work around" this, e.g., how about (ab)using another protocol that i[Pad]OS allows?
Concretely, would you think it's possible to tunnel our serial data stream via USBHID?
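Purely as a thought experiment, the framing side of such a tunnel seems straightforward. Below is a sketch with an entirely made-up wire format: fixed 64-byte HID reports where byte 0 carries the payload length and the rest is zero-padded (whether i[Pad]OS would actually enumerate such a device on a custom HID usage page is the real question):

```swift
import Foundation

/// Hypothetical wire format: 64-byte HID reports, byte 0 = payload length,
/// bytes 1...length = payload, remainder zero-padded.
let reportSize = 64
let maxPayload = reportSize - 1

/// Splits an arbitrary serial byte stream into fixed-size HID reports.
func encodeReports(_ stream: [UInt8]) -> [[UInt8]] {
    stride(from: 0, to: stream.count, by: maxPayload).map { start in
        let chunk = Array(stream[start..<min(start + maxPayload, stream.count)])
        var report = [UInt8](repeating: 0, count: reportSize)
        report[0] = UInt8(chunk.count)
        report.replaceSubrange(1...chunk.count, with: chunk)
        return report
    }
}

/// Reassembles the serial stream from a sequence of reports,
/// dropping the length header and the zero padding of each one.
func decodeReports(_ reports: [[UInt8]]) -> [UInt8] {
    reports.flatMap { report -> [UInt8] in
        let len = Int(report[0])
        return len == 0 ? [] : Array(report[1...len])
    }
}
```

Flow control and error recovery would still have to live on top of this, since HID gives you neither.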
I can't seem to upload my standalone watchOS app to TestFlight. Xcode never offers me the proper distribution scheme. I only get file-based distribution mechanisms.
The metadata in App Store Connect is present, the bundle IDs are correct, and this is a freshly created project without any custom content.
How is this supposed to work?