I have a CALayer and I'd like to animate a property on it. But the property that triggers the animation change is different from the one that is being changed. A basic example of what I'm trying to do is below. I'm trying to create an animation on count by changing triggerProperty. This example is simplified (in my project, triggerProperty is not an Int but a more complex non-animatable type, so I'm trying to animate it by creating animations for some of its properties that can be matched to a CABasicAnimation, and rendering a version of that class based on the interpolated values).
@objc
class AnimatableLayer: CALayer {
    @NSManaged var triggerProperty: Int
    @NSManaged var count: Int

    override init() {
        super.init()
        triggerProperty = 1
        setNeedsDisplay()
    }

    override init(layer: Any) {
        super.init(layer: layer)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override class func needsDisplay(forKey key: String) -> Bool {
        return key == String(keypath: \AnimatableLayer.triggerProperty) || super.needsDisplay(forKey: key)
    }

    override func action(forKey event: String) -> (any CAAction)? {
        if event == String(keypath: \AnimatableLayer.triggerProperty) {
            if let presentation = self.presentation() {
                let keyPath = String(keypath: \AnimatableLayer.count)
                let animation = CABasicAnimation(keyPath: keyPath)
                animation.duration = 2.0
                animation.timingFunction = CAMediaTimingFunction(name: CAMediaTimingFunctionName.linear)
                animation.fromValue = presentation.count
                animation.toValue = 10
                return animation
            }
        }
        return super.action(forKey: event)
    }

    override func draw(in ctx: CGContext) {
        print("draw")
        NSGraphicsContext.saveGraphicsState()
        let nsctx = NSGraphicsContext(cgContext: ctx, flipped: true) // create NSGraphicsContext
        NSGraphicsContext.current = nsctx // set current context
        let renderText = NSAttributedString(string: "\(self.presentation()?.count ?? self.count)", attributes: [.font: NSFont.systemFont(ofSize: 30)])
        renderText.draw(in: bounds)
        NSGraphicsContext.restoreGraphicsState()
    }

    func animate() {
        print("animate")
        self.triggerProperty = 10
    }
}
With this code, the animation isn't triggered. It seems to get triggered only if the animation's keypath matches the one on the event (in the action func).
Is it possible to do something like this?
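For comparison, here is a rough sketch of the explicit-animation fallback I'd rather avoid, a hypothetical animateExplicitly() method on AnimatableLayer (it assumes needsDisplay(forKey:) is extended to also return true for count so the layer redraws each frame):

// Hypothetical explicit fallback, for comparison only; not the implicit behavior I'm after.
// Assumes needsDisplay(forKey:) also covers "count" so draw(in:) runs during the animation.
func animateExplicitly() {
    let keyPath = String(keypath: \AnimatableLayer.count)
    let animation = CABasicAnimation(keyPath: keyPath)
    animation.duration = 2.0
    animation.timingFunction = CAMediaTimingFunction(name: .linear)
    animation.fromValue = presentation()?.count ?? count
    animation.toValue = 10
    add(animation, forKey: keyPath)
    count = 10 // update the model value so the layer ends on the final state
}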
Hello Apple Developer Community,
I am encountering an issue with app icon rendering after updating an app on devices running iOS 18 or newer. Below are the details:
Issue Summary:
When updating an app from a previous version (with separate light and dark mode icons) to the latest version (where both modes use the same icon), the icon changes are not reflected consistently across all system menus.
Steps to Reproduce:
Set the device mode to Dark Mode.
Install the previous app version (with different icons for light and dark modes).
Update the app to the latest version (where both modes use the same icon).
Change the device mode to Light Mode.
Switch back to Dark Mode.
Expected Behavior:
The app icon should remain consistent across all system menus (Home Screen, Spotlight search, etc.) when switching between Light and Dark Modes.
Observed Behavior:
The app icon displays correctly on the Home Screen but inconsistencies appear in other menus, such as Spotlight search or when toggling between modes.
For instance, in Dark Mode, the icon may revert to the previous black-colored logo or display incorrectly compared to the updated design.
Additional Notes:
The asset catalog is configured correctly, with identical icons set for both light and dark modes in the latest app version.
Incrementing the build number was implemented during the update.
A manual device restart resolves the issue on some devices, but not consistently.
Questions for the Community:
Has anyone else experienced similar app icon caching or rendering issues in iOS 18 or later?
Are there known workarounds or specific configurations to ensure consistent icon rendering across all system menus?
Could this be related to iOS 18's icon caching or appearance handling mechanisms?
Your insights and suggestions would be greatly appreciated. Thank you for your time!
In my iOS app I present a QLPreviewController in which I want to display a locally stored video from the iPhone's Documents directory.
let previewController = QLPreviewController()
previewController.dataSource = self
self.present(previewController, animated: true, completion: nil)
func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    let url = urlForPreview
    return url! as QLPreviewItem
}
This seems to work fine for all but one of my TestFlight users. He is using an iPhone 12 with iOS 18.0.1. The screen becomes unresponsive: he cannot pause the video, share it, or close the QLPreviewController.
In his logfile I see the following error...
[AVAssetTrack loadValuesAsynchronouslyForKeys:completionHandler:] invoked with unrecognized keys (
"currentVideoTrack.preferredTransform")
Any ideas?
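For reference, the data source side is minimal; a rough sketch of the full set of methods (the view controller name and the single-item count are just illustrative):

import QuickLook
import UIKit

extension VideoPreviewViewController: QLPreviewControllerDataSource {
    // QLPreviewController asks how many items to preview; here it is a single local video.
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    // Hands back the locally stored file URL as the preview item.
    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        return urlForPreview! as QLPreviewItem
    }
}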
Problem
I am developing a WebDriver agent for automation and using dictionaryRepresentation to retrieve the coordinates of the iOS app hierarchy. However, I am encountering an issue with the accuracy of the x and y coordinates.
Approach Tried
I tested the setup on:
iPhone 12 Pro Max (iOS 16.2): Accuracy issues with the coordinates were observed.
iPhone SE (3rd Generation) (iOS 16.2): Coordinates were accurate for tap actions, with no issues identified.
Observation
It appears that devices with fingerprint biometric authentication provide accurate coordinates.
Can anyone help me understand whether there is anything wrong in the code? Or do we have to adjust the frame of the element for different devices?
Sample Code
- (NSDictionary *)json_tree
{
  NSDictionary<XCUIElementAttributeName, id> *dictionaryRepresentation =
      [[self snapshotWithError:nil] dictionaryRepresentation];
  return [self.class dictionaryForElementAttributes:dictionaryRepresentation recursive:YES];
}

// This method converts the dictionary to CGRect, handling any broken frame values (e.g., Infinity)
+ (CGRect)handleBrokenFrameFromDict:(id)frameDict {
  if ([frameDict isKindOfClass:[NSDictionary class]]) {
    CGFloat originX = [frameDict[@"X"] floatValue];
    CGFloat originY = [frameDict[@"Y"] floatValue];
    CGFloat sizeWidth = [frameDict[@"Width"] floatValue];
    CGFloat sizeHeight = [frameDict[@"Height"] floatValue];
    CGRect frame = CGRectMake(originX, originY, sizeWidth, sizeHeight);
    // Replace Infinity values with CGRectZero
    return (isinf(frame.size.width) || isinf(frame.size.height)
            || isinf(frame.origin.x) || isinf(frame.origin.y))
        ? CGRectZero // or another predefined constant like BROKEN_RECT
        : CGRectIntegral(frame);
  }
  return CGRectZero; // If frameDict is not a valid dictionary, return CGRectZero
}

// This method converts CGRect into a dictionary representation for "rect"
+ (NSDictionary *)rectDictionaryFromCGRect:(CGRect)rect {
  return @{
    @"x": @(rect.origin.x),
    @"y": @(rect.origin.y),
    @"width": @(rect.size.width),
    @"height": @(rect.size.height)
  };
}

+ (NSString *)label:(NSDictionary<XCUIElementAttributeName, id> *)dict
{
  XCUIElementType elementType = [dict[XCUIElementAttributeNameElementType] intValue];
  NSString *label = dict[XCUIElementAttributeNameLabel];
  if (elementType == XCUIElementTypeTextField || elementType == XCUIElementTypeSecureTextField) {
    return label;
  }
  return FBTransferEmptyStringToNil(label);
}

+ (NSString *)name:(NSDictionary<XCUIElementAttributeName, id> *)dict
{
  NSString *identifier = dict[XCUIElementAttributeNameIdentifier];
  if (nil != identifier && identifier.length != 0) {
    return identifier;
  }
  NSString *label = dict[XCUIElementAttributeNameLabel];
  return FBTransferEmptyStringToNil(label);
}

+ (NSString *)value:(NSDictionary<XCUIElementAttributeName, id> *)dict
{
  id value = dict[XCUIElementAttributeNameValue];
  XCUIElementType elementType = [dict[XCUIElementAttributeNameElementType] intValue];
  if (elementType == XCUIElementTypeStaticText) {
    NSString *label = [self label:dict];
    value = FBFirstNonEmptyValue(value, label);
  } else if (elementType == XCUIElementTypeButton) {
    NSNumber *isSelected = [dict[XCUIElementAttributeNameSelected] boolValue] ? @YES : nil;
    value = FBFirstNonEmptyValue(value, isSelected);
  } else if (elementType == XCUIElementTypeSwitch) {
    value = @([value boolValue]);
  } else if (elementType == XCUIElementTypeTextView ||
             elementType == XCUIElementTypeTextField ||
             elementType == XCUIElementTypeSecureTextField) {
    NSString *placeholderValue = dict[XCUIElementAttributeNamePlaceholderValue];
    value = FBFirstNonEmptyValue(value, placeholderValue);
  }
  value = FBTransferEmptyStringToNil(value);
  if (value) {
    value = [NSString stringWithFormat:@"%@", value];
  }
  return value;
}

+ (NSDictionary *)dictionaryForElementAttributes:(NSDictionary<XCUIElementAttributeName, id> *)dict recursive:(BOOL)recursive
{
  NSMutableDictionary *info = [[NSMutableDictionary alloc] init];
  info[@"type"] = [FBElementTypeTransformer shortStringWithElementType:[dict[XCUIElementAttributeNameElementType] intValue]];
  info[@"rawIdentifier"] = FBValueOrNull([dict[XCUIElementAttributeNameIdentifier] isEqual:@""] ? nil : dict[XCUIElementAttributeNameIdentifier]);
  info[@"name"] = FBValueOrNull([self name:dict]);
  info[@"value"] = FBValueOrNull([self value:dict]);
  info[@"label"] = FBValueOrNull([self label:dict]);
  // Handle the frame value
  CGRect frame = [self handleBrokenFrameFromDict:dict[XCUIElementAttributeNameFrame]];
  info[@"frame"] = NSStringFromCGRect(frame);
  // Add the rect value
  info[@"rect"] = [self rectDictionaryFromCGRect:frame];
  info[@"isEnabled"] = [@([dict[XCUIElementAttributeNameEnabled] boolValue]) stringValue];
  // visible
  // accessible
  info[@"isFocused"] = [@([dict[XCUIElementAttributeNameHasFocus] boolValue]) stringValue];
  if (!recursive) {
    return info.copy;
  }
  NSArray<NSDictionary<XCUIElementAttributeName, id> *> *childElements = [dict[XCUIElementAttributeNameChildren] isKindOfClass:[NSArray class]] ? dict[XCUIElementAttributeNameChildren] : @[];
  if ([childElements count]) {
    info[@"children"] = [[NSMutableArray alloc] init];
    for (NSDictionary<XCUIElementAttributeName, id> *childSnapshot in childElements) {
      [info[@"children"] addObject:[self dictionaryForElementAttributes:childSnapshot recursive:YES]];
    }
  }
  return info;
}
Exploring the Live Activity feature for Apple Watch right now, and I found that it has a default view with an "Open on iPhone" button when you tap the Live Activity. That button brings the iOS app to the foreground, exactly as if you had tapped the Live Activity on iOS.
Is there a way to mimic that behavior from inside Watch app code? From inside WKApplicationDelegate, for example.
I tried openSystemURL, but it seems like it's only available for tel or sms links.
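Roughly what I tried (a sketch assuming the WKExtension.shared().openSystemURL(_:) entry point; the phone number is a placeholder):

import WatchKit

// Sketch of the openSystemURL attempt mentioned above; this only seems to
// accept tel:/sms: style URLs, not a deep link back into the iOS app.
func openCompanionApp() {
    if let url = URL(string: "tel:5551234567") { // placeholder number
        WKExtension.shared().openSystemURL(url)
    }
}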
I have a map application that needs to show a line (representing a direct route) that is above everything, including annotations. This is important because the map has lots of annotations (possibly hundreds) and the line is representing a route from point to another. With that many annotations being on top the line / route is basically useless because you can't see it.
I've looked at things like MKOverlayLevel but it only supports .aboveRoads or .aboveLabels. Is there a way to set the z-axis of a map overlay so that it truly is on top of everything else on the map, including annotations? And if not directly in MapKit, what other options might I have?
Worth noting that I'm targeting iOS 16.4 and above, so that's my limitation here.
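For reference, a minimal sketch of what I've tried with MKOverlayLevel (the view controller, outlet, and coordinates are placeholders):

import MapKit
import UIKit

class RouteMapViewController: UIViewController, MKMapViewDelegate {
    @IBOutlet var mapView: MKMapView!

    // Placeholder route; in the real app this connects the two points of interest.
    let routeCoordinates = [
        CLLocationCoordinate2D(latitude: 37.33, longitude: -122.03),
        CLLocationCoordinate2D(latitude: 37.77, longitude: -122.42)
    ]

    func addDirectRoute() {
        let routeLine = MKPolyline(coordinates: routeCoordinates, count: routeCoordinates.count)
        // .aboveLabels is the highest available level, but the line still draws beneath annotation views.
        mapView.addOverlay(routeLine, level: .aboveLabels)
    }

    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        let renderer = MKPolylineRenderer(overlay: overlay)
        renderer.strokeColor = .systemBlue
        renderer.lineWidth = 4
        return renderer
    }
}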
I am using an iPhone 11 with iOS version 18.1 and I found an issue with call recording during a FaceTime audio call. The call gets dropped as soon as call recording starts. This bug is also reproducible on an iPhone 14 Pro Max with the same iOS version. I tried it 5/5 times and it is 100% reproducible. Can you please help fix this issue? This is a serious quality concern by Apple's standards.
I'm doing statistical formulas and need the keyboard shortcut of the symbol used to represent standard deviation (sigma), which should look like (σ).
Everything online suggests the keyboard shortcut Option + W, but when I use that shortcut I get ∑ instead. I've searched the OS settings and there doesn't seem to be a place to change or determine the proper keyboard shortcut.
The keyboard shortcut for the statistical mean (mu), µ, is working, and greater than or equal to (≥) and less than or equal to (≤) are also working.
I want to play remote videos using an AVPlayer in my SwiftUI App. However, I can't fix the error:
"Main thread blocked by synchronous property query on not-yet-loaded property (PreferredTransform) for HTTP(S) asset. This could have been a problem if this asset were being read from a slow network."
My code looks like this atm:
struct CustomVideoPlayer: UIViewControllerRepresentable {
    let myUrl: URL

    func makeCoordinator() -> Coordinator {
        return Coordinator(self)
    }

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let playerItem = AVPlayerItem(url: myUrl)
        let player = AVQueuePlayer(playerItem: playerItem)
        let playerViewController = AVPlayerViewController()
        playerViewController.player = player
        context.coordinator.setPlayerLooper(player: player, templateItem: playerItem)
        playerViewController.delegate = context.coordinator
        playerViewController.beginAppearanceTransition(true, animated: false)
        return playerViewController
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
    }

    static func dismantleUIViewController(_ uiViewController: AVPlayerViewController, coordinator: ()) {
        uiViewController.beginAppearanceTransition(false, animated: false)
    }

    class Coordinator: NSObject, AVPlayerViewControllerDelegate {
        var parent: CustomVideoPlayer
        var player: AVPlayer? = nil
        var playerLooper: AVPlayerLooper? = nil

        init(_ parent: CustomVideoPlayer) {
            self.parent = parent
            super.init()
        }

        func setPlayerLooper(player: AVQueuePlayer, templateItem: AVPlayerItem) {
            self.player = player
            playerLooper = AVPlayerLooper(player: player, templateItem: templateItem)
        }
    }
}
I already tried creating the AVPlayerItem/AVAsset on a background thread and I also tried loading the properties asynchronously before setting the player in makeUIViewController:
let player = AVQueuePlayer(playerItem: nil)
...
Task {
    let asset = AVAsset(url: myUrl)
    let _ = try await asset.load(.preferredTransform)
    let item = AVPlayerItem(asset: asset)
    player.replaceCurrentItem(with: item)
}
Nothing seems to fix the issue (by the way, the main thread is actually blocked; there is a noticeable animation hitch).
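For completeness, this is roughly the SwiftUI-level variant of that async-loading attempt (the item-based CustomVideoPlayer initializer here is hypothetical, just to illustrate handing over a fully loaded item):

import SwiftUI
import AVFoundation

struct RemoteVideoView: View {
    let myUrl: URL
    @State private var readyItem: AVPlayerItem?

    var body: some View {
        Group {
            if let item = readyItem {
                CustomVideoPlayer(playerItem: item) // hypothetical item-based initializer
            } else {
                ProgressView()
            }
        }
        .task {
            // Load the properties AVPlayerViewController would otherwise query synchronously.
            let asset = AVURLAsset(url: myUrl)
            _ = try? await asset.load(.preferredTransform, .duration)
            readyItem = AVPlayerItem(asset: asset)
        }
    }
}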
Any help is much appreciated.
I'm trying to generate a PDF from a view. The view looks nice; there is a lot of transparency and many gradients (circular and linear). But if I use ImageRenderer the way the documentation suggests, all transparency and gradients disappear. Is this a bug or a feature? Is there a way to generate vector graphics from a view with transparency and gradients? PDF supports those features, so why not?
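For reference, this is roughly the documented render path I'm using (ContentToExport and the output URL are placeholders):

import SwiftUI

// Sketch of the PDF export path described above; the view name and output location are placeholders.
@MainActor
func exportPDF() {
    let renderer = ImageRenderer(content: ContentToExport())
    let url = URL.documentsDirectory.appending(path: "export.pdf")

    renderer.render { size, renderInContext in
        var mediaBox = CGRect(origin: .zero, size: size)
        guard let pdfContext = CGContext(url as CFURL, mediaBox: &mediaBox, nil) else { return }
        pdfContext.beginPDFPage(nil)
        renderInContext(pdfContext) // draws the SwiftUI view into the PDF context
        pdfContext.endPDFPage()
        pdfContext.closePDF()
    }
}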
State of Mind is an amazing feature, and I want to provide a similar experience to the Journal app, making it easy to record emotions.
Could you consider making a public State of Mind record UI API?
I'm using Core Data to save data. Then I wanna add spotlight support.
self.spotlightDelegate = StorageSpotlightDelegate(forStoreWith: description, coordinator: container.persistentStoreCoordinator)

let isSpotlightDisable = UserDefaults.standard.bool(forKey: "isSpotlightDisable")
if !isSpotlightDisable {
    self.toggleSpotlightIndexing(enable: true)
}

public func toggleSpotlightIndexing(enable: Bool) {
    guard let spotlightDelegate = spotlightDelegate else { return }
    if enable {
        spotlightDelegate.startSpotlightIndexing()
    } else {
        spotlightDelegate.stopSpotlightIndexing()
        spotlightDelegate.deleteSpotlightIndex { error in
            if let error = error {
                print(error)
            }
        }
    }
    UserDefaults.standard.set(!enable, forKey: "isSpotlightDisable")
}
It works fine on an iOS 15 device, but not on iOS 17 and 18.
On iOS 18 devices, I can search the data the first time it is added to Core Data. But if I stop Spotlight indexing and restart it again, the data can never be found in search.
How can I solve this? I noticed that the same problem also exists in another dictionary app.
For the last two versions (18.2 beta 1 and 18.2 beta 2) I haven't been able to successfully update to either of them. I have been running Sequoia on an external drive for testing purposes and didn't have this problem with any of the 18.1 versions.
I'm currently on 18.1 (public), and when I download the update for 18.2 beta 2, the update appears to run (it takes about 30 minutes, showing "preparing update"), then restarts. When the Mac restarts, it just boots straight back into 18.1 without having applied the update.
I want to add a "bubble horizon" to a camera application to show if the user is keeping their phone level.
For this, I'm using the Motion Attitude functionality of CMMotionManager.
However, the output I'm getting is very inaccurate. I'm comparing it with Apple's own Measure app which is dead accurate, so the sensors are working fine. My own readings seem to be several degrees off.
Am I missing some calibration step or something?
- (void)processDeviceMotion:(CMDeviceMotion *)motion {
  // use quaternions to avoid Gimbal Lock
  CMQuaternion quat = motion.attitude.quaternion;

  // calculate roll in degrees
  double roll = atan2( 2 * ( quat.w * quat.x + quat.y * quat.z ), 1 - 2 * ( quat.x * quat.x + quat.y * quat.y ) );
  roll = radiansToDegrees( roll );

  NSLog( @"Roll: %f", roll );
}
Hi,
Have been trying to work with MapKit JS for a website, but I'm stumped on one basic capability: I want to be able to click on a point of interest and perform some actions such as:
Get its coordinates
Attach an annotation to it (e.g. a callout)
In my code, PointOfInterest features are selectable:
map.selectableMapFeatures = [
    mapkit.MapFeatureType.PointOfInterest,
];
But when I click on one, I do see the marker pop up but nothing else (which is not much help since there is no additional information in the marker itself). I see no event getting triggered that I can do something with.
I am using an event listener as follows:
map.addEventListener('single-tap', (event) => {
    const coordinate = map.convertPointOnPageToCoordinate(event.pointOnPage);
    console.log('Map tapped at:', coordinate);
    console.log('Map tapped event:', event);
    ...
I guess I have to grab the Place ID somehow but I don't know how to.
Thanks for any help.
Hi,
WWDC24 videos have a lot of references to an "Image Playground" API, and the "What's New in AppKit" session even shows it in action, with an "ImagePlaygroundViewController". However, there doesn't seem to be any access to the new API, even with Xcode 16.2 beta. Am I missing something, or is that 'coming later'?
I am making a Swift app supporting multiple languages, showing the proper UI for the language set on the user's phone. I want to launch a different screen (showing a different image, boot-en.jpg or boot-ja.jpg) per language, so I created two launch screen files, LaunchScreen-en.storyboard and LaunchScreen-ja.storyboard, localized them, and added a different UIImage to each.
Then I created two InfoPlist.strings files configured with
"UILaunchStoryboardName" = "LaunchScreen_en";
"UILaunchStoryboardName" = "LaunchScreen_ja";
and then configured Info.plist with
UILaunchStoryboardName = LaunchScreen
After all these steps, I build and run, hoping to see the launch screen show boot-ja.jpg when the phone's language is Japanese and boot-en.jpg when it is English, but it shows a black screen. How can I fix this problem? Thank you.
How can I test biometrics in UI tests in Swift / iOS 18? This code is not working.
+ (void)successfulAuthentication {
  notify_post("com.apple.BiometricKit_Sim.fingerTouch.match");
  notify_post("com.apple.BiometricKit_Sim.pearl.match");
}

+ (void)unsuccessfulAuthentication {
  notify_post("com.apple.BiometricKit_Sim.fingerTouch.nomatch");
  notify_post("com.apple.BiometricKit_Sim.pearl.nomatch");
}
I'm using React Native to create a mobile application. When I click a button in my app, I need to programmatically take a screenshot of the current page of my application together with the iPhone status bar that shows the time, cellular provider, and battery level. However, my app page is being captured without the status bar.
My 'screenshot taken' function is written in Objective-C.
Is this happening because of any privacy-related concerns?
Would you kindly assist me with this?
Attaching the screenshot code,
#import <UIKit/UIKit.h>
#import <React/RCTBridgeModule.h>
#import <React/RCTLog.h>
@interface ScreenshotModule : NSObject
@end
@implementation ScreenshotModule
RCT_EXPORT_MODULE();
RCT_REMAP_METHOD(takeStatusBarScreenshot, resolver:(RCTPromiseResolveBlock)resolve rejecter:(RCTPromiseRejectBlock)reject)
{
  dispatch_async(dispatch_get_main_queue(), ^{
    @try {
      // Get the status bar window
      UIWindow *statusBarWindow = [UIApplication sharedApplication].windows.firstObject;
      UIScene *scene = [UIApplication sharedApplication].connectedScenes.allObjects.firstObject;
      if ([scene isKindOfClass:[UIWindowScene class]]) {
        UIWindowScene *windowScene = (UIWindowScene *)scene;
        BOOL statusBarHidden = windowScene.statusBarManager.isStatusBarHidden;
        if (statusBarHidden) {
          NSLog(@"Status bar is hidden, app is in full-screen mode.");
        } else {
          NSLog(@"Status bar is visible.");
        }
      } else {
        NSLog(@"The scene is not a UIWindowScene.");
      }

      // Check if the statusBarWindow is valid
      if (!statusBarWindow) {
        reject(@"screenshot_failed", @"Status bar window not found", nil);
        return;
      }

      // Get the window scene and status bar frame
      UIWindowScene *windowScene = statusBarWindow.windowScene;
      CGRect statusBarFrame = windowScene.statusBarManager.statusBarFrame;

      // Log the status bar frame for debugging
      RCTLogInfo(@"Status Bar Frame: %@", NSStringFromCGRect(statusBarFrame));

      // Check if the status bar frame is valid
      if (CGRectIsEmpty(statusBarFrame)) {
        reject(@"screenshot_failed", @"Status bar frame is empty", nil);
        return;
      }

      // Start capturing the status bar
      UIGraphicsBeginImageContextWithOptions(statusBarFrame.size, NO, [UIScreen mainScreen].scale);
      CGContextRef context = UIGraphicsGetCurrentContext();

      // Render the status bar layer
      [statusBarWindow.layer renderInContext:context];

      // Create an image from the current context
      UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
      UIGraphicsEndImageContext();

      if (!image) {
        reject(@"screenshot_failed", @"Failed to capture screenshot", nil);
        return;
      }

      // Convert the image to PNG format and then to a base64 string
      NSData *imageData = UIImagePNGRepresentation(image);
      if (imageData == nil) {
        reject(@"screenshot_failed", @"Image data is nil", nil);
        return;
      }
      NSString *base64String = [imageData base64EncodedStringWithOptions:0];

      // Log base64 string length for debugging
      RCTLogInfo(@"Base64 Image Length: %lu", (unsigned long)[base64String length]);

      // Optionally, save the image to a file (for debugging purposes)
      NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"statusbar_screenshot.png"];
      [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
      RCTLogInfo(@"Status bar screenshot saved to: %@", path);

      // Resolve with the base64 image
      resolve(base64String);
    }
    @catch (NSException *exception) {
      reject(@"screenshot_error", @"Error while capturing status bar screenshot", nil);
    }
  });
}
@end
Hello! I hope you are all doing well. The reason for this post is to ask about Share Sheet behavior, as I am experiencing double Share Sheet behavior: on iOS 14.6, when I trigger the download of 2 files, 2 share pop-ups (2 Share Sheets) open, but on iOS 17.6 it only opens once to download each file. Do you know if this changed at some point between iOS versions, and why?
I leave an example image of the behavior in iOS 14.6: