Is there any approach or sample code available to use these APIs:
.chartScrollableAxes(.horizontal)
.chartScrollPosition(x: ...)
.chartScrollPosition(initialX: ...)
.chartScrollTargetBehavior(...)
.chartXVisibleDomain(length: ...)
Plus a gesture recognizer (pinch or magnification) to create a Swift Chart with an X axis that can be zoomed in or out with a pinch gesture? And when zoomed in at any level above zero, the chart can then be scrolled left to right along the X axis.
I've had success using .chartScrollableAxes with .chartXSelection in parallel, so I would also like to keep the ability to select X values.
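For reference, here is a rough sketch of one way these modifiers might be combined (the data model, domain limits, and zoom clamping below are made up for illustration, and the magnification gesture may still compete with the chart's built-in scroll gesture):

import SwiftUI
import Charts

// Hypothetical sample type, for illustration only.
struct Sample: Identifiable {
    let id = UUID()
    let x: Int
    let y: Double
}

struct ZoomableChart: View {
    let samples: [Sample] = (0..<200).map { Sample(x: $0, y: .random(in: 0...10)) }

    @State private var visibleLength = 50          // current X-axis window, in data units
    @State private var lengthAtGestureStart: Int?  // window size when the pinch began
    @State private var selectedX: Int?             // kept so .chartXSelection still works

    var body: some View {
        Chart(samples) { sample in
            LineMark(x: .value("X", sample.x), y: .value("Y", sample.y))
        }
        .chartScrollableAxes(.horizontal)
        .chartXVisibleDomain(length: visibleLength)
        .chartXSelection(value: $selectedX)
        .gesture(
            MagnificationGesture()
                .onChanged { scale in
                    if lengthAtGestureStart == nil { lengthAtGestureStart = visibleLength }
                    // Pinching out (scale > 1) shrinks the visible domain, i.e. zooms in.
                    let proposed = Double(lengthAtGestureStart!) / Double(scale)
                    visibleLength = min(max(Int(proposed), 10), samples.count)
                }
                .onEnded { _ in lengthAtGestureStart = nil }
        )
    }
}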
Accessibility
Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.
Posts under Accessibility tag
122 Posts
I have an Electron app on macOS Sonoma (arm64 arch). It has a native addon (app.node) using node-addon-api. Recently it crashed, with the stack trace (given below). What is the AXSerializeCFType function inside the AXUIElementCopyAttributeValue function, and why did it crash there?
AXUIElementRef systemWideElement = AXUIElementCreateSystemWide();
AXUIElementRef focusedApp = NULL;
AXError error = AXUIElementCopyAttributeValue(systemWideElement, kAXFocusedApplicationAttribute, (CFTypeRef *)&focusedApp);
The crash appears to occur on the last line of the code, where I retrieve the focused app's AXUIElementRef using AXUIElementCopyAttributeValue. I have already tried manually setting systemWideElement to NULL; in that case AXUIElementCopyAttributeValue does not crash, it just returns the error kAXErrorIllegalArgument.
OS Version: macOS 14.5 (23F79)
Report Version: 104
Crashed Thread: 344454
Application Specific Information:
Fatal Error: EXC_BAD_ACCESS / KERN_INVALID_ADDRESS / 0x102674000
Thread 344454 Crashed:
0 HIServices 0x18cb5d970 AXSerializeCFType
1 HIServices 0x18cb7ca24 serializeWrapper
2 HIServices 0x18cb7cd40 _AXXMIGCopyAttributeValue
3 HIServices 0x18cb74884 _AXUIElementCopyAttributeValue
4 HIServices 0x18cb74a04 AXUIElementCopyAttributeValue
5 HIServices 0x18cb747fc _AXUIElementCopyAttributeValue
6 HIServices 0x18cb74a04 AXUIElementCopyAttributeValue
7 app.node 0x1027a56f4 getFocusedApplication
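For reference, here is a defensive Swift sketch of the same call (the wrapper function name is mine; it assumes the process has already been granted Accessibility trust). It only illustrates checking the AXError and the returned reference before use:

import ApplicationServices

// Returns the focused application's AXUIElement, or nil if the call fails.
func copyFocusedApplication() -> AXUIElement? {
    let systemWideElement = AXUIElementCreateSystemWide()
    var focusedApp: CFTypeRef?
    let error = AXUIElementCopyAttributeValue(systemWideElement,
                                              kAXFocusedApplicationAttribute as CFString,
                                              &focusedApp)
    guard error == .success, let value = focusedApp else {
        // kAXErrorIllegalArgument, kAXErrorCannotComplete, etc. land here.
        return nil
    }
    return (value as! AXUIElement)
}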
Just installed iOS 18 Beta 3.
I am seeing AccessibilityUIServer using the microphone, and this is causing no notification sounds, an inability to use Siri by voice, and a grayed-out volume control.
If I start to play anything with sound, AccessibilityUIServer releases the microphone and I am able to use the app.
Calls still work, since AccessibilityUIServer will release the microphone and the phone will ring.
Feedback ID is FB14241838.
I am in the process of adding my company's brand font to our SwiftUI app. I am able to implement the font using the provided public APIs so that text styles / dynamic type and the font weight modifier in SwiftUI work correctly.
However, we are unable to implement the custom font in such a way that text styles / dynamic type, the font weight modifier, and the bold text accessibility setting all work at the same time. Am I missing an implementation detail so that all these features work correctly?
Font Setup
The font files were modified to better support SwiftUI:
The font style name metadata was modified to match the name the .fontWeight(...) modifier expects. This was done with Typelight.
The font weight value (100/200/300) was modified so that the underlying weight value matches the value the .fontWeight(...) modifier expects. See "Using custom fonts with SwiftUI" by Matthew Flint.
The font files were imported via the Info.plist.
Examples
Font Weight Comparison
San Francisco:
Text("#100")
    .font(.largeTitle)
    .fontWeight(.ultraLight)
Overpass by Name:
Text("#100")
    .font(.custom("Overpass-UltraLight", size: 34, relativeTo: .largeTitle))
Overpass by Weight:
Text("#100")
    .fontWeight(.ultraLight)
    .font(.custom("Overpass", size: 34, relativeTo: .largeTitle))
Legibility Weight Test
When using the .fontWeight(...) modifier, the custom font does not change weights when the bold text accessibility setting is enabled. Dynamic type size works as expected.
Normal legibility weight:
Bold legibility weight:
Dynamic Type Size:
Use UIFont
Using UIFont to load the custom font files and initializing a Font with the UIFont breaks dynamic type:
Bold type also does not work:
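For reference, a rough reconstruction of the UIFont-based approach described above (the font name and size are taken from the earlier examples; this is a sketch, not the exact code used):

import SwiftUI
import UIKit
import CoreText

// Wrapping a UIFont in a SwiftUI Font produces a fixed-size font, so it does
// not scale with Dynamic Type and does not react to the bold text setting.
struct UIFontExample: View {
    var body: some View {
        let uiFont = UIFont(name: "Overpass-UltraLight", size: 34) ?? .systemFont(ofSize: 34)
        Text("#100")
            .font(Font(uiFont as CTFont))
    }
}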
Custom Modifier
Creating a custom modifier allows us to support dynamic type and manually handle bold text. However, it creates an API that conflicts with SwiftUI's .fontWeight(...) modifier.
struct FontModifier: ViewModifier {
    enum UseCase {
        case paragraph
        case headline
    }

    enum Weight {
        case light
        case regular
        case heavy
    }

    @Environment(\.legibilityWeight) var legibilityWeight

    var useCase: UseCase
    var weight: Weight

    init(_ useCase: UseCase, _ weight: Weight) {
        self.useCase = useCase
        self.weight = weight
    }

    var resolvedHeadlineWeight: String {
        let resolvedLegibilityWeight = legibilityWeight ?? .regular
        switch weight {
        case .light:
            switch resolvedLegibilityWeight {
            case .regular:
                return "Light"
            case .bold:
                return "Semibold"
            @unknown default:
                return "Light"
            }
        case .regular:
            switch resolvedLegibilityWeight {
            case .regular:
                return "Regular"
            case .bold:
                return "Bold"
            @unknown default:
                return "Regular"
            }
        case .heavy:
            switch resolvedLegibilityWeight {
            case .regular:
                return "Heavy"
            case .bold:
                return "Black"
            @unknown default:
                return "Heavy"
            }
        }
    }

    var resolvedParagraphWeight: Font.Weight {
        switch weight {
        case .light:
            return .light
        case .regular:
            return .regular
        case .heavy:
            return .heavy
        }
    }

    var resolvedFont: Font {
        switch useCase {
        case .paragraph:
            return .system(.largeTitle).weight(resolvedParagraphWeight)
        case .headline:
            return .custom("Overpass-\(resolvedHeadlineWeight)", size: 34, relativeTo: .largeTitle)
        }
    }

    func body(content: Content) -> some View {
        content
            .font(resolvedFont)
    }
}
GridRow {
    Text("Aa")
        .modifier(FontModifier(.paragraph, .regular))
    Text("Aa")
        .modifier(FontModifier(.paragraph, .heavy))
    Text("Aa")
        .modifier(FontModifier(.headline, .regular))
    Text("Aa")
        .modifier(FontModifier(.headline, .heavy))
}
Font Environment Value
The font environment value does not contain font weight information when the .fontWeight(...) modifier is used:
struct DumpFontTest: View {
    @Environment(\.font) var font

    var body: some View {
        Text("San Francisco")
            .onAppear {
                print("------------")
                dump(font)
            }
    }
}
It would greatly benefit the entire community if there were a "Once" button under the "Automation" section that is part of the Shortcuts app.
Currently there are 3 selections: Daily, Weekly, Monthly.
Please add a 4th selection: "Once".
I'm asking because I was told that I could do this here.
Thanks to anyone who supports this, as I feel it would benefit us immensely.
I am using WKWebView and loading HTML content containing a couple of anchor (<a>) tags. However, I do not see any numbers for the links when I say "Show Numbers" with Voice Control turned on.
Any help would be appreciated. I am testing on iOS 17.4.1.
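For context, the setup is roughly the following (the markup and URLs are placeholders):

import UIKit
import WebKit

// Minimal reproduction sketch: load HTML containing plain anchor tags, then
// turn on Voice Control and say "Show Numbers" to see whether the links get
// numbered overlays.
final class LinkTestViewController: UIViewController {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        view.addSubview(webView)

        let html = """
        <html><body>
          <a href="https://example.com/one">First link</a><br/>
          <a href="https://example.com/two">Second link</a>
        </body></html>
        """
        webView.loadHTMLString(html, baseURL: nil)
    }
}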
I'm developing a macOS application called Blurt, which aims to provide enhanced notification management on macOS. The core functionality I'm trying to implement is the ability to intercept and display notifications from various applications in a custom interface.
Current implementation:
Using AppDelegate to handle application lifecycle
Implemented UNUserNotificationCenterDelegate for handling notifications
Created a custom NotificationService extension
Challenges:
Unable to intercept notifications from other applications
System notifications are not being captured by our app
What I've tried:
Using DistributedNotificationCenter to observe system-wide notifications (a sketch of this attempt follows the list below)
Implementing a Notification Service Extension
Exploring NSWorkspace notifications
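For reference, the DistributedNotificationCenter attempt mentioned in the list above looks roughly like this (the observer setup is illustrative; sandboxed apps receive very few distributed notifications, and other apps' user notifications are not delivered through this channel):

import Foundation

// Observe whatever distributed notifications the system is willing to deliver.
let center = DistributedNotificationCenter.default()
let token = center.addObserver(forName: nil,   // nil = any notification name
                               object: nil,
                               queue: .main) { note in
    print("Distributed notification:", note.name.rawValue)
}
// Later, when no longer needed:
// center.removeObserver(token)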
Current roadblocks:
Apple's sandboxing and security model seems to prevent direct access to other apps' notifications
Unable to find a sanctioned API for system-wide notification interception
Questions:
Is there a recommended approach to creating a centralized notification management system within Apple's guidelines?
Are there any specific system notifications or events we can legally subscribe to that might help achieve similar functionality?
How do other notification management apps (if any exist) handle this limitation?
Are there any upcoming APIs or features in macOS that might address this use case?
I'm open to alternative approaches or pivoting the app's functionality if necessary. Any insights, suggestions, or resources would be greatly appreciated.
Thank you in advance for your help!
I have added a UI test that uses the newish app.performAccessibilityAudit() on each of the screens in my SwiftUI app.
I get a handful of test failures for various accessibility issues. Some of them are related to iOS issues (like the "legal" button in a map view not having a big enough tappable area; and, I think, there is a contrast issue with selected tabs in a tab bar, or maybe it is a false positive).
Two of the error types I get are around Text elements that use default font styles (like headline) but that apparently get clipped at times, and apparently "Dynamic Type font sizes are unsupported".
For some of the errors it supplies a screenshot of the whole screen along with an image of the element that is triggering the error. For these text errors it only provides the overall image. After looking at the video I saw that part of the test runs the text size all the way up and I could see that at the largest size or two the text was getting cut off despite using default settings for the frames on all the Text elements.
After a bit of trial and error I managed to get the text to wrap properly even at the largest of sizes and most of the related errors went away running on an iPhone simulator. I switched to an iPad simulator and the errors re-appeared even though the video doesn't show the text getting clipped.
Does anyone have any tips for actually fixing these issues? Or am I in false positive land here? Also, is there any way to get more specific info out of the test failures? Looking at the entire screen and trying to guess where the issue is kinda sucks.
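For context, the audit call in the tests looks roughly like this (restricting the audit types and the ignore logic shown here are illustrative, not the exact test code):

import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAuditMainScreen() throws {
        let app = XCUIApplication()
        app.launch()

        // Limit the audit to the issue types discussed above. Returning true
        // from the handler tells XCTest to ignore that issue (useful for
        // suspected false positives).
        try app.performAccessibilityAudit(for: [.dynamicType, .textClipped]) { issue in
            print(issue.compactDescription, issue.element as Any)
            return false
        }
    }
}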
If "send and receive" on imessages has my Microsoft work email address, can my imessage content be synched with Microsoft ?
Because I recently logged into our big data system through SSH client using my work email address, on a Console I saw one of my imessage thread printed.
I reached Microsoft and their reply was to get advise reaching out to Apple support as their knowledge of iMessage is limited. Based on general knowledge about data protection the messages are most likely not synced unless there is a setting that you can allow from my side.
PLEASE HELP ! This is a Mystery !!
Hello, I'm trying to leverage Personal Voice to read a phrase in my iOS application. My implementation works correctly on an iPhone 15, but does not work when I run the iOS application on an M2 MacBook Air.
Here are some snippets from my implementation
// This is how I request Personal Voice
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    if status == .authorized {
        var personalVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.voiceTraits.contains(.isPersonalVoice) }
    }
}

// This is how I'm attempting to read
let utterance = AVSpeechUtterance(string: textToRead)
if let voice = personalVoices.first {
    utterance.voice = voice
}
var synthesizer = AVSpeechSynthesizer()
synthesizer.speak(utterance)
I get the following error messages when I try this:
Cannot use AVSpeechSynthesizerBufferCallback with Personal Voices, defaulting to output channel.
Caller does not have kTCCServiceVoiceBanking access to personal voices. No speech will be generated
Voice not allowed to render speech! Will not set up synthesizer. Bailing now
Any suggestions on how to mitigate this issue?
Hey Apple,
We (our customer support teams) have received feedback from customers complaining that their hearing devices (hearing aids) appear to be connected to MFi and OS controls are working, but the audio stream isn't working, and the app is unable to resolve a connection to the device via CBCentralManager.retrieveConnectedPeripherals(withServices:).
The issues appear to disappear once the user unpairs and re-pairs the hearing devices in the Accessibility > Hearing Devices options (they might also need to uninstall and reinstall the app, as it gets stuck in an invalid state), but we are unable to replicate this issue in our environments given that it is intermittent, and once we have upgraded a device to iOS 17.5.1 we don't have a mechanism to revert it to an earlier version.
So far, we have received about 30 reports for the past 2 weeks. Most of our customers complain about the app not connecting to the devices, but the fact the audio stream is not happening could hint to a deeper problem than our app.
Are you guys aware of a problem affecting MFi hearing devices restoring after the OS upgrade process?
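For reference, the connection check on our side is roughly the following (the service UUID below is a placeholder, not the real hearing-device service):

import CoreBluetooth

final class HearingDeviceScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager?
    private let serviceUUID = CBUUID(string: "0000FDF0-0000-1000-8000-00805F9B34FB") // placeholder

    func start() {
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // retrieveConnectedPeripherals(withServices:) only returns useful
        // results once the central manager reports .poweredOn.
        guard central.state == .poweredOn else { return }
        let connected = central.retrieveConnectedPeripherals(withServices: [serviceUUID])
        print("System-connected peripherals exposing the service:", connected)
    }
}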
Hello everyone,
I'm currently working on an iOS application and I want to ensure it is as accessible as possible for all users, including those with disabilities. While I have a basic understanding of iOS accessibility features, I'm looking to deepen my knowledge and apply best practices comprehensively.
Specifically, I would like to know:
VoiceOver Optimization: What are the best practices for ensuring all UI elements are properly labeled and navigable with VoiceOver?
Color Contrast and Visual Design: How can I effectively check and adjust color contrast ratios to accommodate users with visual impairments?
Dynamic Type and Font Sizes: What are the recommended approaches for supporting Dynamic Type, and how can I ensure a consistent and readable layout across different text sizes?
Accessibility in Custom UI Elements: How can I make custom controls (e.g., custom buttons, sliders) accessible, especially when they do not follow standard UIKit components? (A small sketch follows this list.)
Testing Tools and Techniques: Which tools and techniques are most effective for QA to test accessibility on iOS?
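Regarding the custom-controls question above, here is a small sketch (the control and its labels are invented for illustration): a custom star-rating view exposed to VoiceOver as a single adjustable element with a label and a value.

import SwiftUI

struct RatingControl: View {
    @State private var rating = 3

    var body: some View {
        HStack {
            ForEach(1...5, id: \.self) { index in
                Image(systemName: index <= rating ? "star.fill" : "star")
                    .onTapGesture { rating = index }
            }
        }
        .accessibilityElement(children: .ignore)   // present the row as one control
        .accessibilityLabel("Rating")
        .accessibilityValue("\(rating) of 5")
        .accessibilityAdjustableAction { direction in
            // VoiceOver's swipe up/down adjusts the value.
            switch direction {
            case .increment: rating = min(rating + 1, 5)
            case .decrement: rating = max(rating - 1, 1)
            @unknown default: break
            }
        }
    }
}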
Thank you in advance for your insights and advice.
Best regards,
Ale
I'd like to use the eye tracking feature in the latest iPadOS 18 update as more than an accessibility feature. i.e. another input modality that can be detected by event + enum checks similar to how we can detect and distinguish between touches and Apple pencil inputs. This might make it a lot easier to control and interact with iPad-based AR experiences that involve walking around, regardless of whether eye-tracking is enabled for accessibility. When walking, it's challenging to hold the device and interact with the screen with touch or pencil at all. Eye tracking + speech as input modalities could assist here.
Also, this would help us create non-immersive AR experiences that parallel visionOS experiences that use eye tracking.
I propose an API option for enabling eye tracking (and an optional calibration dialogue within the app), as well as a specific UIControl class that simply detects when the eye looks at the control, using the standard (begin/changed/end) events.
My specific use case is that I'd like to treat eye-tracking-enabled UI elements or game objects differently depending on whether something is looked at with the eyes.
For example, to select game objects while using speech recognition, suppose we have 4 buttons with the same name in 4 corners of the screen. Call them "corner" buttons. If I have my proposed invisible UI element for gaze detection, I can create 4 large rectangular regions on the screen. Then if the user says "select the corner" the system could parse this command and disambiguate between the 4 corners by checking which of the rectangular regions I'm currently looking at. (Note: the idea would be to make the gaze regions rather large to compensate for error.)
The above is just a simple example, but the advantage over other methods like dwell is that it could be a lot faster.
Another simple example:
Using the same rectangular regions, instead of speech input, I could hold a button placed in just one spot on the screen and look around the screen with my gaze to produce a laser beam for some kind of game, or draw curves (which I might smooth out to reduce inaccuracy). This could also help someone who does not have their hands available.
This would require us to have the ability to get the coordinates of the eye gaze, but otherwise the other approach of just opting to trigger UIControl elements might work for coarse selection.
Would other developers find this useful as well? I'd like to propose this feature in feedback assistant, but I'm also opening-up a little discussion if by chance someone sees this.
In short, I propose:
a formal eye-tracking API for iPadOS 18+ that allows for turning on/off the tracking within the app, with the necessary user permissions
the API should produce begin/changed/ended events similar to the existing events in UIKit, including screen coordinates. There should be a way to identify that an event came from eye-tracking.
alternatively, we should have at minimum an invisible UIControl subclass that can detect when the eyes enter/leave the region.
I build apps that act as screen readers to 1) add Vim motions everywhere on macOS, 2) click (and more) AX elements through the keyboard, and 3) scroll through the keyboard. This works extremely well with native apps. With non-native apps, I need to blast them with some extra AX attributes (AXManualAccessibility, AXEnhancedUserInterface) to get them to expose their AX elements. But there are a couple of apps which I can't get to expose their AX elements programmatically. The weird thing is, as soon as I start VoiceOver, those apps open up. For some, if I use the Accessibility Inspector to go through their AX elements, they start opening up too. So I'm wondering: is there a publicly known way that I'm missing to open up those apps, or is Apple using private APIs? Is there any way I could make my apps behave like VoiceOver or the Accessibility Inspector to force those recalcitrant apps to open up?
Thanks in advance.
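For context, this is roughly how I set those attributes today (the wrapper function is mine, and it assumes the process is already trusted for Accessibility):

import ApplicationServices

func enableAccessibility(for pid: pid_t) {
    let appElement = AXUIElementCreateApplication(pid)
    // These are the attributes Electron/Chromium-based apps check before
    // exposing their AX tree.
    AXUIElementSetAttributeValue(appElement, "AXManualAccessibility" as CFString, kCFBooleanTrue)
    AXUIElementSetAttributeValue(appElement, "AXEnhancedUserInterface" as CFString, kCFBooleanTrue)
}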
Hi, I've run into an issue with accessibility in SwiftUI. Has anyone seen this before?
I set an accessibilityValue on the button, but when focus moves to the button, VoiceOver always says "1 update item". I want VoiceOver to say the value I set. How can I achieve this?
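Here is a minimal sketch of the setup (the button title and the value string are placeholders):

import SwiftUI

struct UpdateButton: View {
    var body: some View {
        Button("Updates") {
            // handle tap
        }
        .accessibilityLabel("Updates")
        .accessibilityValue("3 new items")   // expected: VoiceOver reads this value
    }
}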
I have a List in a NavigationStack and the list items have swipeActions
struct ListItem: Identifiable {
    var id: String { return name }
    let name: String
}

struct ContentView: View {
    @Environment(\.editMode) var editMode
    @State var listItems = [
        ListItem(name: "Cow"),
        ListItem(name: "Duck"),
        ListItem(name: "Chicken")
    ]

    var body: some View {
        NavigationStack {
            List(listItems) { listItem in
                NavigationLink {
                    Text(listItem.name)
                } label: {
                    Text(listItem.name)
                }
                .swipeActions(edge: .trailing) {
                    Button(action: { }, label: { Text("Edit") }).tint(.blue)
                    Button(role: .destructive, action: {}, label: { Text("Delete") })
                }
            }
            .navigationTitle("Animals")
            .navigationBarItems(trailing: EditButton())
        }
    }
}
For users who might have problems with the swipe gesture, I would like to have the Swipe Actions display when the list is in editMode.
Is there a programmatic way to show the swipeActions?
I have also tried using ForEach with an onDelete modifier. Then if you hit the edit button, it shows a circular button on the left for each row. If you click the circular button then it will show the swipe actions but I don't want the extra step of pressing the circular button.
The triple-click accessibility shortcut is assigned to trigger the AssistiveTouch icon; this works as intended only very rarely and instead triggers the shutdown screen.
I have tried assigning different shortcuts, but the shutdown screen is always triggered.
On restarting the phone, it seems to work a few more times as intended, but then reverts to the bugged state again.
I created a feedback report (FB13459492) for this in December 2023 but haven't received any reply. I updated it recently with a screen recording.
I have an app developed using ElectronJS that requires Accessibility permission to monitor mouse and keyboard events through the iohook package. I want to publish it on the Mac App Store, but it seems that:
The Mac App Store mandates Sandboxing, and
Sandboxing prohibits Accessibility permission.
As a result, it seems that an app on the Mac App Store cannot obtain Accessibility permission.
Can someone confirm if this is accurate or if there's a workaround?
Hello, I have an issue with "shortcut keys" that I have added to a view controller. When I navigate to the view controller, I cannot use the shortcut keys right away; I first need to tap anywhere on the screen before I can use the shortcut.
I use this code:
- (void)viewDidLoad {
    [super viewDidLoad];

    UIKeyCommand *commandA = [UIKeyCommand keyCommandWithInput:@"a"
                                                 modifierFlags:UIKeyModifierCommand
                                                        action:@selector(handleCommand:)];
    [self addKeyCommand:commandA];
}

- (void)handleCommand:(UIKeyCommand *)keyCommand {
    // Hangup call.
    [self btnIncomingAnswerPressed:NULL];
}
I am not sure why this happens, but it would be really good if a user could use a shortcut key as soon as they navigate to the view controller, instead of first having to tap anywhere in the app.
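One possibly relevant detail, sketched below in Swift (the class and method names are illustrative): UIKeyCommand events are delivered along the responder chain, so the view controller may need to become first responder when it appears before its key commands are reachable.

import UIKit

class ShortcutViewController: UIViewController {
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidLoad() {
        super.viewDidLoad()
        let commandA = UIKeyCommand(input: "a",
                                    modifierFlags: .command,
                                    action: #selector(handleCommand(_:)))
        addKeyCommand(commandA)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()   // make key commands available without tapping first
    }

    @objc private func handleCommand(_ keyCommand: UIKeyCommand) {
        // Mirrors the Objective-C handler above.
    }
}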
Hi there, folks! Hope you are well.
We want our two applications to listen for connection and disconnection notifications from the External Accessory framework when we connect a device over USB.
These applications run in the background, and we need to know when the device connects to and disconnects from USB. For this, we have configured our apps to listen for local notifications as follows:
EAAccessoryManager.shared().registerForLocalNotifications()
NotificationCenter.default.addObserver(self, selector: #selector(didConnectAccessory(_:)), name: Notification.Name.EAAccessoryDidConnect, object: nil)
NotificationCenter.default.addObserver(self, selector: #selector(didDisconnectAccessory(_:)), name: Notification.Name.EAAccessoryDidDisconnect, object: nil)
When the apps are running in the foreground, everything works correctly. On the other hand, when the apps are running in the background with the External Accessory background mode enabled, the connection or disconnection event is not delivered to the applications, so they cannot initialize correctly.
We attach traces of the two applications running in the background during a connection event over USB.
In the first application, the notification event is received correctly.
In the second application, the notification event is NOT received correctly.
We would like to investigate with you what may be happening.
We have opened a case in Feedback Assistant (FB13800710) so you can investigate further. We are also opening the case here so that other people can collaborate with us in depth. Thank you so much.
Looking forward to hearing from your side.
All the best!