Accessibility

Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.

Posts under Accessibility tag

133 Posts
(After upgrading to iOS 17.5.1) MFi hearing devices appeared to be paired, but app is unable to resolve the connection to the peripheral
Hey Apple, we (our customer support teams) have received feedback from customers complaining that their hearing devices (hearing aids) appear to be connected to MFi and the OS controls are working, but the audio stream isn't working, and the app is unable to resolve a connection to the device via CBCentralManager.retrieveConnectedPeripherals(withServices:).

The issues appear to disappear once the user unpairs and re-pairs the hearing devices in the Accessibility > Hearing Devices options (they might also need to uninstall and reinstall the app, as it gets stuck in an invalid state). However, we are unable to replicate the issue in our environments given that it is intermittent, and once we have upgraded a device to iOS 17.5.1 we have no mechanism to revert it to an earlier version. So far we have received about 30 reports over the past 2 weeks.

Most of our customers complain about the app not connecting to the devices, but the fact that the audio stream is not working could hint at a problem deeper than our app. Are you aware of a problem affecting MFi hearing devices after the OS upgrade process?
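For reference, a minimal sketch of the lookup described above; the service UUID here is a hypothetical placeholder, since the real one depends on the accessory:

```swift
import CoreBluetooth

// Minimal sketch of the connection lookup the post describes. The service
// UUID is a hypothetical placeholder; the real UUID depends on the accessory.
final class HearingDeviceLocator: NSObject, CBCentralManagerDelegate {
    private let serviceUUID = CBUUID(string: "FDF0") // hypothetical
    private lazy var central = CBCentralManager(delegate: self, queue: nil)

    func start() { _ = central } // force lazy creation, which triggers the state callback

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Per the reports above, this returns nothing after the 17.5.1
        // upgrade even though the device shows as paired in the OS controls.
        let peripherals = central.retrieveConnectedPeripherals(withServices: [serviceUUID])
        print("Resolved peripherals: \(peripherals)")
    }
}
```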
Replies: 1 · Boosts: 0 · Views: 126 · Activity: 1d
Seeking Best Practices for Enhancing Accessibility in iOS Applications
Hello everyone, I'm currently working on an iOS application and I want to ensure it is as accessible as possible for all users, including those with disabilities. While I have a basic understanding of iOS accessibility features, I'm looking to deepen my knowledge and apply best practices comprehensively. Specifically, I would like to know:

- VoiceOver optimization: What are the best practices for ensuring all UI elements are properly labeled and navigable with VoiceOver?
- Color contrast and visual design: How can I effectively check and adjust color contrast ratios to accommodate users with visual impairments?
- Dynamic Type and font sizes: What are the recommended approaches for supporting Dynamic Type, and how can I ensure a consistent and readable layout across different text sizes?
- Accessibility in custom UI elements: How can I make custom controls (e.g., custom buttons, sliders) accessible, especially when they do not follow standard UIKit components?
- Testing tools and techniques: Which tools and techniques are most effective for QA to test accessibility on iOS?

Thank you in advance for your insights and advice. Best regards, Ale
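On the custom-controls question in particular, here is a minimal sketch of the usual UIKit pattern; the control name and values are made up for illustration:

```swift
import UIKit

// Sketch of making a custom control visible to VoiceOver: give it a label,
// an adjustable trait, and a spoken value. "VolumeSlider" is hypothetical.
final class VolumeSlider: UIControl {
    var volume: Float = 0.5 {
        didSet { accessibilityValue = "\(Int(volume * 100)) percent" }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Volume"
        accessibilityTraits = .adjustable
        accessibilityValue = "50 percent"
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // With the .adjustable trait, VoiceOver swipe up/down maps to these.
    override func accessibilityIncrement() { volume = min(volume + 0.1, 1) }
    override func accessibilityDecrement() { volume = max(volume - 0.1, 0) }
}
```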
Replies: 1 · Boosts: 0 · Views: 157 · Activity: 4d
Eye Tracking as a General API Feature for iPadOS 18+?
I'd like to use the eye tracking feature in the latest iPadOS 18 update as more than an accessibility feature, i.e. as another input modality that can be detected by event + enum checks, similar to how we can detect and distinguish between touch and Apple Pencil inputs. This might make it a lot easier to control and interact with iPad-based AR experiences that involve walking around, regardless of whether eye tracking is enabled for accessibility. When walking, it's challenging to hold the device and interact with the screen with touch or pencil at all; eye tracking plus speech as input modalities could assist here. It would also help us create non-immersive AR experiences that parallel visionOS experiences using eye tracking.

I propose an API option for enabling eye tracking (and an optional calibration dialogue within the app), as well as a specific UIControl class that simply detects when the eye looks at the control using the standard (begin/changed/end) events. My specific use case is that I'd like to treat eye-tracking-enabled UI elements or game objects differently depending on whether something is looked at with the eyes.

For example, to select game objects while using speech recognition, suppose we have 4 buttons with the same name in the 4 corners of the screen; call them "corner" buttons. With my proposed invisible UI element for gaze detection, I can create 4 large rectangular regions on the screen. Then, if the user says "select the corner", the system could parse this command and disambiguate between the 4 corners by checking which of the rectangular regions I'm currently looking at. (Note: the idea would be to make the gaze regions rather large to compensate for error.) This is just a simple example, but the advantage over other methods like dwell is that it could be a lot faster.

Another simple example: using the same rectangular regions, instead of speech input, I could hold a button placed in just one spot on the screen and look around the screen with my gaze to produce a laser beam for some kind of game, or draw curves (which I might smooth out to reduce inaccuracy). This could also serve someone who does not have their hands available. It would require the ability to get the coordinates of the eye gaze, but otherwise the other approach of just opting to trigger UIControl elements might work for coarse selection.

Would other developers find this useful as well? I'd like to propose this feature in Feedback Assistant, but I'm also opening up a little discussion here in case someone sees this. In short, I propose:

- a formal eye-tracking API for iPadOS 18+ that allows turning tracking on/off within the app, with the necessary user permissions;
- begin/changed/ended events similar to the existing events in UIKit, including screen coordinates, with a way to identify that an event came from eye tracking;
- alternatively, at minimum an invisible UIControl subclass that can detect when the eyes enter/leave its region (a hypothetical sketch of this shape follows below).
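To make the proposal concrete, here is a purely hypothetical sketch of the API shape being suggested. GazeRegionControl does not exist in UIKit, and the event mapping is invented for illustration only:

```swift
import UIKit

// Hypothetical: a proposed invisible control that would fire standard
// UIControl events as the user's gaze enters, moves within, and leaves its
// frame. No such class exists today; this only illustrates the proposal.
final class GazeRegionControl: UIControl { /* proposed system-provided class */ }

// Proposed usage: four large corner regions to disambiguate the spoken
// command "select the corner" from the example above.
func installCornerGazeRegions(in view: UIView) {
    let b = view.bounds
    let w = b.width / 2, h = b.height / 2
    let corners = [CGRect(x: 0, y: 0, width: w, height: h),
                   CGRect(x: w, y: 0, width: w, height: h),
                   CGRect(x: 0, y: h, width: w, height: h),
                   CGRect(x: w, y: h, width: w, height: h)]
    for frame in corners {
        let region = GazeRegionControl(frame: frame)
        region.addAction(UIAction { _ in
            // Proposed: gaze entered this region (a "begin"-style event).
            print("gaze began in \(frame)")
        }, for: .touchDown) // hypothetical mapping of gaze-begin to .touchDown
        view.addSubview(region)
    }
}
```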
Replies: 0 · Boosts: 1 · Views: 237 · Activity: 5d
AX Elements in some apps only exposed when using VoiceOver or Accessibility Inspector
I build apps that act as screen readers to 1) add Vim motions everywhere on macOS, 2) click (and more) AX Elements through the keyboard, and 3) scroll through the keyboard. This works extremely well with native apps. With non-native apps, I need to blast them with some extra AX attributes (AXManualAccessibility, AXEnhancedUserInterface) to get them to expose their AX Elements. But there are a couple of apps whose AX Elements I can't expose programmatically. The weird thing is that as soon as I start VoiceOver, those apps open up; for some, stepping through their AX Elements with the Accessibility Inspector also makes them open up. So I'm wondering: is there a publicly known way that I'm missing to open up those apps, or is Apple using private APIs? Is there any way I could make my apps behave like VoiceOver or the Accessibility Inspector to force those recalcitrant apps to open up? Thanks in advance.
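For context, a minimal sketch of the attribute-setting described above, using the public AXUIElement API (the target pid is assumed to be known):

```swift
import ApplicationServices

// Sketch of setting the extra AX attributes the post mentions on a target
// app. Requires the calling app to have Accessibility permission.
func exposeAccessibility(pid: pid_t) {
    let app = AXUIElementCreateApplication(pid)
    // Electron/Chromium-based apps respond to AXManualAccessibility;
    // some others look at AXEnhancedUserInterface instead.
    AXUIElementSetAttributeValue(app, "AXManualAccessibility" as CFString, kCFBooleanTrue)
    AXUIElementSetAttributeValue(app, "AXEnhancedUserInterface" as CFString, kCFBooleanTrue)
}
```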
Replies: 1 · Boosts: 1 · Views: 108 · Activity: 1w
Show `swipeActions` in `editMode`
I have a List in a NavigationStack, and the list items have swipeActions:

```swift
struct ListItem: Identifiable {
    var id: String { return name }
    let name: String
}

struct ContentView: View {
    @Environment(\.editMode) var editMode
    @State var listItems = [
        ListItem(name: "Cow"),
        ListItem(name: "Duck"),
        ListItem(name: "Chicken")
    ]

    var body: some View {
        NavigationStack {
            List(listItems) { listItem in
                NavigationLink {
                    Text(listItem.name)
                } label: {
                    Text(listItem.name)
                }
                .swipeActions(edge: .trailing) {
                    Button(action: {}, label: { Text("Edit") }).tint(.blue)
                    Button(role: .destructive, action: {}, label: { Text("Delete") })
                }
            }
            .navigationTitle("Animals")
            .navigationBarItems(trailing: EditButton())
        }
    }
}
```

For users who might have problems with the swipe gesture, I would like the swipe actions to be displayed when the list is in editMode. Is there a programmatic way to show the swipeActions? I have also tried using ForEach with an onDelete modifier. Then, if you hit the edit button, it shows a circular button on the left of each row; clicking the circular button shows the swipe actions, but I don't want the extra step of pressing the circular button.
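A sketch of one possible workaround, assuming the ListItem type from the post: as far as I know there is no public API to reveal swipe actions programmatically, so one option is to render the same actions inline whenever edit mode is active.

```swift
import SwiftUI

// Workaround sketch (not a confirmed API answer): show the row's actions
// as ordinary inline buttons while the list is in edit mode.
struct EditableRow: View {
    @Environment(\.editMode) private var editMode
    let listItem: ListItem

    var body: some View {
        HStack {
            Text(listItem.name)
            if editMode?.wrappedValue == .active {
                Spacer()
                Button("Edit") { /* edit action */ }.tint(.blue)
                Button("Delete", role: .destructive) { /* delete action */ }
            }
        }
    }
}
```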
Replies: 1 · Boosts: 0 · Views: 67 · Activity: 1w
Triple Click Accessibility Shortcut triggers the Shut Down Screen
The triple-click accessibility shortcut is assigned to trigger the AssistiveTouch icon. It works as intended only rarely; it usually triggers the shut down screen instead. I have tried assigning different shortcuts, but the shut down screen is always triggered. After restarting the phone it seems to work a few more times as intended, but then reverts to the bugged state. I created a feedback report (FB13459492) for this in December 2023 but haven't received any reply; I updated it recently with a screen recording.
Replies: 1 · Boosts: 0 · Views: 285 · Activity: 1w
Is it possible for an app in the Mac App Store to obtain Accessibility permissions?
I have an app developed using ElectronJS that requires Accessibility permission to monitor mouse and keyboard events through the iohook package. I want to publish it on the Mac App Store, but it seems that the Mac App Store mandates sandboxing, and sandboxing prohibits Accessibility permission. As a result, it appears that an app on the Mac App Store cannot obtain Accessibility permission. Can someone confirm whether this is accurate, or whether there's a workaround?
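For reference, here is the standard way a non-sandboxed app checks for and prompts for this permission; whether any of this is usable under the App Store sandbox is exactly the question being asked:

```swift
import ApplicationServices

// Standard check used by non-sandboxed apps: returns true once the user has
// granted the app Accessibility access in System Settings, optionally
// showing the system prompt that deep-links there.
func isAccessibilityTrusted(promptIfNeeded: Bool) -> Bool {
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: promptIfNeeded] as CFDictionary
    return AXIsProcessTrustedWithOptions(options)
}
```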
Replies: 4 · Boosts: 1 · Views: 297 · Activity: 3w
Keyboard shortcuts on iPad cannot be used until the screen is pressed
Hello, I have an issue where I have added shortcut keys to a UIViewController. When navigating to the view controller, I cannot use the shortcut keys right away; I first need to press anywhere on the screen before a shortcut works. I use this code:

```objc
- (void)viewDidLoad {
    [super viewDidLoad];
    UIKeyCommand *commandA = [UIKeyCommand keyCommandWithInput:@"a"
                                                 modifierFlags:UIKeyModifierCommand
                                                        action:@selector(handleCommand:)];
    [self addKeyCommand:commandA];
}

- (void)handleCommand:(UIKeyCommand *)keyCommand {
    // Hang up call.
    [self btnIncomingAnswerPressed:NULL];
}
```

I am not sure why this happens, but it would be really good if a user could use a shortcut key as soon as they navigate to the view controller, instead of first having to press anywhere in the app.
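One thing worth checking, sketched below as an assumption rather than a confirmed fix: key commands are delivered along the responder chain, so the view controller may need to become first responder as soon as it appears.

```swift
import UIKit

// Sketch of one possible cause: until something in the chain is first
// responder, key commands may not reach the view controller.
class CallViewController: UIViewController {
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder() // without this, shortcuts may need a tap first
    }
}
```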
Replies: 2 · Boosts: 1 · Views: 318 · Activity: May ’24
App running in the background with External Accessory background mode enabled sometimes doesn't receive the EAAccessoryDidConnect or EAAccessoryDidDisconnect notification
Hi there, folks! Hope you are well.

We want our two applications to listen for connection and disconnection notifications from the External Accessory framework when we connect a device over USB. These applications run in the background, and we need to know when the device connects to and disconnects from USB. We have configured our apps to listen for the local notifications as follows:

```swift
EAAccessoryManager.shared().registerForLocalNotifications()
NotificationCenter.default.addObserver(self,
                                       selector: #selector(didConnectAccessory(_:)),
                                       name: Notification.Name.EAAccessoryDidConnect,
                                       object: nil)
NotificationCenter.default.addObserver(self,
                                       selector: #selector(didDisconnectAccessory(_:)),
                                       name: Notification.Name.EAAccessoryDidDisconnect,
                                       object: nil)
```

When the apps are running in the foreground, everything works correctly. However, when the apps are running in the background with the External Accessory background mode enabled, the disconnection or connection event is sometimes not delivered to the applications, causing them to fail to initialize correctly.

We attach traces of the two applications running in the background during a connection event over USB: in the first application the notification event is received correctly; in the second application it is NOT.

We would like to investigate with you what may be happening. We have opened a case in Feedback Assistant (FB13800710) so you can investigate further, and we open the case here so that other people can collaborate with us. Thank you so much, and looking forward to hearing from your side. All the best!
Replies: 3 · Boosts: 0 · Views: 284 · Activity: 2w
Accessibility Hint When There are Multiple Actions
Hi there -- I'm cleaning up the accessibility in our app and making sure it adheres to Apple's suggested guidelines. For accessibilityHint, Apple lists a couple of suggestions in the doc here: https://developer.apple.com/documentation/objectivec/nsobject/1615093-accessibilityhint. Notably, this one is one I'm having to change a lot in our app: "Don't include the action type in the hint. For example, don't create hints like 'Tap to play the song' or 'Tapping plays the song.'" However, we have some buttons that perform different actions based on a double or triple tap in VoiceOver, so our hint reads something like "Double tap to do X, triple tap to do Y". This violates the accessibilityHint guidelines, but I feel like changing it would mean the customer loses out on valuable information. What does Apple suggest we do in this case? Thanks in advance!
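One common alternative, sketched here as an assumption rather than Apple's stated answer: expose the extra gestures as custom actions, which VoiceOver enumerates via the rotor, instead of describing tap counts in the hint. The action names and the cell are hypothetical.

```swift
import UIKit

// Sketch: replace "double tap to X, triple tap to Y" hints with custom
// actions, so VoiceOver itself tells the user how to invoke them.
func configureSongCell(_ cell: UITableViewCell) {
    cell.accessibilityCustomActions = [
        UIAccessibilityCustomAction(name: "Play") { _ in
            // hypothetical play handler
            return true
        },
        UIAccessibilityCustomAction(name: "Add to queue") { _ in
            // hypothetical queue handler
            return true
        }
    ]
}
```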
Replies: 2 · Boosts: 0 · Views: 234 · Activity: May ’24
Is it possible to enable Dwell Control on visionOS within your app?
Hi everyone, I'm currently developing an application for visionOS and I'm interested in implementing Dwell Control to improve accessibility for users with limited mobility. Specifically, I would like to include a toggle within my app's interface that allows users to enable or disable Dwell Control at the app level. I've gone through the visionOS documentation and the general accessibility guidelines, but I couldn't find detailed information on how to programmatically enable or disable Dwell Control within an app. Here are my main questions:

- Is it possible to programmatically enable or disable Dwell Control from within a visionOS app?
- If so, what are the specific API calls or methods needed to achieve this functionality?
- Are there any best practices or additional resources for implementing Dwell Control in visionOS that you could point me to?

Thanks!
Replies: 1 · Boosts: 0 · Views: 288 · Activity: May ’24
Why doesn't SwiftUI accessibility support Simplified Chinese screen reading?
Testing the official sample code today, I found that screen reading works fine with English characters, but when using Chinese characters the Text and Button elements cannot be read correctly. It's important to address this issue to bring convenience to blind users. The official sample is at: https://developer.apple.com/documentation/swiftui/creating_accessible_views.
Replies: 2 · Boosts: 0 · Views: 219 · Activity: May ’24
Custom rotor is always playing end-of-list sound
I have a custom rotor that changes the skim speed of the skim forward/backward feature of my audio player. The rotor works, but it's always playing an end-of-list sound. Here is the code:

```swift
// Member variables
private let accessibilitySeekSpeeds: [Double] = [10, 30, 60, 180] // seconds
private var accessibilitySeekSpeedIndex: Int = 0

func seekSpeedRotor() -> UIAccessibilityCustomRotor {
    UIAccessibilityCustomRotor(name: "seek speed") { [weak self] predicate in
        guard let self = self else { return nil }
        let speeds = accessibilitySeekSpeeds
        switch predicate.searchDirection {
        case .previous:
            accessibilitySeekSpeedIndex = (accessibilitySeekSpeedIndex - 1 + speeds.count) % speeds.count
        case .next:
            accessibilitySeekSpeedIndex = (accessibilitySeekSpeedIndex + 1) % speeds.count
        @unknown default:
            break
        }

        // Return the currently selected speed as an accessibility element
        let accessibilityElement = UIAccessibilityElement(accessibilityContainer: self)
        let currentSpeed = localizedDuration(seconds: speeds[accessibilitySeekSpeedIndex])
        accessibilityElement.accessibilityLabel = currentSpeed + " seek speed"
        UIAccessibility.post(notification: .announcement, argument: currentSpeed + " seek speed")
        return UIAccessibilityCustomRotorItemResult(targetElement: accessibilityElement, targetRange: nil)
    }
}
```

The returned accessibility element isn't read out, and instead an end-of-list sound is played. I can announce the change manually using UIAccessibility.post, but it still plays the end-of-list sound. How can I prevent the rotor from playing the end-of-list sound?
Replies: 2 · Boosts: 0 · Views: 230 · Activity: May ’24
Voiceover with Indian voice
Hi, I am facing an issue where VoiceOver does not announce some labels if I select any Indian voice in Settings (Accessibility > VoiceOver > Speech > Voice > English (India)); it works for other countries' voices on the same labels. The caption panel shows the correct accessibility label, but it just isn't announced.
Replies: 2 · Boosts: 0 · Views: 272 · Activity: Apr ’24
Problem with Accessibility Inspector and a PySide6 app
Hi to all. I'm building a Mac application using Qt (PySide6). After updating Xcode to 15.3.0 (on Sonoma), the Accessibility Inspector stopped working correctly: it does not display the attributes of nested elements, only the title of the window. With Xcode 14.2 (on Monterey), everything worked correctly.

main.py:

```python
import sys

from PySide6.QtGui import QGuiApplication
from PySide6.QtQml import QQmlApplicationEngine

app = QGuiApplication(sys.argv)
engine = QQmlApplicationEngine()
engine.quit.connect(app.quit)
engine.load("main.qml")
sys.exit(app.exec())
```

main.qml:

```qml
import QtQuick.Controls

ApplicationWindow {
    visible: true
    width: 600
    height: 500
    title: "MyApp"

    Rectangle {
        anchors.fill: parent

        Text {
            id: my_text
            anchors.centerIn: parent
            text: "My APP"
            font.pixelSize: 45
            Accessible.role: Accessible.StaticText
            Accessible.name: my_text.text
            Accessible.description: "my app text"
        }

        Accessible.role: Accessible.StaticText
        Accessible.name: "Rectangle"
        Accessible.description: "my app rectangle"
    }
}
```

The code is the same on both systems. On Monterey I can select any region with the Accessibility Inspector; on Sonoma, only the title. Does anyone know a solution, or has anyone run into a similar problem?
Replies: 4 · Boosts: 0 · Views: 305 · Activity: Apr ’24
XCUITest fails because of accessibility not loading
Hi all, I ran into an XCUITest issue where my tests fail randomly with the message that my app "has not loaded accessibility". I run the tests in my scheme in random order, and with every run some other random test fails, while a test that previously failed will later pass. So I know it's not the test itself that is causing the problem. There seems to be a 60s wait for accessibility to load, and I have not found any information on how I could extend the timeout or convince accessibility to load faster. I found some older posts, but no real solution was found there either.

This completely blocks my test execution, since I cannot get a single run in which all tests pass (and I only have 10 tests in that scheme). I ran the scheme against an iOS Simulator and against a real iOS device and get the same random accessibility errors on either execution platform. I've tried iOS 17.2 and 17.4 and get the same behavior. Strangely enough, the test actually keeps running (and technically passes all the checks and asserts in the actual test) after the accessibility error, but then XCTest marks it as failed because of the earlier accessibility error. Any ideas what I could try or what the reason could be? Thanks!

I'm using Xcode Version 15.3 (15E204a) and test on iOS 17.2 and iOS 17.4.

```
Test Case '-[MyScheme.MyTestClass testMyFunction]' started.
    t = 0.00s Start Test at 2024-04-19 01:32:09.225
    t = 0.02s Set Up
    t = 0.02s Open com.myCompany.myApp
    t = 0.03s Launch com.myCompany.myApp
    t = 0.25s Wait for accessibility to load
    t = 60.30s Capturing diagnostic spindump
/Users/some/path/goes/here/MyTestCase.swift:120: error: -[MyScheme.MyTestClass testMyFunction] : Application 'com.myCompany.myApp' has not loaded accessibility
    t = 60.34s Waiting 60.0s for "test" Button to exist
    t = 61.38s Checking `Expect predicate `exists == 1` for object "test" Button`
    t = 61.39s Checking existence of `"test" Button`
... more stuff happening in the test here ...
```
Replies: 1 · Boosts: 0 · Views: 364 · Activity: Apr ’24
Assigning accessibility label for UIActivityViewController
I'm trying to get VoiceOver to announce an accessibility label for a UIActivityViewController (the share sheet that pops up on tapping a share button) before the focus moves to one of the activity items inside it. To achieve this, I'm assigning an accessibilityLabel to the UIActivityViewController's view, but it looks like I'm not able to override the default accessibility implementation of UIActivityViewController. Is there any way to override the existing accessibility behavior of UIActivityViewController and announce an accessibility label for the share options? Thanks in advance!!
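For reference, a minimal sketch of the approach described above; the label text is a hypothetical example, and, as the post notes, this does not override the controller's built-in behavior:

```swift
import UIKit

// Sketch of the attempt: assign a label to the share sheet's root view
// before presenting it.
func presentShareSheet(from viewController: UIViewController, items: [Any]) {
    let activityVC = UIActivityViewController(activityItems: items,
                                              applicationActivities: nil)
    activityVC.view.accessibilityLabel = "Share options" // hypothetical label
    viewController.present(activityVC, animated: true)
}
```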
Replies: 1 · Boosts: 0 · Views: 260 · Activity: Apr ’24