Hi,
I am writing in the hope of receiving some clarification about the rationale of the audit type sufficientElementDescription, in the context of the Accessibility Audit API.
Please see my test below:
And another example, in the context of Xcode, where the strings visible in the UI are also set as the accessibility labels of their respective elements.
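For illustration, a minimal test of this shape (a sketch using the Xcode 15+ XCTest audit API, not my exact code):

import XCTest

// Sketch: run the accessibility audit restricted to the
// sufficientElementDescription audit type (XCTest, Xcode 15+).
final class AccessibilityAuditTests: XCTestCase {
    func testSufficientElementDescription() throws {
        let app = XCUIApplication()
        app.launch()
        try app.performAccessibilityAudit(for: .sufficientElementDescription)
    }
}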
Thanks for your help!
Accessibility
Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.
Posts under Accessibility tag
A common UI idiom in Apple's first-party iOS apps is a circle icon with three dots in the upper right of the screen. This serves as a pop-up menu of more options. Some examples include:
Apple Music, Library tab
Photos, Album view
Reminders
In all these cases, VoiceOver reads this element as "More, Button".
In my SwiftUI app, I've implemented a visually identical button.
Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise in line with the apps mentioned above?
Using gesture recognizers, it is easy to implement a long-press gesture on iOS that opens a menu, shows a preview, or does something else, and you can specify how long the user must hold down a finger before the gesture recognizer fires.
But I have not yet found a way to determine the default long-press duration configured in the system's Accessibility settings under "Haptic Touch" (the available options there are Fast, Standard, and Slow).
Is it possible to read this setting so my app can adapt to it as well?
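For reference, the gesture setup in question; a minimal UIKit sketch, with the duration hard-coded because I have found no public API exposing the Haptic Touch speed setting:

import UIKit

final class MenuViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Long-press recognizer with an explicit duration. The 0.5 s value is
        // UIKit's default and is NOT read from the Haptic Touch setting,
        // which is exactly the problem described above.
        let longPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(handleLongPress))
        longPress.minimumPressDuration = 0.5
        view.addGestureRecognizer(longPress)
    }

    @objc private func handleLongPress(_ recognizer: UILongPressGestureRecognizer) {
        guard recognizer.state == .began else { return }
        // Open a menu, show a preview, etc.
    }
}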
Hello,
I’m developing a sandboxed macOS app using Qt, which will be distributed via the Mac App Store. The app:
Monitors the clipboard to store copied items.
Overrides the paste function of the operating system via keyboard shortcuts.
Modifies clipboard content, replacing what the user pastes with stored data.
So, I have some questions:
Can a sandboxed app continuously read and modify clipboard content? (See the polling sketch after this list.)
What entitlements are required?
What permissions should I request from the user to ensure that my app works?
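For context, the monitoring piece is plain NSPasteboard polling; a minimal sketch, assuming the usual timer-plus-changeCount pattern, since NSPasteboard posts no change notifications:

import AppKit

// Sketch: poll the general pasteboard and react when its changeCount moves.
final class ClipboardMonitor {
    private var lastChangeCount = NSPasteboard.general.changeCount
    private var timer: Timer?

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            guard let self else { return }
            let pasteboard = NSPasteboard.general
            guard pasteboard.changeCount != self.lastChangeCount else { return }
            self.lastChangeCount = pasteboard.changeCount
            if let copied = pasteboard.string(forType: .string) {
                // Store the copied item for later pasting.
                print("Copied: \(copied)")
            }
        }
    }

    func stop() { timer?.invalidate() }
}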
Any guidance would be greatly appreciated!
Thanks in advance!
Beril Bayram
Hello Apple Community,
I'm experiencing an issue with "Vocal Shortcuts" on iOS. I created a trigger in Vocal Shortcuts to run a specific shortcut, and it works perfectly on the first day. However, by the next day, the voice command stops functioning entirely. To make it work again, I have to disable and then re-enable Vocal Shortcuts in the settings.
I've tested this on multiple devices (iPhone 11, iPhone 13, and iPhone X), all running the latest iOS version, and the same problem occurs on each one. Is there any additional configuration needed, or could this be a bug?
Any advice or insights would be greatly appreciated!
Thank you in advance,
Hey there! Hope you are starting the year with great joy.
My situation
I'm building a new product based on detecting certain text on screen in real time. The product targets only the Mac and is built with Swift.
My problem
I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target.
My discoveries so far
I've tried OCR, but it is too slow for what I'm building, so the only viable way I can think of is the Accessibility API.
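The closest thing I've found so far is the parameterized kAXBoundsForRange attribute, which returns the screen bounds of a character range instead of the whole element; a sketch, assuming the element actually supports this attribute:

import ApplicationServices

// Sketch: ask an AXUIElement for the on-screen bounds of a character range
// via the parameterized kAXBoundsForRange attribute. `element` and `range`
// are assumed to come from an existing AX traversal.
func bounds(of range: CFRange, in element: AXUIElement) -> CGRect? {
    var cfRange = range
    guard let axRange = AXValueCreate(.cfRange, &cfRange) else { return nil }
    var result: CFTypeRef?
    let err = AXUIElementCopyParameterizedAttributeValue(
        element,
        kAXBoundsForRangeParameterizedAttribute as CFString,
        axRange,
        &result
    )
    guard err == .success,
          let value = result,
          CFGetTypeID(value) == AXValueGetTypeID() else { return nil }
    var rect = CGRect.zero
    AXValueGetValue(value as! AXValue, .cgRect, &rect)
    return rect
}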
Thank you in advance.
I am currently developing a macOS application that listens for system-wide mouse clicks to simulate typing with user-provided text. The app requires Accessibility permissions to function properly, and I want to ensure compliance with Apple’s latest privacy and security guidelines.
The app listens to global mouse clicks.
It simulates keyboard input with user-provided text.
I would like detailed guidance on the following aspects:
What specific entitlements are required to allow system-wide mouse-click monitoring and simulated user input? (A sketch of the monitoring half follows this list.)
Should App Sandbox be enabled or disabled?
What Info.plist keys are required to explain global mouse-click monitoring and keyboard input simulation?
What should the privacy manifest configuration look like?
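On the first question, my current understanding (which may be incomplete) is that this goes through a CGEvent tap plus the user-granted Accessibility permission under Privacy & Security, rather than an entitlement, and that event taps are generally incompatible with the App Sandbox; a sketch:

import Cocoa

// Sketch: listen-only global mouse-click monitoring via a CGEvent tap.
// Requires the user to grant Accessibility permission; assumed to be
// incompatible with App Sandbox.
func startClickMonitor() {
    // Prompt for Accessibility permission if it has not been granted yet.
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
    guard AXIsProcessTrustedWithOptions(options) else { return }

    let mask = CGEventMask(1 << CGEventType.leftMouseDown.rawValue)
    guard let tap = CGEvent.tapCreate(
        tap: .cgSessionEventTap,
        place: .headInsertEventTap,
        options: .listenOnly,
        eventsOfInterest: mask,
        callback: { _, _, event, _ in
            print("Click at \(event.location)")
            return Unmanaged.passUnretained(event)
        },
        userInfo: nil
    ) else { return }

    let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
    CGEvent.tapEnable(tap: tap, enable: true)
}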
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac
However, it's not immediately clear a) how to get one's app in this list, and b) if the usual methods from iOS to react to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day?
For context, my app is a macOS-native SwiftUI app.
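The iOS-side approach I would hope carries over is reading Dynamic Type from the SwiftUI environment; a sketch of that pattern (whether macOS 15's per-app text size feeds this value is exactly the open question):

import SwiftUI

// Sketch: semantic fonts plus the dynamicTypeSize environment value, the
// usual way an app reacts to the user's text size on iOS.
struct SizeAwareView: View {
    @Environment(\.dynamicTypeSize) private var dynamicTypeSize

    var body: some View {
        Text("Current size: \(String(describing: dynamicTypeSize))")
            .font(.body) // semantic fonts scale with the user's setting
    }
}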
My SwiftUI app uses an Image with a tap gesture:
Image(systemName: "xmark.circle.fill")
    .accessibilityIdentifier(kTextFieldClearButton)
    .foregroundColor(.secondary)
    .padding(.trailing, 6)
    .onTapGesture {
        dataSource.textFieldText = ""
    }
In a UI test, I want to tap this image to execute its action:
let clearButton = app.images[kTextFieldClearButton]
clearButton.tap()
However, the action is not executed.
I then set a breakpoint at clearButton.tap() to execute lldb commands. Here are the results:
(lldb) p clearButton.isHittable
t = 439.54s Find the "TextFieldClearButton" Imag
(Bool) true
e
It is a little strange that "Image" has been interrupted by (Bool) true, but the image is hittable.
p clearButton.isAccessibilityElement
gives
(lldb) p clearButton.isAccessibilityElement
(Bool) false
I don't understand why this Image is not an accessibility element. I thought SwiftUI views were accessible by default.
What can I do to make it accessible so that clearButton.tap() works as expected?
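One workaround I am considering (a sketch, assuming a real Button is acceptable here) is to replace the tap gesture with a Button, which is exposed to accessibility by default:

// Sketch: a Button is an accessibility element by default, unlike a bare
// Image with .onTapGesture, so the UI test can find and tap it.
Button {
    dataSource.textFieldText = ""
} label: {
    Image(systemName: "xmark.circle.fill")
        .foregroundColor(.secondary)
}
.accessibilityIdentifier(kTextFieldClearButton)
.padding(.trailing, 6)

The test would then query app.buttons[kTextFieldClearButton] instead of app.images[kTextFieldClearButton].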
Following the official documentation, I'm trying to create a set of three localised Help Books.
The Help Books should be available in Spanish, English, and Polish. Presently, I'm trying to complete the English version.
App Structure
This is a plugin application consisting of the main app and the plugin. The main app structure looks as follows:
Files
. <XcodeProject Top>
├── Localizable.xcstrings
├── MyAppExtension
│ ├── MyAppExtension.swift
│ └── <other swift files>.swift
├── MyApp
│ ├── Info.plist
│ ├── +Array.swift
│ ├── +ButtonStyle.swift
│ ├── <other app swift files>.swift
├── Resources
└── MyApp.help
└── MyApp.help
└── Contents
├── Info.plist
└── Resources
├── English.lproj
│ ├── ExactMatch.plist
│ ├── InfoPlist.strings
│ ├── MyApp.helpindex
│ ├── MyApp.html
│ └── pgs
└── shrd
MyApp / MyApp.help / Info.plist file
Contains the following values:
Bundle name: MyApp
HPDBookAccessPath: MyApp.html
HPDBookTitle: My App Help
Default localization: en_gb
MyApp / Info.plist file
Contains the following entries:
Help Book directory name: MyApp.help
Help Book Identifier: MyApp Help
Build phase
The Copy Bundle Resources phase copies MyApp.help into MyApp/Resources.
Questions
Is the provided folder structure valid for creating localised Help Books?
Is there anything missing from, or in the wrong place across, the Info.plist files?
Why does MyApp -> Help open the main Help menu instead of the app's help?
For practice, I have implemented accessibility labels and announcements in a very simple test app (all SwiftUI, all iOS 18).
The app is not localized, default language is English.
When running this on a German phone, odd things happen in the localization. My accessibility labels are read with an accent, but when they contain a URL, the "dots" are read as the German "Punkt" (with an English accent).
When I provide the same text as an accessibility announcement, that text (which is in English) is read with a German voice.
I am also providing a Button with an "arrow.clockwise" image, and VoiceOver reads this in an English voice as "Refresh, Button". This is great and was to be expected. However, when the button is disabled, VoiceOver reads "Refresh, grau dargestellt, Button", all in an English voice.
Is this an error? Am I doing it wrong?
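For reference, the announcement is posted along these lines (a sketch using the UIKit notification API, which may differ from my exact code):

import UIKit

// Sketch: post a spoken announcement. Which voice and language VoiceOver
// picks for this English string is the behavior in question.
func announceRefresh() {
    UIAccessibility.post(notification: .announcement,
                         argument: "Refreshed from https://example.com")
}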
The video at the link below should show the issue:
https://share.icloud.com/photos/0757FJW2Q3fsA_cdhMX6ls46Q
Accessibility broke after updating to Xcode 16.1.
There is a call to accessibilityLabel that sets an a11y label for the title of a view.
This used to work (pronounced by VoiceOver) with Xcode 15.4 + iOS 17.5.
With Xcode 16.1 + iOS 18.1, on a physical device or the iOS Simulator, Accessibility Inspector shows no a11y label set.
I tried Xcode 16.2 beta 3 with the same result: accessibilityLabel does not work; the a11y label is not set.
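A minimal sketch of the pattern (with a hypothetical view and label), in case anyone wants to reproduce:

import SwiftUI

// Sketch: a title view whose accessibility label is set explicitly. Per the
// report above, Accessibility Inspector shows no label for this under
// Xcode 16.1 + iOS 18.1.
struct TitleView: View {
    var body: some View {
        Text("Dashboard")
            .accessibilityLabel("Dashboard, main screen")
    }
}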
It is outrageous that Apple continue to fail to implement the Fullscreen API web standard for web apps on iPhone only, which is so important to accessibility and web app functionality.
The only possible reason for this block is commercial: to promote iOS apps instead of browser based web apps.
To quote a client from a major agency just now - a typical enquiry:
We value accessibility greatly, and we noticed that the embedded player is missing a full screen button on iPhone.
Everything else works perfectly fine, including a full screen button that appears on the mobile webpage on android devices.
Is there any way we can include a button to enable full screen view for our viewers in your player that are going to watch it on iOS devices?
To which, as usual, we have to reply:
Apple unfortunately block fullscreen mode from being used with all web applications on iPhone.
Apple will allow this to be displayed fullscreen on MacBooks and iPads, but currently not on iPhone - so we have to hide the fullscreen button there.
So fullscreen works on all devices and browsers apart from on iPhone.
As you've seen with Android, all other devices and browsers follow the universal 'Fullscreen API' web standard to allow full screen.
You're probably familiar with seeing the fullscreen button on normal linear videos on iPhone.
These use Apple's native video player, which doesn't let buttons and scripts be used on top of it - just a single video, not an interactive web application.
Our player looks like a video player but it is actually a web app combining multiple different video clips connected together by code and styling.
They block it on iPhones for reasons known only to them, but the assumption is that it is to incentivise people to make iOS apps instead of web apps.
The web development community is hopeful that Apple will change this unfortunate restriction soon, but we have been waiting a long time in vain.
We have to send this to a lot of people. It's a very bad look for Apple.
In less than a month it will be 2025. We have been waiting years for this.
The web standard documentation showing universal support on other devices and browsers is here:
https://developer.mozilla.org/en-US/docs/Web/API/Fullscreen_API
This is not acceptable. It is time for Apple to stop blocking this important accessibility web standard for commercial reasons - only on iPhone. To whoever is in charge of these decisions in the Safari/Webkit team: Please just enable Fullscreen API for web apps on iPhone as soon as possible.
In SwiftUI, iOS 18.1.1, Xcode 16.1, the following control:
Text(12345678, format: .byteCount(style: .binary))
displays text with MB (megabytes) unit, but German VoiceOver reads it as "millibars".
I tried explicitly specifying the units with:
Text(12345678, format: .byteCount(style: .memory, allowedUnits: .mb))
but the result is the same (German VoiceOver still says "millibars").
Aside from creating my own accessibility label, is there any way to work around that?
Xcode is crashing various extensions on my system, most often the Accessibility extension. It's a pain because it pops up a window - usually while I'm doing something else - that takes up most of the screen and the focus.
Am I the only one this is happening to?
Hi guys,
I'm trying to add accessibility labels to static texts and custom SwiftUI views. Example:
MyView {
    ...
}
//.accessibilityElement()
.accessibilityElement(children: .combine)
//.accessibilityRemoveTraits(.isStaticText)
//.accessibilityAddTraits(.isButton)
.accessibilityLabel("ACCESSIBILITY LABEL")
.accessibilityHint("ACCESSIBILITY HINT")
When using the VoiceOver or Hover Text accessibility features, focus moves only between active elements, not to static elements.
When I add .focusable() it works, but I don't want to make those elements focusable when all accessibility features are off.
I suppose I could do something like this:
.focusable(UIApplication.shared.accessibility.voiceOver.isOn || UIApplication.shared.accessibility.hoverText.isOn)
Note: this is just pseudocode, because I don't remember exactly how to detect current accessibility settings.
However, using focusable() with conditions on hundreds of static texts in an app seems like overkill. Also, accessibility focus is needed on some control containers where we already have somewhat complex focus handling, with conditions in focusable(...) on parent and child elements, so extending that for accessibility seems too complicated.
Is there a simple way to tell accessibility that an element is focusable specifically for Hover Text and for VoiceOver?
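For the detection half, VoiceOver at least has a real runtime check; a sketch below (I have not found an equivalent public API for Hover Text):

import SwiftUI
import UIKit

// Sketch: make a static text focusable only while VoiceOver is running.
// UIAccessibility.isVoiceOverRunning is a real check; an equivalent public
// check for Hover Text is not known to me.
struct StaticInfoText: View {
    var body: some View {
        Text("Sci-Fi * 8pm - 10pm")
            .focusable(UIAccessibility.isVoiceOverRunning)
    }
}

Note the value is read once at body evaluation and is not reactive, which is part of why this approach feels fragile.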
Example of what I want to accomplish for TV content:
VStack {
    HStack {
        Text("Terminator")
        if parentalLock {
            Image(named: .lock)
        }
    }
    .accessibilityLabel(for: hover, "Terminator - parental lock")
    Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
        .accessibilityLabel(for: hover, "Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live")
}
.accessibilityLabel(for: voiceover, "Terminator, Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live, parental lock")
I watched all the Accessibility WWDC videos (2016, 2022, 2024) and googled for several hours, but I couldn't find any solution for static texts and custom views. From those videos it appears .accessibilityLabel() should be enough, but it clearly works only on active elements and does not work for other SwiftUI views on tvOS without focusable().
Can this be done without using focusable() with conditions for detection which accessibility feature is on?
The problem with focusable() would be that, for accessibility, I may need a text read for the parent view while focus needs to be placed on a child element. I remember problems where, with focusable() set on a parent view, the child was no longer focusable, or something like that; simply put: complications in focus logic.
Thanks.
Description
As of iOS 18, AVAudioSession.setPreferredIOBufferDuration ignores the requested buffer size when Sound Recognition or Vocal Shortcuts is enabled. This results in 1) much larger buffer sizes and 2) mismatched buffer sizes between input and output buffers, which causes ‘glitchy’ audio and increased latency.
Additionally, when this issue occurs AVAudioSession.setPreferredIOBufferDuration continues to return ‘true’ and no error is produced.
Steps to Reproduce:
Enable Vocal Shortcuts on a device running iOS 18. Enable at least one shortcut (e.g. Control Center).
Open or clone the example project (https://github.com/cwalo/SoundRecognitionBug)
Build and install the example project
Attach a headset and launch the application
Observe console logs showing
a requested buffer size of 0.005805 (256 samples @ 48k)
an actual buffer size of 0.023220 (1104 samples @48k - this is regularly the resulting buffer size in all of our tests)
Quit the app and detach the headset. Enable mutesOutput in AudioSystem.mm (to avoid feedback)
Launch the application
Observe
Same result from step 4
Mismatched hardware buffer size of 1104 and recorded frame count of 1024
Mismatched playbackCount and recordCount
Quit the app and disable vocal shortcuts
Launch the app
Observe IOBufferDuration matching the requested duration and matched buffer sizes (expected behavior)
Expected results:
Requested IOBufferDuration is respected or AVAudioSession returns false or error is produced
Input and output buffer sizes match
Device(s): iPhone 11 Pro, iPad Pro
OS: iOS 18.0.1
Environment: Xcode 16.1
FB: FB15715421
Related to: https://forums.developer.apple.com/forums/thread/765477
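For anyone reproducing without the linked project, the request-and-verify step amounts to this sketch (values illustrative, not the project's exact code):

import AVFoundation

// Sketch: request a small IO buffer and compare it with what the session
// actually grants. Per the report, the two diverge while Sound Recognition
// or Vocal Shortcuts is enabled, yet no error is thrown.
func configureSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)
    try session.setPreferredIOBufferDuration(256.0 / 48_000.0) // ≈ 5.3 ms
    try session.setActive(true)
    print("requested: \(256.0 / 48_000.0), granted: \(session.ioBufferDuration)")
}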
Hi Team,
We are integrating SwiftUI Charts' BarMark. The UI looks good, but when we try setting up custom accessibility (ADA), it doesn't reflect/override the accessibility label/value we set manually.
Is this an iOS defect, or is there a workaround?
Thanks in advance.
Sample:
Chart(data) {
    BarMark(
        x: .value("Category", $0.department),
        y: .value("Profit", $0.profit)
    )
    .foregroundStyle(by: .value("Product Category", $0.productCategory))
    .accessibilityIdentifier("BarMark")
    .accessibilityLabel("Dep: \($0.department)")
    .accessibilityValue("Profile: \($0.profit) Category: \($0.productCategory)")
}
Hello everyone,
I’m experiencing an issue with the accessibility feature "Keyboard Navigation" in iOS 18.0. Specifically, when enabling the "Allow Full Keyboard Access" option and attempting to navigate to inline navigation titles within stock iOS apps, the navigation doesn’t seem to work as expected.
Here’s how to replicate the issue:
Enable the accessibility option: Allow Full Keyboard Access (Settings > Accessibility > Keyboards > Full Keyboard Access).
Open any stock iOS app that uses an inline navigation style (for example, the Mail or Settings app).
Press the Tab key to cycle through items on the inline navigation bar.
In iOS 18.0, pressing the Tab key does not allow navigation to the inline navigation title, which was previously possible. The issue affects iOS 17 and iOS 18; it worked fine on iOS 16.
Has anyone else encountered this issue or have suggestions for a workaround? Would love to hear your thoughts.
Prior to iOS 18.1, App-prefs:Wallpaper deeplinked to the wallpaper settings. On iOS 18.1, it now deeplinks to the settings app. Is there a new URL to deeplink to the Wallpaper settings?
The following URLs have been tested and do not work on iOS 18.1:
App-prefs:Wallpaper
App-prefs:wallpaper
App-prefs:root=wallpaper
App-prefs:root=Wallpaper
App-prefs:root=WALLPAPER
App-prefs:root=General&path=Wallpaper
prefs:root=Wallpaper
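For completeness, each candidate was exercised roughly like this (a sketch; App-prefs: is a private scheme, so none of this is API-guaranteed, and canOpenURL would additionally require an LSApplicationQueriesSchemes entry):

import UIKit

// Sketch: attempt one of the candidate wallpaper deeplinks.
func openWallpaperSettings() {
    guard let url = URL(string: "App-prefs:Wallpaper") else { return }
    UIApplication.shared.open(url) { success in
        print("Opened wallpaper settings: \(success)")
    }
}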