Hi, I'm currently registering notifications on numerous AXUIElementRefs. I would like a timestamp of when each event occurs, but I cannot find a reliable way to get one.
Grabbing a timestamp when the callback fires isn't reliable because the order of callback execution is arbitrary. I know the run loop API is mostly open-source, and this is a bit of a reach, but is it possible to hook into the CFRunLoopSourceSignal call from the AXObserverRef?
Somewhere in the Apple API stack these notifications are being triggered. My question is: do they record a timestamp, and are there any public or private APIs to retrieve it?
My goal is to reliably gather in what order certain events happen (e.g. window move, focus, etc.).
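For what it's worth, the best I've managed so far is to stamp each notification at the top of the observer callback. A minimal sketch (the TimestampedEvent/eventLog names are mine, and this records delivery time on the run loop, not the time the event actually occurred):

import ApplicationServices
import Darwin

// Illustrative only: stamp each AX notification as early as possible.
struct TimestampedEvent {
    let notification: String
    let receivedAt: UInt64 // mach ticks at callback entry
}

var eventLog: [TimestampedEvent] = []

let axCallback: AXObserverCallback = { _, _, notification, _ in
    // Delivery time, not occurrence time; ordering within one run loop
    // pass is still not guaranteed to match the original event order.
    eventLog.append(TimestampedEvent(notification: notification as String,
                                     receivedAt: mach_absolute_time()))
}

func makeObserver(for pid: pid_t) -> AXObserver? {
    var observer: AXObserver?
    guard AXObserverCreate(pid, axCallback, &observer) == .success,
          let observer else { return nil }
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       .defaultMode)
    return observer
}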
Accessibility
Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.
Posts under Accessibility tag
122 Posts
I'm working on converting an app to SwiftUI, and I have a menu that used to be several table cells in a storyboard, but I moved it to an embedded SwiftUI view instead.
Here's the old way (from override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell ):
cellReuseID = "BillingToolsCell"
let cell = tableView.dequeueReusableCell(withIdentifier: cellReuseID, for: indexPath)
if let billingToolsCell = cell as? BillingToolsCell {
billingToolsCell.billingToolsOptions.text = billingTools[indexPath.row].title
// Accessibility
billingToolsCell.isAccessibilityElement = true
billingToolsCell.accessibilityIdentifier = "Billing_\(billingTools[indexPath.row].title.replacingOccurrences(of: " ", with: ""))"
}
return cell
And here's the new way I'm creating the cell:
cellReuseID = "BillingToolsSwiftUI"
if let cell = tableView.dequeueReusableCell(withIdentifier: cellReuseID, for: indexPath) as? SwiftUIHostTableViewCell<BillingToolsView> {
let view = BillingToolsView(billingToolVM: BillingToolViewModel()) { segueID in
self.performSegue(segueID: segueID)
}
cell.host(view, parent: self)
return cell
}
Here's the SwiftUI view:
struct BillingToolsView: View {
    @StateObject var billingToolVM: BillingToolViewModel
    var navigationCallback: (String) -> Void

    var body: some View {
        VStack {
            VStack {
                ForEach(self.billingToolVM.billingToolList, id: \.self) { tool in
                    Button {
                        navigationCallback(tool.segueID)
                    } label: {
                        BillingToolsRowView(toolName: tool.title)
                        Divider().foregroundColor(AFINeutral800_SwiftUI)
                    }
                    .accessibilityIdentifier("Billing_\(tool.title.replacingOccurrences(of: " ", with: ""))")
                }
            }
            .padding(.vertical)
            .padding(.leading)
            .background(AFINeutral0_SwiftUI)
        }
    }
}
If I check the accessibility inspector, I can see the identifier - here it is showing Billing_PaymentHistory:
But when the testers try to run their tests in Appium, they don't see any identifier at all:
Did I mess up setting up the accessibility identifier somehow? Or do the testers need to update their script?
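One way to narrow this down (a sketch, not a confirmed fix): since Appium drives XCUITest on iOS, a quick UI test can show whether the identifier is visible to XCUITest at all. If it isn't, the problem is in the app's accessibility tree rather than the testers' script:

import XCTest

final class BillingToolsAccessibilityTests: XCTestCase {
    func testBillingIdentifierIsExposed() {
        let app = XCUIApplication()
        app.launch()
        // Navigate to the billing screen first if needed.
        let row = app.buttons["Billing_PaymentHistory"]
        XCTAssertTrue(row.waitForExistence(timeout: 5))
    }
}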
Hi all,
Since iOS 17.2, all the elements with the attribute role="text" and attr.aria-label="anytext" are no longer read by VoiceOver.
Example code:
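Roughly like this (a simplified illustration; the exact markup may differ):

<span role="text" aria-label="anytext">Some visible text</span>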
Could you please help me understand how to solve this issue?
I have a collection view with a hierarchical data source. Because of that, I create some cells with a UICellAccessoryOutlineDisclosure accessory with style UICellAccessoryOutlineDisclosureStyleCell, so one can either tap the cell to open a detail view or tap the outline disclosure accessory to reveal hidden child data.
Question:
How should I configure the outline disclosure accessory to work with VoiceOver on?
It works fine without VoiceOver, but with VoiceOver it seems that any gesture always leads to opening the detail view.
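One direction that may be worth trying (a sketch only, assuming a diffable data source with section snapshots; dataSource, section, and item stand in for your own names): expose the expansion as a custom VoiceOver action on the cell, so the default double-tap can keep opening the detail view:

// Inside the cell registration/configuration handler:
cell.accessibilityCustomActions = [
    UIAccessibilityCustomAction(name: "Expand or collapse") { [weak self] _ in
        guard let self else { return false }
        // Toggle the item's expansion in the section snapshot.
        var snapshot = self.dataSource.snapshot(for: section)
        if snapshot.isExpanded(item) {
            snapshot.collapse([item])
        } else {
            snapshot.expand([item])
        }
        self.dataSource.apply(snapshot, to: section, animatingDifferences: true)
        return true
    }
]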
I have 5 sticker packs in the App Store. I had an older Mac that finally became too old for more macOS updates, and therefore too old to update Xcode, so I haven't done any updates to my packs or looked at Xcode in nearly 3 years. I FINALLY got a new Mac. I've got Xcode 15 installed with the latest updates - and it looks so foreign!
Things I can't find:
Where in Xcode can I change the version and build number? This used to be so obvious. I decided to start from scratch with my project: clicked on New Sticker Pack App, dragged in my icons and stickers and the new updates I've created. When I went to archive, it said it can't because it already exists. Oh boy.
In addition, I'm also lost on how to put in ALT tags for accessibility. This was also super obvious in the version of Xcode I was using 3 years ago - I could click on each sticker, and in the right pane I could enter the words VoiceOver would speak for visually impaired users. Now that is gone. One of my reviews thanked me for making my sticker pack accessible, and I don't want to lose that ability - but I cannot find where the heck it's hiding. The On-Demand Resource Tags definitely aren't it, since adding info in one puts the same tags on ALL the stickers.
Hello,
We can export and save a great summary audit result in HTML by using the Accessibility Inspector. Is there any way we could get the same audit results from UI tests and integrate them into CI/CD, so that we can produce a monthly audit report for our app designer?
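It's not the HTML export itself, but since Xcode 15, XCTest can run the same audit from a UI test, and the results land in the .xcresult bundle that CI systems can archive. A minimal sketch:

import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Runs the same checks the Accessibility Inspector audit performs;
        // any issues are reported as test failures in the .xcresult.
        try app.performAccessibilityAudit()
    }
}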
I have been updating my application periodically for 10 months, and it always appeared when I searched for it from my iPhone as "plattaforma" (yes, with a double T). But for a month now it only appears if I search for it as "plattaforma.com", or if I add its category ("productivity") or the name of the development account it belongs to, giving "Plattforma productividad". Could it be because the name of my application is very generic, now that platformer-type games like Mario Bros are appearing in the results?
I have reviewed the configuration in Xcode and everything is the same as in previous months, and in App Store Connect everything remains the same.
What can I do? I appreciate your help.
We recently updated to Xcode 15.1, started using the iOS 17.2 simulator, and ran into a blocking issue with our UITests. Setting an accessibility identifier on a UIButton with an image no longer works: iOS automatically sets the label to the file name used for the button's image and ignores (overwrites) the id and value we set in code. iOS 17.0 with Xcode 15.1 still works. I spent 2 days on this and still cannot find a solution. Has anyone had a similar issue?
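For reference, a minimal version of the setup that broke (the names here are illustrative):

let button = UIButton(type: .custom)
button.setImage(UIImage(named: "gear_icon"), for: .normal)
// On iOS 17.2 the element's label comes back as "gear_icon",
// and the identifier below is not visible to XCUITest.
button.accessibilityIdentifier = "settingsButton"
button.accessibilityLabel = "Settings"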
I did file a feedback https://feedbackassistant.apple.com/feedback/13515676
Thanks!
Hi everyone, do you know when real-time subtitles for Italian-language calls will be implemented on iPhone?
I am working on an iOS app which has VoiceOver (speak-out) and Voice Control (voice-command) features. Currently this works fine when users turn ON both VoiceOver and Voice Control in Settings > Accessibility.
Now the client has asked me to provide the same functionality even when both VoiceOver and Voice Control are turned OFF in Settings.
Please advise: will Apple approve this if I use third-party library code to implement the VoiceOver and Voice Control behavior? If Apple has no issue with it, please suggest some good libraries.
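For the speak-out part, no third-party library should be needed: AVSpeechSynthesizer works regardless of whether VoiceOver is enabled. A minimal sketch:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    // Speaks the given text aloud even when VoiceOver is off.
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}

For the voice-command part, Apple's own Speech framework (SFSpeechRecognizer) provides in-app speech recognition, which sidesteps the third-party approval concern.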
Many thanks.
I'm writing a program that requires accessibility permissions, and I use AXIsProcessTrustedWithOptions to check and direct the user to the Accessibility pane for authorization. In the conventional flow, once the Accessibility pane is opened, the system automatically adds the corresponding program, and the user only needs to turn the permission on. But now, with the Accessibility pane open, I don't see my program being added automatically.
Next, I tried to add it manually: click + in the Accessibility pane, select the program I built, and add it. But when I clicked Add, my program was not added; the Accessibility list still did not show it, and the user could not grant it permission.
I don't think this is related to it being a development build, because other debug programs developed and compiled with Xcode can be added normally. So I want to know: what are the possible reasons it cannot be added, and which parts should I check?
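For reference, the prompt call in Swift looks like this (a minimal sketch). Note that TCC keys its entries to the app's code signature, so an ad-hoc or inconsistently signed build can fail to register; running tccutil reset Accessibility and re-adding a consistently signed build is a common remedy.

import ApplicationServices

let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
// If not yet trusted, this should open System Settings with the app
// pre-listed under Privacy & Security > Accessibility.
print("Accessibility trusted:", trusted)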
On iOS, when VoiceOver is running, double-tapping a UITextView quickly switches the cursor between the beginning and end of the text.
Problem
The problem is that when the text contains an NSTextAttachment, this no longer works.
Here is some code to reproduce the problem.
import UIKit

class ViewController: UIViewController {
    private var textView: UITextView?

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = UIColor.white
        textView = UITextView(frame: CGRect(x: 60, y: 220, width: 240, height: 180))
        textView?.layer.borderColor = UIColor.black.cgColor
        textView?.layer.borderWidth = 1
        textView?.font = .systemFont(ofSize: 18)
        view.addSubview(textView!)

        let button = UIButton(type: .system)
        button.frame = CGRect(x: 120, y: 120, width: 60, height: 60)
        button.setTitle("click", for: .normal)
        button.addTarget(self, action: #selector(self.click), for: .touchUpInside)
        view.addSubview(button)
    }

    // Inserts an image attachment at the current selection.
    @objc func click() {
        let emojiImage = UIImage(systemName: "iphone.circle")
        let lineHeight = UIFont.systemFont(ofSize: textView!.font?.pointSize ?? 0.0).lineHeight
        let insertedAttr = generateAttributedString(image: emojiImage, bounds: CGRect(x: 0, y: -4, width: lineHeight, height: lineHeight))
        let attr = NSMutableAttributedString(attributedString: textView!.attributedText)
        attr.replaceCharacters(in: textView!.selectedRange, with: insertedAttr)
        textView!.attributedText = attr
    }

    // Wraps an image in an NSTextAttachment-backed attributed string.
    public func generateAttributedString(image: UIImage?, bounds: CGRect) -> NSMutableAttributedString {
        let attachment = NSTextAttachment(data: nil, ofType: nil)
        attachment.bounds = bounds
        attachment.image = image
        let attrString = NSMutableAttributedString(attributedString: NSAttributedString(attachment: attachment))
        attrString.addAttribute(.font, value: textView!.font!, range: NSRange(location: 0, length: attrString.length))
        return attrString
    }
}
<!! A caveat up front: I'm not a native English speaker, so apologies if anything is hard to understand. !!>
I'm a researcher and consultant for web and mobile-native app accessibility in South Korea. On our team, each member takes charge of one platform - web, Android, or iOS - and I'm in charge of SwiftUI.
I tried to make a list that can be reordered, following these steps (condensed into the sketch at the end of this post):
I made a custom View struct adopting the View protocol.
I declared a property named 'fruit' of type [String] as @State.
I declared a VStack in the body as the root layout.
A List container is in the VStack, and a ForEach (the dynamic content) iterates Text() components into that List.
On the ForEach, I used the 'onMove(perform:)' modifier to implement reordering by drag and drop.
And last, on the 'List' container, I used the 'environment' modifier to assign 'EditMode.active' to the 'editMode' environment value.
There didn't seem to be any problems without VoiceOver, but with VoiceOver there are problems:
VoiceOver doesn't announce the dropped item's location (e.g. moved above Item, moved below Item).
No custom actions for moving are included. In UIKit, each reorder button has move actions (move up, move down).
How can I solve this in SwiftUI?
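For reference, here is the setup described above, condensed into a sketch (the view name and data are placeholders):

import SwiftUI

struct FruitListView: View {
    @State private var fruit = ["Apple", "Banana", "Cherry"]

    var body: some View {
        VStack {
            List {
                ForEach(fruit, id: \.self) { item in
                    Text(item)
                }
                .onMove { source, destination in
                    fruit.move(fromOffsets: source, toOffset: destination)
                }
            }
            // Keep the list permanently in edit mode so reorder handles show.
            .environment(\.editMode, .constant(.active))
        }
    }
}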
Hey all, I was hoping to find a solution to this issue with voiceover and if there isn't one then raise a radar as it may be a bug.
Across our app, we have some SwiftUI.Text elements that are set with AttributedString (Text(someAttributedString)) and can contain one or more links.
When no custom .accessibilityLabel is set on the Text, and that element is focused with VoiceOver, the text is read, any links within the text are announced with "link" after them, and when the text finishes, "Use the rotor to access links" is read; the rotor has a "Links" option, and selecting that rotor option allows the links within that AttributedString to be selected and opened. All good!
However, we have some instances where we want slightly modified versions of that AttributedString to be used for the accessibility label, for example when we want to fix pronunciation of certain brand or non standard words. To do this, we have something like:
Text(someAttributedString)
.accessibilityLabel(Text(someModifiedAttributedString))
Although this originally worked and read out the "link", "Use the rotor to access links", and links could be focused and selected, this no longer seems to work.
"link" will still be read out after the link portion of the text, but "Use the rotor to access links" will not be, and the "Links" option won't be available on the rotor.
I believe this issue was introduced with the iOS17 SDK/Xcode 15.
Has the API changed, and is there something else we need to set here?
Or is this a bug with the iOS 17 SDK?
Thanks!
iPhone 12 mini; iOS 17.1.2.
I'm having issues with the new version of iBook:
randomly distributed extra blank pages throughout .epub's (but not with .pdf)
The rest are issues with iBook using Accessibility/Spoken Content.
reading stops at the extra blank pages mentioned above
reading stops at the beginning of new chapters
some, but not all, commas are said aloud during the reading
some possessive 's endings are read as "ess" instead of as part of the word that precedes them, e.g.: the sun's up
'll contractions are read as "el el" instead of as part of the word that precedes them, e.g.: the sun'll get you
sometimes (consistently where it happens) the first couple of words on a new page are repeated. I'm not positive, but it may involve sentences with only a couple of words left for the next page, or perhaps dialog vs. prose, e.g.: in (page a) "How much would (page b) I need?", 'I need' is repeated, so it sounds like "How much would I need I need?"
and perhaps the most frustrating: I'm using the Siri 1 voice, and almost once a day, when I start listening, it has switched to Siri 4, and Siri 1 needs to be re-installed in order to use it.
On a related note: When using AirDrop to move files from my Mac to my iPhone, I really miss the menu that allowed me to put the file where I wanted it (Books) without the need to download it to Files then share it to Books. I was truly disturbed that it came configured to send my private material directly onto iCloud storage, which was a pain to figure out how to store locally. It felt like an underhanded move to force me to use iCloud, and I didn't appreciate it. I found others online who felt the same way, and who struggled, like me, to figure out how to use local storage.
As an author, I use Spoken Content as part of my editing, which is why I am so aware of these issues. More importantly, I want to ensure my blind/visually-restricted audience has access to my book in a listenable format. Initially, with the first problem (random blank pages), I thought there was something wrong with my book's formatting, but then I took the time to listen to other .epubs in the iBook app and found the same problem.
I hope this feedback helps you improve the Spoken Content, as well as the iBook app. I always cringe when a new major version of iOS comes out, because of the surprises and the multiple revisions it takes to get past the introduced bugs. I will end by saying the Siri 1 voice has come a long way from when I first began using it in 2020 with respect to it sounding more natural. And the ability to have my books read aloud for folks who can't afford expensive readers is priceless.
Hi all, it's already been a few weeks since I bought my new MacBook Pro M3. Everything works great, but I have an issue in the Accessibility settings: I need Live Captions because I have hearing issues. I've tried to download them many times, and I also called accessibility support, but they couldn't help me fix it.
After enabling the Live Captions option, the live captioning window shows up with "Downloading language", then in less than a second changes to "Error in downloading".
I cannot upload an image here because of the error "An error occurred while uploading this image. Please try again later."
Example: https://i.imgur.com/EMKB0Go.jpg
Does anyone know how to fix this? I tried creating a new user, safe mode, and changing networks; a support agent also had access to my Mac and ran some troubleshooting steps, but nothing worked. By the way, it works perfectly on my iPhone.
I would also like to connect my MFi hearing device (Nucleus 7) directly, without extra accessories, like I can with my iPhone, but it seems that isn't supported by Cochlear for macOS yet.
It works great on iPadOS, iOS, and watchOS, but not on macOS... what nonsense.
Please help
When Full Keyboard access is enabled, the currently focused element is indicated by a thick border (first screenshot below). If the focused element is inside a focus group, e.g. a UIScrollView, then the thick border encloses the entire focus group, and the focused element is indicated by a change in background color instead (second screenshot below). These two types of focus state seem to use the tintColor of the element.
We were advised that the change in background color does not meet WCAG standards since the contrast ratio between the non-focused state and the light blue focused state is not high enough.
Apart from changing the tintColor, is there any other way to customize the focused appearance of an element? It would be ideal if we could apply a border to the focused element even when it's contained in a focus group, rather than just changing the background color.
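One avenue that might be worth experimenting with (a sketch, not a confirmed fix): on iOS 15 and later, UIView has a focusEffect property, and UIFocusHaloEffect can draw an explicit halo around a focused element. Whether it overrides the in-group background highlight would need to be verified:

// Sketch: request an explicit halo for a focusable control (iOS 15+).
let halo = UIFocusHaloEffect(roundedRect: button.bounds,
                             cornerRadius: 8,
                             curve: .continuous)
button.focusEffect = halo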
I have an application that uses the Accessibility APIs to determine whether a browser tab with website content is present, by checking for an element with the AXWebArea role. However, some users have reported that the app stopped working after a recent update. When I check for permission using the following code, the result is YES, but the app still cannot detect the AXWebArea element in the browser, though other elements like AXButton are there:
NSDictionary *options = @{(__bridge id)kAXTrustedCheckOptionPrompt : @YES};
NSLog(@"%d", AXIsProcessTrustedWithOptions((__bridge CFDictionaryRef)options));
The issue is resolved when the app is removed and re-added to the Accessibility permissions list in the System settings. I would appreciate a detailed explanation of what could be causing this issue.
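For anyone comparing notes, the detection itself is roughly the following (a Swift sketch with no depth limit or error handling; element is an AXUIElement already obtained from the browser application):

import ApplicationServices

// Returns true if the element or any descendant has the AXWebArea role.
func containsWebArea(_ element: AXUIElement) -> Bool {
    var role: CFTypeRef?
    if AXUIElementCopyAttributeValue(element, kAXRoleAttribute as CFString, &role) == .success,
       (role as? String) == "AXWebArea" {
        return true
    }
    var children: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element, kAXChildrenAttribute as CFString, &children) == .success,
          let childArray = children as? [AXUIElement] else { return false }
    return childArray.contains(where: containsWebArea)
}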
Our users are using Apple's native Voice Control feature: https://support.apple.com/en-us/HT210417
We want to enhance our accessibility experience by adding some additional voice controlled dialogs that show up specifically when Voice Control is enabled.
It can be determined whether other Apple accessibility features are turned on via checks like UIAccessibility.isVoiceOverRunning; however, there is no equivalent option for Voice Control (note: this is different from VoiceOver).
How can I detect if a user is running Voice Control or not?
I see a lot of crashes on the iOS 17 beta involving a "Text To Speech" problem. Does anybody have a clue why TTS crashes? Is anybody else seeing the same problem?
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype: KERN_INVALID_ADDRESS at 0x000000037f729380
Exception Codes: 0x0000000000000001, 0x000000037f729380
VM Region Info: 0x37f729380 is not in any region. Bytes after previous region: 3748828033 Bytes before following region: 52622617728
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
MALLOC_NANO 280000000-2a0000000 [512.0M] rw-/rwx SM=PRV
---> GAP OF 0xd20000000 BYTES
commpage (reserved) fc0000000-1000000000 [ 1.0G] ---/--- SM=NUL ...(unallocated)
Termination Reason: SIGNAL 11 Segmentation fault: 11
Terminating Process: exc handler [36389]
Triggered by Thread: 9
.....
Thread 9 name:
Thread 9 Crashed:
0 libobjc.A.dylib 0x000000019eeff248 objc_retain_x8 + 16
1 AudioToolboxCore 0x00000001b2da9d80 auoop::RenderPipeUser::~RenderPipeUser() + 112 (AUOOPRenderPipePool.mm:400)
2 AudioToolboxCore 0x00000001b2e110b4 -[AUAudioUnit_XPC internalDeallocateRenderResources] + 92 (AUAudioUnit_XPC.mm:904)
3 AVFAudio 0x00000001bfa4cc04 AUInterfaceBaseV3::Uninitialize() + 60 (AUInterface.mm:524)
4 AVFAudio 0x00000001bfa894bc AVAudioEngineGraph::PerformCommand(AUGraphNodeBaseV3&, AVAudioEngineGraph::ENodeCommand, void*, unsigned int) const + 772 (AVAudioEngineGraph.mm:3317)
5 AVFAudio 0x00000001bfa93550 AVAudioEngineGraph::_Uninitialize(NSError**) + 132 (AVAudioEngineGraph.mm:1469)
6 AVFAudio 0x00000001bfa4b50c AVAudioEngineImpl::Stop(NSError**) + 396 (AVAudioEngine.mm:1081)
7 AVFAudio 0x00000001bfa4b094 -[AVAudioEngine stop] + 48 (AVAudioEngine.mm:193)
8 TextToSpeech 0x00000001c70b3c5c __55-[TTSSynthesisProviderAudioEngine renderSpeechRequest:]_block_invoke + 1756 (TTSSynthesisProviderAudioEngine.m:613)
9 libdispatch.dylib 0x00000001ae4b0740 _dispatch_call_block_and_release + 32 (init.c:1519)
10 libdispatch.dylib 0x00000001ae4b2378 _dispatch_client_callout + 20 (object.m:560)
11 libdispatch.dylib 0x00000001ae4b990c _dispatch_lane_serial_drain + 748 (queue.c:3885)
12 libdispatch.dylib 0x00000001ae4ba470 _dispatch_lane_invoke + 432 (queue.c:3976)
13 libdispatch.dylib 0x00000001ae4c5074 _dispatch_root_queue_drain_deferred_wlh + 288 (queue.c:6913)
14 libdispatch.dylib 0x00000001ae4c48e8 _dispatch_workloop_worker_thread + 404 (queue.c:6507)
...
Thread 9 crashed with ARM Thread State (64-bit):
x0: 0x0000000283309360 x1: 0x0000000000000000 x2: 0x0000000000000000 x3: 0x00000002833093c0
x4: 0x00000002833093c0 x5: 0x0000000101737740 x6: 0x0000000000000013 x7: 0x00000000ffffffff
x8: 0x0000000283309360 x9: 0x3c788942d067009a x10: 0x0000000101547000 x11: 0x0000000000000000
x12: 0x00000000000007fb x13: 0x00000000000007fd x14: 0x000000001ee24020 x15: 0x0000000000000020
x16: 0x0000b1037f729360 x17: 0x000000037f729360 x18: 0x0000000000000000 x19: 0x0000000000000000
x20: 0x00000001016a8de8 x21: 0x0000000283e21d00 x22: 0x0000000283b3f1f8 x23: 0x0000000283098000
x24: 0x00000001bfb4fc35 x25: 0x00000001bfb4fc43 x26: 0x000000028033a688 x27: 0x0000000280c93090
x28: 0x0000000000000000 fp: 0x000000016fc86490 lr: 0x00000001b2da9d80
sp: 0x000000016fc863e0 pc: 0x000000019eeff248 cpsr: 0x1000
esr: 0x92000006 (Data Abort) byte read Translation fault