Accessibility Custom Actions Aren't Announced in Swift

I thought that by adding custom actions to my object, VoiceOver would automatically announce "Actions available" or "Swipe up to select..."
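
For context, here is a minimal Swift sketch of the kind of setup the question describes; the cell class, labels, and action names are illustrative, not taken from the original post:

```swift
import UIKit

final class MessageCell: UITableViewCell {

    // Attach custom actions to an accessible element. VoiceOver offers them
    // through the actions rotor and, historically, announced "Actions available"
    // when the element was focused.
    func configureAccessibility() {
        isAccessibilityElement = true
        accessibilityLabel = "Message from Anna"

        let reply = UIAccessibilityCustomAction(name: "Reply",
                                                target: self,
                                                selector: #selector(replyAction))
        let delete = UIAccessibilityCustomAction(name: "Delete",
                                                 target: self,
                                                 selector: #selector(deleteAction))
        accessibilityCustomActions = [reply, delete]
    }

    @objc private func replyAction() -> Bool {
        // Perform the reply here; return true if the action succeeded.
        return true
    }

    @objc private func deleteAction() -> Bool {
        // Perform the deletion here.
        return true
    }
}
```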

And you're right, it should read out these announcements, but a strange new behavior introduced in iOS 13 makes them much rarer.

"iOS 13 introduced a new custom actions behavior: the "actions available" announcement isn't always present anymore.

It was previously offered to every element containing custom actions but, now, it will occur when you navigate to another element that contains a different set of actions.

The purpose is to prevent repetitive announcements on elements where the same actions are present as the previous element"
⇒ source

Nothing can be done to undo this behavior, which is anything but helpful for VoiceOver users.

Moreover, if you do the same in iOS 12, it works perfectly: custom actions are present and announced every time they are implemented for an accessible element.

Unfortunately, this is a new iOS 13 behavior explained by Apple support itself (accessibility@apple.com) but with no public presentation (WWDC, Apple website...), which is astonishing for such a huge modification that is more a problem than a meaningful evolution for visually impaired people using VoiceOver.

No solutions are suggested by the support, so we'll have to deal with this unless a rollback is announced at the next WWDC... light a candle.

⚠️ ⬛️◼️▪️ EDIT ▪️◼️⬛️ ⚠️ (2020/03/19)

I filed a Developer Technical Support Incident (no. 730330678) for this problem, and here's the answer from Apple:

There is no published information.

We intentionally made changes in iOS 13, so that we would only speak actions available if the list of actions had changed from the previous element you were on, or you moved to a different container.

You can do a flash manipulation of the list or bounce quickly between containers but this should just work without code changes.

Unfortunately in currently shipping systems it is a bug.

We are improving our documentation as well so please stay tuned.
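
Apple's "flash manipulation" hint isn't documented anywhere, but one possible reading of it is to briefly clear and reassign the element's custom actions so that the list looks "changed" to the system. The sketch below is purely an assumption based on the wording of that reply, not a confirmed workaround:

```swift
import UIKit

extension UIView {
    // Hypothetical interpretation of the "flash manipulation" hint:
    // momentarily clear the custom actions, then restore them, so VoiceOver
    // treats the list as different and repeats the "Actions available" prompt.
    func flashCustomActions() {
        let actions = accessibilityCustomActions
        accessibilityCustomActions = nil
        DispatchQueue.main.async {
            self.accessibilityCustomActions = actions
            // Nudge VoiceOver to re-evaluate the element.
            UIAccessibility.post(notification: .layoutChanged, argument: self)
        }
    }
}
```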

I submitted a bug report entitled "VoiceOver doesn't read out the custom actions anymore" with the reference FB7426771.

Now you know why Accessibility Custom Actions aren't announced in Swift... only sometimes, in iOS 13.

iOS VoiceOver functionality changes with Bundle Identifier

I have spent days beating my head on this one. I purchased a new device and the issue persisted even without transferring my apps or settings.

I got an answer from an Apple Frameworks Engineer, who said the Screen Recognition setting is saved at the system level. My guess is that Screen Recognition is saved in iCloud or some other synchronized location, since the setting persisted on a new device installation. Here is the response from Apple in case anyone runs into this issue:
https://developer.apple.com/forums/thread/698009

It sounds to me like you may have inadvertently enabled Screen Recognition for your app. We store this setting by bundle identifier on the system. You can use the rotor (rotate 2 fingers on the screen like you are rotating a dial) to get to Screen Recognition, and then swipe up or down to toggle it on or off. This feature uses machine learning models to attempt to make your app accessible rather than relying on the view hierarchy, which would be why the properties you are setting aren't being respected. If you toggle this off, it'll go back to reading from the view hierarchy.

This setting is difficult to turn on or off, and it is only available in-app while VoiceOver is on. Despite being an accessibility feature, it does not announce how to adjust itself: there is simply a dial which disappears quickly after a setting is selected. There is no indication that the user should swipe up or down, and the swipe must happen within about one second after you lift the two fingers used to find the setting on the dial. The user's fingers are likely in an awkward position from turning the dial, so it may take a few attempts to change the setting. Anyway, I hope this saves someone days of debugging.

How do I add `accessibilityLabel` to `UIAlertView` buttons?

The only way you can do this is by finding the UIButtons within the UIAlertView's subviews:

// Walk the alert view's subviews and relabel any buttons found.
for (UIView *subview in self.alertView.subviews) {
    if ([subview isKindOfClass:[UIButton class]]) {
        ((UIButton *)subview).accessibilityLabel = @"your custom text";
    }
}

However, while this is the only way to do it, there is no public API to access these UIButtons, because Apple doesn't want you to reach into them. Accessing the internal views of the UIAlertView class is something Apple does not allow, and it is likely to get your app rejected during the App Store review process.

If you really need UIButtons with a custom accessibilityLabel, you should look into designing a custom alert view instead of using Apple's UIAlertView class.
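
To illustrate that suggestion, here is a minimal sketch of a hand-rolled alert where you own the buttons and can set any accessibilityLabel you like. The class name, texts, and layout are made up for the example:

```swift
import UIKit

// Hypothetical minimal replacement for an alert: because we create the
// button ourselves, its accessibilityLabel is fully under our control.
final class CustomAlertViewController: UIViewController {

    private let messageLabel = UILabel()
    private let okButton = UIButton(type: .system)

    var onConfirm: (() -> Void)?

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground

        messageLabel.text = "Delete this item?"
        messageLabel.numberOfLines = 0

        okButton.setTitle("OK", for: .normal)
        okButton.accessibilityLabel = "Confirm deletion" // the custom text
        okButton.addTarget(self, action: #selector(confirmTapped), for: .touchUpInside)

        let stack = UIStackView(arrangedSubviews: [messageLabel, okButton])
        stack.axis = .vertical
        stack.spacing = 16
        stack.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(stack)

        NSLayoutConstraint.activate([
            stack.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            stack.centerYAnchor.constraint(equalTo: view.centerYAnchor),
            stack.leadingAnchor.constraint(greaterThanOrEqualTo: view.leadingAnchor, constant: 24),
            view.trailingAnchor.constraint(greaterThanOrEqualTo: stack.trailingAnchor, constant: 24)
        ])
    }

    @objc private func confirmTapped() {
        onConfirm?()
        dismiss(animated: true)
    }
}
```

Present it modally (for example with modalPresentationStyle = .formSheet) wherever you would previously have shown the UIAlertView.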


