How to Use Tap Gesture in Accessibility in Swift

Is it possible to assign an accessibility action to a UILabel?

In your UILabel subclass, override accessibilityActivate() and implement whatever double-tapping should do:

override func accessibilityActivate() -> Bool {
    // do things...
    return true
}

If the action can fail, return false in those instances.
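
For context, a minimal sketch of such a subclass might look like the following; the TappableLabel name and the onActivate handler are illustrative, not part of the original answer:

import UIKit

/// A label whose VoiceOver double-tap runs a caller-supplied action.
class TappableLabel: UILabel {

    /// Hypothetical handler; return true on success, false on failure.
    var onActivate: (() -> Bool)?

    override func accessibilityActivate() -> Bool {
        // Returning false lets VoiceOver signal that the action failed.
        return onActivate?() ?? false
    }
}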

Registering UIGestureRecognizer actions in VoiceOver

Figured out how this works, and I think it's worth sharing:

After looking around at the UIView classes, as well as UIAccessibilityTraits, all UIViews and their subclasses have a bitmask of UIAccessibilityTraits which can be used to:
- Designate the standard behavior of the UIView or any class adopting the UIAccessibility protocol.
- Allow various configurations such as refresh speed, enabling slider-type behavior, etc.

Here is a link to all the available traits:
https://developer.apple.com/reference/uikit/uiaccessibility/accessibility_traits

For my specific case, I used UIAccessibilityTraitAllowsDirectInteraction and UIAccessibilityTraitUpdatesFrequently. All UIGestureRecognizers are then properly registered, whether swipes, taps, pinches, etc.

They need to be specified as a bitmask, so if you need an element to have these properties, type in:

myView.accessibilityTraits = UIAccessibilityTraitAllowsDirectInteraction | UIAccessibilityTraitUpdatesFrequently

I compiled and ran the app: VoiceOver announced what the view is, and swiping registered properly without that "bonk" sound you get when something is wrong.
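
As an aside, on newer SDKs (iOS 12 / Swift 4.2 and later) the traits are bridged as an option set, so the same assignment would look roughly like this:

import UIKit

let myView = UIView()
// Option-set spelling of the same trait combination on newer SDKs.
myView.accessibilityTraits = [.allowsDirectInteraction, .updatesFrequently]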

Hope this helps those who also wondered how to get this to work. This approach might conflict slightly with custom views, depending on how you want sighted or visually impaired users to experience the UI. However, I find it more elegant than creating a whole separate set of UI elements just to accommodate VoiceOver usage: the interaction method stays the same, and there's no need to write code to hint at or explain what to do.

How can I override VoiceOver gestures in iOS?

VoiceOver's direct interaction model supports gesture recognizers. What you're observing is a conflict with the map's gesture handling. Given the complexity of the map view and its touch handling, I'd encourage taking one of two alternative approaches. In both cases, you'll likely want to overlay a transparent UIView atop the map view.

  1. Attach any gesture recognizers to this custom view. Users may trigger shortcuts via direct interaction. You may want to condition this on VoiceOver running.
  2. Sidestep direct interaction entirely and implement your shortcuts as custom actions on the map or overlay view (see the sketch below). This will likely benefit users of other accessibility features, not just VoiceOver.
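
A rough sketch of the second approach, assuming a transparent overlay view and hypothetical shortcut handlers (the names and actions here are illustrative):

import UIKit
import MapKit

class MapShortcutsViewController: UIViewController {

    let mapView = MKMapView()
    let overlayView = UIView()   // transparent view sitting on top of the map

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        overlayView.frame = view.bounds
        overlayView.backgroundColor = .clear
        view.addSubview(mapView)
        view.addSubview(overlayView)

        // Expose the shortcuts as VoiceOver custom actions on the overlay.
        overlayView.isAccessibilityElement = true
        overlayView.accessibilityLabel = "Map"
        overlayView.accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Zoom to current location",
                                        target: self,
                                        selector: #selector(zoomToCurrentLocation)),
            UIAccessibilityCustomAction(name: "Drop pin",
                                        target: self,
                                        selector: #selector(dropPin))
        ]
    }

    // Hypothetical shortcut handlers; return true if the action succeeded.
    @objc func zoomToCurrentLocation() -> Bool {
        // ... center the map on the user's location ...
        return true
    }

    @objc func dropPin() -> Bool {
        // ... add an annotation at the map center ...
        return true
    }
}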

Handle a tap from VoiceOver on a custom view

So... This was a Xamarin problem after all.

To make sure VoiceOver works with custom controls, you have to conform to the UIAccessibilityAction informal protocol. In my case it was accessibilityActivate().

But since I am using Xamarin, I needed to export the method, to make sure that the runtime could see that I implemented a method from the informal protocol.

[Export("accessibilityActivate")]
public bool AccessiblityActivate() {
Console.WriteLine("It works!");
return true;
}

`UIPanGestureRecognizer` not accessible to users who are using VoiceOver in iOS

VoiceOver users can perform a pan by prefixing it with the "pass-through" gesture (a one-finger double-tap and hold before continuing with the gesture). You may still want to offer an alternative way to operate the control. One approach might be to adopt the UIAccessibilityTraitAdjustable trait and override accessibilityIncrement() and accessibilityDecrement(), as sketched below.
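
A minimal sketch of such an adjustable control, assuming a hypothetical integer value that the pan gesture normally changes:

import UIKit

/// A pan-driven control made adjustable for VoiceOver users.
class PanAdjustableControl: UIControl {

    /// Hypothetical value that the pan gesture normally changes.
    var value: Int = 0 {
        didSet { accessibilityValue = "\(value)" }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityTraits = .adjustable  // UIAccessibilityTraitAdjustable on older SDKs
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isAccessibilityElement = true
        accessibilityTraits = .adjustable
    }

    // VoiceOver swipe up while the control is focused
    override func accessibilityIncrement() {
        value += 1
        sendActions(for: .valueChanged)
    }

    // VoiceOver swipe down while the control is focused
    override func accessibilityDecrement() {
        value -= 1
        sendActions(for: .valueChanged)
    }
}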


