Wednesday, July 25, 2012

Jelly Bean: Accessibility Gestures Explained

1 Jelly Bean: Accessibility Gestures Explained

This article details accessibility gestures in Jelly Bean and is a follow-up to Jelly Bean Accessibility Explained. It gives a conceptual overview of the 16 possible gestures and describes how they are used. The interaction behavior described here holds for all aspects of the Android user interface, including interactive Web pages within Chrome and the Android Web Browser.

1.1 Conceptual Overview Of The Gestures

Placing a finger on the screen places Accessibility Focus on the item under the finger and speaks that item. Moving the finger triggers touch exploration, which moves Accessibility Focus.

To generate any of the Accessibility Gestures discussed below, one moves the finger much faster — how much faster is something we will tune over time and, if needed, make user-customizable.

To remember the gestures, think of the four directions: Up, Down, Left, and Right. In addition to these four basic navigational gestures, we define 12 additional gestures by picking pairwise combinations of these directional flicks, e.g., Left then Down, for a total of 16 possible gestures. In what follows, Left then Down means flick left, then continue with a downward flick. Note that in performing these combined gestures, speed matters most: it is not essential that you trace a perfect capital L when performing the Down then Right gesture. Maintaining speed throughout the gesture, and ensuring that the finger moves some distance in each direction, is key to avoiding these being misinterpreted as basic navigational gestures.
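The gesture count above can be sketched in a few lines of plain Java; this is an illustration of the arithmetic only, not the Android API, and the gesture names are just strings:

```java
import java.util.ArrayList;
import java.util.List;

public class GestureSpace {
    static final String[] DIRECTIONS = {"Up", "Down", "Left", "Right"};

    // The 4 basic flicks, plus the 12 ordered pairs of distinct directions.
    public static List<String> allGestures() {
        List<String> gestures = new ArrayList<>();
        for (String d : DIRECTIONS) {
            gestures.add(d); // basic navigational flick
        }
        for (String first : DIRECTIONS) {
            for (String second : DIRECTIONS) {
                if (!first.equals(second)) {
                    gestures.add(first + " then " + second); // combined gesture
                }
            }
        }
        return gestures;
    }

    public static void main(String[] args) {
        System.out.println(allGestures().size() + " gestures"); // 16 gestures
    }
}
```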

1.2 Accessibility Focus And Accessibility Gestures

Accessibility Focus is moved using the four basic directional gestures. For now we have aliased Left with Up, and Down with Right; i.e., both Left and Up move to the previous item, whereas Down and Right move to the next item. Note that this is not the same as moving with a physical D-Pad or keyboard on Android; the Android platform moves System Focus in response to the D-Pad. Thus, moving with a D-Pad or trackball moves you through the various interactive controls on the screen; moving Accessibility Focus via the Accessibility Gestures moves you through everything on the screen.
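A minimal sketch of that aliasing (plain Java, not the Android API; the method name is hypothetical):

```java
public class LinearNavigation {
    // Left and Up both move Accessibility Focus to the previous item;
    // Down and Right both move it to the next item.
    public static int step(String gesture) {
        switch (gesture) {
            case "Left":
            case "Up":
                return -1; // previous item
            case "Down":
            case "Right":
                return +1; // next item
            default:
                throw new IllegalArgumentException("not a basic flick: " + gesture);
        }
    }
}
```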

1.3 Accessibility Gestures For Common Actions

In addition to the basic navigation described above, we define the following gestures for common actions:

Navigation Granularity
You can increase or decrease navigation granularity by rapidly stroking Up then Down or Down then Up.
Scrolling Lists
You can scroll a list forward by a screenful by rapidly stroking Right then Left; the reverse, i.e., Left then Right, scrolls the list backward by a screenful.
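As a sketch of the granularity gesture, assuming an illustrative, cyclic list of levels (the actual levels, and whether they wrap around, are TalkBack details that may differ):

```java
public class Granularity {
    // Illustrative granularity levels; the real set in TalkBack may differ.
    static final String[] LEVELS = {"default", "character", "word", "paragraph"};

    // Up then Down increases granularity, Down then Up decreases it;
    // this sketch assumes the levels cycle at either end.
    public static int adjust(int index, boolean increase) {
        int delta = increase ? 1 : -1;
        return Math.floorMod(index + delta, LEVELS.length);
    }
}
```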

1.4 User Configurable Gestures

Gestures Down then Left, Up then Left, Down then Right and Up then Right are user configurable; their default assignments are shown below.

Back
Gesture Down then Left is the same as pressing the Back button.
Home
Gesture Up then Left has the same effect as pressing the Home button.
Status Bar
Gesture Up then Right opens Status Notifications.
Recent Applications
Gesture Down then Right has the same effect as pressing the Recent Applications button.
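Collecting the default assignments described in this post into one table, as a plain-Java sketch (the gesture and action names are this post's descriptions, not API constants):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class GestureDefaults {
    // The 12 gestures with default assignments; 4 of the 16 remain unassigned.
    public static Map<String, String> defaults() {
        Map<String, String> map = new LinkedHashMap<>();
        map.put("Up", "previous item");
        map.put("Left", "previous item");
        map.put("Down", "next item");
        map.put("Right", "next item");
        map.put("Up then Down", "change navigation granularity");
        map.put("Down then Up", "change navigation granularity");
        map.put("Right then Left", "scroll list forward");
        map.put("Left then Right", "scroll list backward");
        map.put("Down then Left", "Back");                     // user configurable
        map.put("Up then Left", "Home");                       // user configurable
        map.put("Up then Right", "open Status Notifications"); // user configurable
        map.put("Down then Right", "Recent Applications");     // user configurable
        return map;
    }
}
```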

1.5 Summary

Gestures for manipulating and working with Accessibility Focus are an evolving part of Android Accessibility; we will continue to refine these based on user experience. At this point, you are probably saying:

But wait, you said 16 gestures but only told us the meanings of 12 of them!

You are correct — we have left ourselves some gestures to use for future features.

Tuesday, July 24, 2012

Jelly Bean Accessibility Explained: Touch Exploration Augmented By Gestures

1 Jelly Bean Accessibility Explained: Touch Exploration Augmented By Gestures

We announced a number of accessibility enhancements in Android Jelly Bean — see our Google I/O 2012 announcements and our Android Accessibility talk from I/O 2012. This article gives a user-centric overview of the Jelly Bean interaction model as enabled by touch exploration and navigational gestures. Note that as with every release, Android Accessibility continues to evolve, and so as before, what we have today is by no means the final word.

1.1 High-Level Concepts

First, here's some shared vocabulary to ensure that we're all talking about the same thing when explaining Jelly Bean access:

Random Access
Enable the user to reach any part of the on-screen UI with equal ease. We enabled this as of ICS with touch exploration.
Deterministic Access
Enable the user to reliably land on a desired item on the screen. We enable this in Jelly Bean with linear navigation.
Accessibility Focus
The item that the user most recently interacted with, either via touch exploration or linear navigation, receives accessibility focus.
The user can activate the item having accessibility focus by double-tapping anywhere on the screen.
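The vocabulary above can be captured in a minimal interaction sketch (plain Java, not the Android API; the class, method, and item names are hypothetical):

```java
public class AccessibilityFocusModel {
    private String focused;       // item currently holding accessibility focus
    private String lastActivated;

    // Random access: touch exploration focuses the item under the finger.
    public void touchExplore(String item) { focused = item; }

    // Deterministic access: linear navigation moves focus to a known item.
    public void linearNavigateTo(String item) { focused = item; }

    // Double-tapping anywhere activates whatever currently has focus.
    public boolean doubleTap() {
        if (focused == null) return false; // nothing focused yet
        lastActivated = focused;
        return true;
    }

    public String lastActivated() { return lastActivated; }
}
```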

1.2 Sample Interaction Scenarios We Enable

The combination of random access via touch exploration, backed up by linear navigation starting from the point the user just explored, enables users to:

  • Touch explore an application to understand its screen layout,
  • Use muscle memory to quickly touch parts of the display to access familiar application screens,
  • Use linear navigation to reach the desired item when muscle memory is wrong by a small amount.

As an example, when using the Google Play Store, I can use muscle memory with touch exploration to find the Search button in the top action bar. Having found an application to install, I can once again use muscle memory to touch roughly in the vicinity of the Install button; if what I touch is not the Install button, I can typically find it with one or two linear navigation steps. Having found the Install button, I can double-tap anywhere on the screen to activate it.

The same use case in ICS, where we lacked Accessibility Focus and linear navigation, would have forced me to use touch exploration exclusively. Where muscle memory worked perfectly, this form of interaction was highly effective; but in our experience, it also tended to lead to breakdowns and consequent user frustration when users almost found the control they were looking for but missed by a small amount.

Having introduced accessibility focus and linear navigation in Jelly Bean, we decided to eliminate the ICS requirement that the user tap on or near a control to activate it; we now enable users to activate the item with accessibility focus by tapping anywhere on the screen. To eliminate spurious taps, especially on tablets, we made this a double-tap rather than a single tap. Note: based on user experience, we may bring back single tap at some point in the future as an end-user customization.

1.3 Summary

Android Accessibility continues to move forward with Jelly Bean, and will continue to evolve rapidly along with the platform. Please use the Eyes-Free Google Group to provide constructive feedback on what works or doesn't work for you; what is most effective is to objectively describe a given use case and your particular experience.

Friday, June 29, 2012

What's New In Google Accessibility From Google I/O 2012

1 Google I/O 2012: What's New From Google Access

We showcased a number of exciting advances in accessibility on Android and Chrome during I/O 2012. With these advances, blind and low-vision users can leverage Google applications and services on Android and Chrome to collaborate effectively with their peers and to obtain on-the-go access. Advances include out-of-the-box access on Android (JellyBean), a new set of gestures that enable fluent interaction on touch-screen devices, Braille support on Android, an extension framework for ChromeVox, and a new, high-quality voice for use with Web applications on Chrome.

1.1 Enhanced Android Accessibility In JellyBean

  • Accessibility on Android can be enabled by long-pressing on the setup screen with two fingers (for 4 seconds), providing out-of-the-box access for blind users.
  • Touch exploration has been enhanced with simple gestures that enable users to navigate on-screen content.
  • JellyBean provides a set of Accessibility Actions that can be called from any AccessibilityService such as TalkBack; it also provides early support for Braille displays.
  • Touch exploration and gesture navigation both set AccessibilityFocus; double-tapping anywhere on the screen activates the item with AccessibilityFocus.

  • TalkBack now has a sister service BrailleBack for providing Braille support on Android.
  • Chrome on Android is now accessible and supports the latest Web accessibility standards.

With these enhancements in Android access, blind users can use a combination of touch exploration and navigational gestures to access any part of the Android user interface.

As an example, I typically use the Android Play Store by touching the screen around the area where I expect a specific control; quick flicks of the finger then immediately get me to the item I want. With these touch gestures in place, I now use touch exploration to learn the layout of an application; with applications that I use often, I use a combination of muscle memory and gesture navigation for rapid task completion.

1.2 Chrome OS On Chromebooks And Chromeboxes

Chrome OS comes with ChromeVox pre-configured — ChromeVox is our Web Accessibility solution for blind users. With the new high-quality voice that is being released on the Chrome Web Store, ChromeVox now provides smooth spoken feedback in Chrome on all desktop environments. Finally, we announced a flexible extension framework that enables Web developers to leverage ChromeVox from within and outside of their own Web applications to provide rich, contextual spoken feedback.

1.3 Developer Tools For Ensuring Accessibility

To help developers better leverage Web Accessibility, we are releasing a new Accessibility Audit tool that enables Web developers to detect and fix commonly occurring accessibility errors. This tool has been integrated into Chrome's Developer Tools and helps Web developers ensure accessibility while working within their normal workflow.

1.4 Accessibility Related Presentations At Google I/O 2012

Catch these on YouTube in the next week if you weren't able to attend I/O this week.

  • Android Accessibility (T. V. Raman, Peter Lundblad, Alan Viverette and Charles Chen).
  • Advanced Web Accessibility (Rachel Shearer, Dominic Mazzoni and Charles Chen).
  • What's New In JellyBean: Android Team.
  • JellyBean announcement in the Wednesday keynote.

Date: 2012-06-21 Thu

Author: T. V. Raman
