Thursday, September 26, 2013

JustSpeak: Control All Aspects Of Android By Voice


1 JustSpeak: Controlling Android By Voice

JustSpeak is an Android Accessibility Service that enables voice control of your Android device. Once enabled, you can activate on-screen controls, launch installed applications, and trigger other commonly used Android actions using spoken commands.

1.1 Starting JustSpeak

Once installed, you can start JustSpeak under Settings->Accessibility on your Android device. You only need to do this once; JustSpeak restarts automatically when you reboot your phone.

JustSpeak is an Accessibility Service: once started, it uses Accessibility APIs on the platform to augment Android's user interface. JustSpeak augments the Android user interface with voice-input control; other Accessibility Services like TalkBack provide spoken feedback. Note that JustSpeak can be used either by itself, or in conjunction with other platform Accessibility Services such as TalkBack.

1.2 Initiating Voice Commands

Once started, JustSpeak registers itself as an Assistant on your device; you can initiate voice commands by swiping up from the bottom of the screen. Note that this is the same gesture that activates Google Now on devices running the stock version of Android; when using JustSpeak, you can still get to Google Now by saying Google. Successful invocation of voice control produces an auditory tone, along with an optional vibration on devices that support it; this is your cue to start speaking. JustSpeak also displays a visual overlay on the right edge of the screen to indicate that it is listening. The recognized utterance is displayed visually, and is additionally spoken if TalkBack is active. JustSpeak supports two types of voice commands, as described below.

1.3 Global Voice Commands

JustSpeak supports a set of global commands — these are global in that they are available on any screen. Global commands include:

| Command          | Utterance                    | Action                        |
|------------------|------------------------------|-------------------------------|
| Open             | Open <Installed Application> | Launch application            |
| Recent           | Recent                       | Show recent applications      |
| Quick Settings   | Quick Settings               | Launch Quick Settings         |
| Toggle WiFi      | Switch WiFi (On/Off)         | Toggle WiFi                   |
| Toggle Bluetooth | Switch Bluetooth (On/Off)    | Toggle Bluetooth              |
| Toggle Tethering | Switch Tethering (On/Off)    | Toggle Tethering              |
| Home             | Go Home                      | Return to home screen         |
| Back             | Go Back                      | Return to previous screen     |
| Notifications    | Open Notifications           | Pull down notifications shade |
Note that voice commands are flexible: you can use a number of synonyms for verbs such as open, e.g., launch. In addition, voice commands can be formulated as full sentences, e.g., "Please open GMail".

List Of Synonyms For Open:

  • open
  • go to
  • show
  • display
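To make the synonym handling concrete, here is a minimal, hypothetical sketch in plain Java of how an utterance might be normalized before dispatch. This is not the actual JustSpeak source; the class name, the canonical "open:" form, and the filler-word list are illustrative assumptions.

```java
import java.util.*;

// Hypothetical sketch of voice-command normalization: map the documented
// synonyms of "open" to one canonical verb, and drop polite filler so that
// "Please open GMail" and "launch GMail" dispatch identically.
public class CommandParser {
    // Synonyms listed in the post, plus "launch", all mapped to "open".
    private static final Set<String> OPEN_SYNONYMS =
            new HashSet<>(Arrays.asList("open", "go to", "show", "display", "launch"));
    // Assumed filler words stripped before matching.
    private static final Set<String> FILLER =
            new HashSet<>(Arrays.asList("please", "now"));

    /** Returns e.g. "open:gmail" for "Please launch GMail", or null if no verb matches. */
    public static String parse(String utterance) {
        List<String> words = new ArrayList<>();
        for (String w : utterance.toLowerCase().split("\\s+")) {
            if (!FILLER.contains(w)) words.add(w);
        }
        // Try two-word verbs ("go to") before single-word verbs.
        for (int len = 2; len >= 1; len--) {
            if (words.size() > len) {
                String verb = String.join(" ", words.subList(0, len));
                if (OPEN_SYNONYMS.contains(verb)) {
                    return "open:" + String.join(" ", words.subList(len, words.size()));
                }
            }
        }
        return null; // no recognized verb
    }
}
```

A real implementation would also need to match the remaining words against installed application labels.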

In addition, JustSpeak provides the following spoken aliases as a means of triggering commonly used applications:

| Utterance    | Action                      |
|--------------|-----------------------------|
| Browser      | Launch default Web browser  |
| Web          | Launch default Web browser  |
| OK Google Now | Launch Google Now          |
| Search       | Launch Voice Search         |
| Voice Search | Launch Voice Search         |

1.4 Local Voice Commands

In addition to the global voice commands that are available on any screen, JustSpeak lets you activate on-screen controls in a variety of ways.

| Command  | Utterance             | Action                                 |
|----------|-----------------------|----------------------------------------|
| Activate | Click <control name>  | Activate control by its on-screen name |
| Toggle   | Turn on/off <toggle>  | Toggle on/off switch                   |
| Scroll   | Scroll Up/Down        | Scroll, e.g., lists                    |
List Of Synonyms For Activate:
  • touch
  • click on
  • press
  • activate
  • open
  • push
  • tap
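A minimal sketch, in plain Java, of how a spoken control name might be matched against the labels of on-screen controls. This is illustrative only, not the JustSpeak implementation; in the real service these labels would come from the accessibility node tree.

```java
import java.util.*;

// Hypothetical matcher for local commands such as "Click WiFi": try an
// exact (case-insensitive) label match first, then fall back to a
// substring match so partial names still resolve.
public class ControlMatcher {
    public static String findControl(String spokenName, List<String> onScreenLabels) {
        for (String label : onScreenLabels) {
            if (label.equalsIgnoreCase(spokenName)) return label;  // exact match wins
        }
        for (String label : onScreenLabels) {
            if (label.toLowerCase().contains(spokenName.toLowerCase())) return label;
        }
        return null; // nothing on screen matches
    }
}
```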

1.5 Chaining Commands

Note that you can issue a sequence of commands via a single utterance by using simple connectives such as then or and when speaking; as an example, you can say:

Quick Settings then WiFi

to open "Quick Settings" and then click the "WiFi" item that appears within quick settings. This is an example of chaining together a global command followed by a local command that becomes available after the global command has executed successfully. Similarly, you can say:

Open GMail, then compose

to launch GMail, and immediately start composing a new message.
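The chaining behavior described above can be sketched as a simple splitter that breaks one utterance into an ordered queue of commands on the connectives. This is an illustrative assumption about the approach, not JustSpeak's actual code.

```java
import java.util.*;

// Hypothetical sketch of command chaining: split a single utterance on
// the connectives "then" and "and", yielding commands to run in order.
public class CommandChainer {
    public static List<String> split(String utterance) {
        List<String> commands = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (String word : utterance.toLowerCase().replace(",", "").split("\\s+")) {
            if (word.equals("then") || word.equals("and")) {
                if (current.length() > 0) {          // close out the current command
                    commands.add(current.toString().trim());
                    current.setLength(0);
                }
            } else {
                current.append(word).append(' ');
            }
        }
        if (current.length() > 0) commands.add(current.toString().trim());
        return commands;
    }
}
```

Each resulting command would then be dispatched only after the previous one completes, since later commands may only become valid on the new screen.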

1.6 Alternative Means Of Triggering JustSpeak

We are continuing to experiment with alternative ways of initiating voice control. Toward this end, JustSpeak enables you to configure an NFC tag that can then be tapped to initiate voice control; see JustSpeak->Settings for configuring an NFC tag. On Android 4.3 and above, you can also trigger JustSpeak by long-pressing both volume buttons at the same time.

Date: 2013-08-06 Tue

Author: T.V Raman

Org version 7.9.3f with Emacs version 24


Wednesday, July 25, 2012

Jelly Bean: Accessibility Gestures Explained


1 Jelly Bean: Accessibility Gestures Explained

This article details accessibility gestures in Jelly Bean and is a follow-up to Jelly Bean Accessibility Explained. It gives a conceptual overview of the 16 possible gestures and describes how they are used. The interaction behavior described here holds for all aspects of the Android user interface, including interactive Web pages within Chrome and the Android Web Browser.

1.1 Conceptual Overview Of The Gestures

Placing a finger on the screen speaks the item under the finger by first placing Accessibility Focus on that item. Moving the finger triggers touch exploration which moves Accessibility Focus.

To generate any of the Accessibility Gestures discussed below, one moves the finger much faster — how much faster is something we will tune over time, and if needed, make user customizable.

To remember the gestures, think of the four directions: Up, Down, Left and Right. In addition to these four basic navigational gestures, we defined an additional 12 gestures by picking pairwise combinations of these directional flicks, e.g., Left then Down; this gives a total of 16 possible gestures. In what follows, Left then Down means swipe left, then continue with a down flick. Note that in performing these additional gestures, speed matters the most. As an example, it is not essential that you trace a perfect capital L when performing the Down then Right gesture; maintaining speed throughout the gesture, and ensuring that the finger moves some distance in each direction, is key to avoiding these being misinterpreted as basic navigational gestures.
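The arithmetic above (4 directions, plus every ordered pair of two different directions) can be checked with a short enumeration. This is purely illustrative, not code from TalkBack or the platform.

```java
import java.util.*;

// Enumerate the 16-gesture space: 4 single flicks plus the 12 ordered
// pairs of two *different* directions (4 x 3 = 12), e.g. "Left then Down".
public class GestureSpace {
    static final String[] DIRECTIONS = {"Up", "Down", "Left", "Right"};

    public static List<String> allGestures() {
        List<String> gestures = new ArrayList<>();
        for (String d : DIRECTIONS) gestures.add(d);            // 4 basic flicks
        for (String first : DIRECTIONS)
            for (String second : DIRECTIONS)
                if (!first.equals(second))                      // 12 two-part gestures
                    gestures.add(first + " then " + second);
        return gestures;
    }
}
```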

1.2 Accessibility Focus And Accessibility Gestures

Accessibility Focus is moved using the four basic directional gestures. For now we have aliased Left with Up, and Down with Right; i.e., both Left and Up move to the previous item, whereas Down and Right move to the next item. Note that this is not the same as moving with a physical D-Pad or keyboard on Android; the Android platform moves System Focus in response to the D-Pad. Thus, moving with a D-Pad or trackball moves you through the various interactive controls on the screen; moving Accessibility Focus via the Accessibility Gestures moves you through everything on the screen.
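The aliasing described above can be sketched as a tiny mapping from flick direction to focus movement. The class name and the index-based model of "everything on the screen" are illustrative assumptions.

```java
// Sketch of the directional aliasing: Left and Up both move Accessibility
// Focus to the previous item; Right and Down both move to the next item.
public class FocusMover {
    /** Returns the new focus index within a list of itemCount on-screen items. */
    public static int move(String flick, int index, int itemCount) {
        int delta;
        switch (flick) {
            case "Left": case "Up":    delta = -1; break;  // previous item
            case "Right": case "Down": delta = +1; break;  // next item
            default: return index;                         // not a navigation flick
        }
        // Clamp to the ends of the item list.
        return Math.max(0, Math.min(itemCount - 1, index + delta));
    }
}
```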

1.3 Accessibility Gestures For Common Actions

In addition to the basic navigation described above, we define the following gestures for common actions:

Navigation Granularity
You can increase or decrease navigation granularity by rapidly stroking Up then Down or Down then Up.
Scrolling Lists
You can scroll a list forward by rapidly stroking Right then Left; the reverse, i.e., Left then Right scrolls the list backward by a screenful.

1.4 User Configurable Gestures

Gestures Down then Left, Up then Left, Down then Right and Up then Right are user configurable; their default assignments are shown below.

Back
Gesture Down then Left is the same as pressing the Back button.
Home
Up then Left has the same effect as pressing the Home button.
Status Bar
Gesture Up then Right opens Status Notifications.
Recent
Down then Right has the same effect as pressing the Recent Applications button.

1.5 Summary

Gestures for manipulating and working with Accessibility Focus are an evolving part of the Android Accessibility; we will continue to refine these based on user experience. At this point, you are probably saying:

But wait, you said 16 gestures, but only told us the meanings of 12 of them

You are correct — we have left ourselves some gestures to use for future features.

Tuesday, July 24, 2012

Jelly Bean Accessibility Explained: Touch Exploration Augmented By Gestures


1 Jelly Bean Accessibility Explained: Touch Exploration Augmented By Gestures

We announced a number of accessibility enhancements in Android Jelly Bean — see our Google IO 2012 announcements and our Android Accessibility talk from I/O 2012. This article gives a user-centric overview of the Jelly Bean interaction model as enabled by touch exploration and navigational gestures. Note that as with every release, Android Accessibility continues to evolve, and so as before, what we have today is by no means the final word.

1.1 High-Level Concepts

First, here's some shared vocabulary to ensure that we're all talking about the same thing when explaining Jelly Bean access:

Random Access
Enables the user to reach any part of the on-screen UI with equal ease. We enabled this as of ICS with touch exploration.
Deterministic Access
Enables the user to reliably land on a desired item on the screen. We enable this in Jelly Bean with linear navigation.
Accessibility Focus
The item that the user most recently interacted with — via either touch exploration or linear navigation — receives accessibility focus.
Activation
The user can activate the item that has accessibility focus by double-tapping anywhere on the screen.

1.2 Sample Interaction Scenarios We Enable

The combination of random access via touch exploration, backed up by linear navigation starting from the point the user just explored, enables users to:

  • Touch explore an application to understand its screen layout,
  • Use muscle memory to quickly touch parts of the display to access familiar application screens,
  • Use linear navigation to reach the desired item when muscle memory is wrong by a small amount.

As an example, when using the Google Play Store, I can use muscle memory with touch exploration to find the Search button in the top action bar. Having found an application to install, I can once again use muscle memory to roughly touch in the vicinity of the Install button; if what I touch is not the Install button, I can typically find it with one or two linear navigation steps. Having found the Install button, I can double-tap anywhere on the screen.

The same use case in ICS where we lacked Accessibility Focus and linear navigation would have forced me to use touch exploration exclusively. In instances where muscle memory worked perfectly, this form of interaction was highly effective; but in our experience, it also tended to lead to breakdowns and consequent user frustration in instances where users almost found the control they were looking for but missed by a small amount.

Having introduced accessibility focus and linear navigation in Jelly Bean, we decided to eliminate the ICS requirement that the user tap on or near a control to activate it — we now enable users to activate the item with accessibility focus by tapping anywhere on the screen. To eliminate spurious taps, especially on tablets, we made this a double-tap rather than a single tap. Note: based on user experience, we may optionally bring back single tap at some point in the future as an end-user customization.

1.3 Summary

Android Accessibility continues to move forward with Jelly Bean, and will continue to evolve rapidly along with the platform. Please use the Eyes-Free Google Group to provide constructive feedback on what works or doesn't work for you --- what is most effective is to objectively describe a given use case and your particular experience.

Friday, June 29, 2012

What's New In Google Accessibility From Google I/O 2012

Google IO 2012: What's New With Google Access

1 Google IO 2012: What's New From Google Access

We showcased a number of exciting advances in accessibility on Android and Chrome during IO 2012. With these advances, blind and low-vision users can leverage Google applications and services on Android and Chrome to collaborate effectively with their peers and to obtain on-the-go access. Advances include out-of-the-box access on Android (JellyBean), a new set of gestures that enable fluent interaction on touch-screen devices, Braille support on Android, an extension framework for ChromeVox, and a new, high-quality voice for use with Web applications on Chrome.

1.1 Enhanced Android Accessibility In JellyBean:

  • Accessibility on Android can be enabled by long-pressing the setup screen with two fingers for 4 seconds, providing out-of-the-box access for blind users.
  • Touch exploration has been enhanced with simple gestures that enable users to navigate on-screen contents.
  • JellyBean provides a set of Accessibility Actions that can be called from any AccessibilityService such as TalkBack; it also provides early support for Braille displays.
  • Touch exploration and gesture navigation both set AccessibilityFocus — double-tapping anywhere on the screen activates the item with AccessibilityFocus.

  • TalkBack now has a sister service BrailleBack for providing Braille support on Android.
  • Chrome on Android is now accessible and supports the latest in Web Access standards.

With these enhancements in Android access, blind users can use a combination of touch exploration and navigational gestures to access any part of the Android user interface.

As an example, I typically use the Android Play Store by touching the screen around the area where I expect a specific control; quick flicks of the finger then immediately get me to the item I want. With these touch gestures in place, I now use touch exploration to learn the layout of an application; with applications that I use often, I use a combination of muscle memory and gesture navigation for rapid task completion.

1.2 Chrome OS On Chrome Books And Chrome Box

Chrome OS comes with ChromeVox pre-configured — ChromeVox is our Web Accessibility solution for blind users. With the new high-quality voice that is being released on the Chrome Webstore, ChromeVox now provides smooth spoken feedback in Chrome on all desktop environments. Finally, we announced a flexible extension framework that enables Web developers to leverage ChromeVox from within and outside of their own Web applications to provide rich, contextual spoken feedback.

1.3 Developer Tools For Ensuring Accessibility

To help developers better leverage Web Accessibility, we are releasing a new Accessibility Audit tool that enables Web developers to detect and fix commonly occurring accessibility errors. This tool has been integrated into Chrome's Developer Tools and helps Web developers ensure accessibility while working within their normal workflow.

1.4 Accessibility Related Presentations At Google I/O 2012

Catch these on YouTube in the next week if you weren't able to attend I/O this week.

  • Android Accessibility (T. V. Raman, Peter Lundblad, Alan Viverette and Charles Chen).
  • Advanced Web Accessibility (Rachel Shearer, Dominic Mazzoni and Charles Chen).
  • What's New In JellyBean: Android Team.
  • JellyBean announcement in the Wednesday keynote.

Date: 2012-06-21 Thu

Author: T.V Raman

Org version 7.8.11 with Emacs version 24


Wednesday, August 10, 2011

Accessible GMail On Android — Eyes-Free Email On The Go!


1 Accessible GMail On Android — Eyes-Free Email On The Go

I've been using Android as my primary smart phone since late 2008, and the level of email access I've had on Android in the past has always been a source of frustration. About a year ago, I first started accessing my email on Android with K9-Mail — that helped me bridge some of the accessibility gaps on the platform.

Over the last few months, our friends over in GMail Mobile have been adding accessibility support to the GMail client on Android. What is truly exciting is that this support is being added to existing releases of Android including Froyo (Android 2.2) and GingerBread (Android 2.3). This means that GMail on Android is now accessible on existing devices — get the update from Market and give it a spin.

1.1 Typical Usage Pattern

Here is my typical usage pattern when accessing my corporate email at Google. Note that the volume of email I receive on this account is extremely high, and includes many mailing lists that I typically do not read while on a mobile device. To limit how much email I download to the mobile device, and to ensure that I attend to the most pressing email messages while on the go, I do the following:

  • I have defined a GMail filter that assigns the label to-mobile to messages I want to access when on the go.
  • Typically, this includes email addressed directly to me, and other priority items.
  • I launch GMail to open to this label.
  • I quickly skim through the list of displayed messages to hear the subject and a quick overview of each message.
  • If I decide to read the complete message, I select that message via the trackball on my Nexus One to hear the message in its entirety.
  • And finding an email thread I am looking for is just one click away — press the search button, and use up/down to navigate your search history.

See our help center documentation for additional details.

Author: T.V Raman <raman@google.com>

Date: 2011-08-10 Wed

HTML generated by org-mode 6.30c in emacs 23

Monday, May 16, 2011

Leveraging Android Access From Google IO 2011

You can watch our Google IO 2011 talk on Leveraging Android Access APIs. The main take-aways from the talk:

  • Android Access is easy --- the framework does most of the heavy-lifting.
  • Implementing Android Access does not mean you take a performance hit.
  • Accessibility is really about expanding the reach of your application.

Implementing accessibility within your application and thereby ensuring that it is usable in a wide variety of end-user scenarios will benefit your application --- both in terms of the number of users you gain, as well as how often your users use your application.

Monday, March 21, 2011

TalkBack Refreshed: Accessible On-Screen Keyboard And More ...

Android Access: TalkBack Refreshed

1 Android Access: TalkBack Refreshed

The latest enhancements to TalkBack now bring Android Accessibility to devices without a physical keyboard. Many of these enhancements also improve the overall TalkBack experience on all devices.

1.1 Highlights

  • New TalkBack Keyboard.
  • On-screen talking keyboard enables text entry via the touch screen.
  • Text review provides spoken feedback when moving the cursor by character, word, sentence, or paragraph.
  • Virtual D-Pad for navigating the Android user interface.
  • Global TalkBack commands enable one-click access to oft-used commands.

1.2 TalkBack Keyboard

The TalkBack Keyboard is an Accessible Input Method (Accessible IME) that when activated enables you to enter and review text via the touch screen. To use this feature, you need to first activate the TalkBack keyboard via the Language and Keyboard option in the Settings menu. Next, customize the TalkBack Keyboard to taste via the TalkBack Keyboard Settings option --- here, you can customize additional features including auditory feedback as you type. Finally, open your favorite editing application, long-press on an edit field, and select TalkBack keyboard as your default IME. Note that you need do this only once; once the TalkBack keyboard has been made the default, it persists across reboots.

1.3 Entering Text On The Touch Screen

TalkBack keyboard is an on-screen keyboard that supports touch exploration along with synchronized spoken and auditory feedback. This means you can now enter text when using devices that don't sport a physical keyboard.

But wait, there's more here than meets the finger at first touch. Once you have activated the TalkBack Keyboard, you can switch the keyboard among three states by long-pressing the volume up/down buttons:

Hidden
The TalkBack keyboard is not displayed.
Navigating
You get access to an on-screen virtual D-Pad, along with Back, Home, Search, and Menu buttons.
Typing
An on-screen qwerty keyboard.

My preferred means of using the keyboard is to turn on auditory feedback from within TalkBack Keyboard Settings, as well as having SoundBack active. In this mode, you hear keys as you explore the keyboard along with an auditory icon; picking up your finger types the last key you explored. Typing produces a distinctive key-click.

The on-screen keyboard occupies the bottom 1/3 of your screen. While entering text, explore and find the top row, then move above it to hear what you have typed so far.

1.4 Reviewing Text By Character, Word, Sentence Or Paragraph

You can now navigate and review text by character, word, sentence or paragraph. Use a two-finger tap to move forward through these navigation levels; a two-finger double tap moves in the reverse direction. Once you have selected your preferred mode of navigation, you can use Up/Down on the physical track-ball/D-Pad, or alternatively, flick up or down on the virtual D-Pad to move forward or backward through the text being reviewed.
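The granularity cycle described above can be sketched as a small state machine: a two-finger tap steps forward through the levels, a two-finger double tap steps backward. This is an illustrative model, not the TalkBack source.

```java
// Hypothetical sketch of the text-review granularity cycle:
// character -> word -> sentence -> paragraph, wrapping around.
public class ReviewGranularity {
    static final String[] LEVELS = {"character", "word", "sentence", "paragraph"};
    private int level = 0;

    /** Two-finger tap: move forward through the navigation levels. */
    public String next() {
        level = (level + 1) % LEVELS.length;
        return LEVELS[level];
    }

    /** Two-finger double tap: move backward through the navigation levels. */
    public String previous() {
        level = (level + LEVELS.length - 1) % LEVELS.length;
        return LEVELS[level];
    }

    public String current() { return LEVELS[level]; }
}
```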

Note that text review works when the TalkBack keyboard is in either navigating or typing mode; personally, I find it less error-prone on keyboard-less devices to first switch to navigating mode when reviewing text, since it is easy to inadvertently enter spurious text otherwise.

1.5 Using The On-Screen Virtual D-Pad

Placing the TalkBack keyboard in navigating mode provides an on-screen virtual D-Pad --- this is especially useful on devices that do not have a physical D-Pad or track-ball on the front of the device. When active, the virtual D-Pad occupies the bottom one-third of the screen, and fast-flicks in that area have the same effect as moving with a D-Pad or track-ball. Tapping anywhere within the virtual D-Pad is the same as clicking with the track-ball.

The corners of the virtual D-Pad also provide Back, Home, Search and Menu buttons --- these are especially useful on devices that lack explicit physical or capacitive buttons for these common Android actions. You can explore the virtual D-Pad by moving your finger around the D-Pad area; crossing the top edge of this area provides haptic and auditory feedback that can be used as an orientation aid in finding the virtual buttons on the corners.

1.6 Global Commands

In addition, selecting the TalkBack Keyboard as your default input method enables a set of global commands that can be accessed from your physical keyboard --- eventually, we will make these available via the soft keyboard as well. Here is a list of the current commands:

| Command      | Description                                                     | Key      |
|--------------|-----------------------------------------------------------------|----------|
| Battery      | Speaks the current battery level                                | menu + B |
| Time         | Speaks the current date and time                                | menu + T |
| Connectivity | Speaks the connectivity state of each connection: WiFi, 3G, etc. | menu + O |
| Repeat       | Repeats the last TalkBack utterance                             | menu + R |
| Spell        | Spells the last TalkBack utterance                              | menu + S |

These shortcuts are listed in the Accessibility Preferences application where they can be edited. You can choose between menu and search for the modifier, and any letter on the keyboard for the letter.

1.7 Summary

All of these features work on Android 2.2 and above. In addition, TalkBack makes WebView accessible in Honeycomb --- look for a separate announcement about accessibility enhancements that are exclusive to the Honeycomb release in the coming weeks.

Author: T.V Raman

Date: 2011-03-16 Wed

HTML generated by org-mode 7.4 in emacs 24