Friday, October 8, 2010

Walking About With A Talking Android


I have long relied on spoken directions from Google Maps on the desktop. As I access more and more of my online world through my Android phone, Google's recent announcement of GMM4.5 enhanced with walking directions means that I now enjoy functionality superior to what I had at my desk --- with the added benefit of having it all in my pocket!

Inclusion of step-by-step walking directions on Android now allows me to specify a destination on my TalkBack-enabled, eyes-free Android device and have the directions spoken to me as I walk. But wait, there's more!

We're launching a new member of our Eyes-Free family of programs for Android --- WalkyTalky --- that goes hand-in-hand with spoken walking directions from Google Maps to better navigate the physical world. In addition, the application Intersection Explorer allows me to explore the layout of streets using touch before venturing out with WalkyTalky.

WalkyTalky

WalkyTalky is an Android application that speaks the address of nearby locations as you pass them. It also provides more direct access to the walking directions component of Google Maps. With WalkyTalky installed, you can:

  • Launch WalkyTalky to specify a destination,
  • Specify the destination by address, or pick from favorites or recently visited locations,
  • Receive spoken walking directions, and
  • Hear street addresses as you walk by.

These spoken updates, in conjunction with the walking directions spoken by Google Maps, help me navigate the physical world as efficiently as I navigate the Internet.

Intersection Explorer

Often, I like exploring a neighborhood to learn the layout of its streets before actually venturing out with my trusty companion, Hubbell the Labrador, and this is where Intersection Explorer comes into its own. Using this application, I can explore any neighborhood on Google Maps via touch exploration.

How It Works

  • Intersection Explorer starts off at the user's current location.
  • One can change the start position by entering an address; to do this, press Menu and click New Location.
  • Once the map has loaded, touching the screen speaks the streets at the nearest intersection.
  • Moving one's finger along a compass direction and then tracing a circle speaks each street at that intersection, along with the associated compass direction.
  • Presence of streets is cued by a slight vibration as one traces the circle.
  • Lifting up the finger when on a street moves in that direction to the next intersection, speaks the distance moved, and finally speaks the newly arrived-at intersection.
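The gesture handling above amounts to snapping the finger's direction of motion to one of eight compass points. Here is a minimal, hypothetical Java sketch of that mapping (the names and details are mine, not taken from Intersection Explorer's source):

```java
// Hypothetical sketch: snap a finger movement to one of eight compass
// directions, the way a touch-exploration app might. Screen coordinates
// have y growing downward, so "up" on the screen is north.
class CompassSnap {
    static final String[] DIRECTIONS = {
        "north", "northeast", "east", "southeast",
        "south", "southwest", "west", "northwest"
    };

    // dx, dy: finger deltas in pixels since the touch started.
    static String direction(float dx, float dy) {
        double degrees = Math.toDegrees(Math.atan2(dx, -dy));
        if (degrees < 0) degrees += 360;                   // normalize to [0, 360)
        int slice = (int) Math.round(degrees / 45.0) % 8;  // 45-degree slices
        return DIRECTIONS[slice];
    }
}
```

Tracing a circle then reduces to calling direction() repeatedly and speaking a street whenever the snapped direction matches that street's heading.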

Summary

Together, Intersection Explorer and WalkyTalky, in conjunction with walking directions from Google Maps, bring a new level of access to my physical world. I use these tools alongside other Maps-based applications such as the Places Directory on Android --- another application from the Google Maps team that works fluently with TalkBack on Android to help me find nearby attractions or other locations of interest.

So next time you take your trusty Android out for a walk, make sure to give these new tools a spin --- you can report back on your experience via our Eyes-Free Group.

The WalkyTalky and Intersection Explorer applications can be downloaded from the Android Market. Share And Enjoy, and as usual, remember: The Best Is Yet To Come!

Author: T.V. Raman

Date: 2010-09-09 Thu





QR code for WalkyTalky

QR code for Intersection Explorer

Thursday, September 9, 2010

TalkBack, Eyes-Free Shell Refreshed --- Now With End-User Settings


We are pushing out a series of updates via Android Market for TalkBack and the Eyes-Free Shell. Here is a brief overview of end-user visible changes.

Accessibility Preferences

Going by the principle of "things should just work as expected", we have long resisted giving in to having a complex set of user preference settings for TalkBack and friends --- in my experience, if you introduce such a settings menu early on, we as software engineers tend to punt on all complex decisions by turning each question into a user-facing dialog. That said, it is now time to gradually introduce end-user settings for some aspects of the various accessibility tools.

Welcome the new AccessibilityPreferences application to Android. What this application does:

  • From an end-user perspective, it provides a single place where you will find preference settings corresponding to each accessibility tool you have installed on your phone.
  • For developers of accessibility tools, it provides a simple means of registering a custom program for managing end-user preferences for that tool.

TalkBack installs its user preferences under this tool. You can tweak a number of settings that affect TalkBack behavior including:

  • Control whether TalkBack speaks when the screen is off --- useful for silencing status messages when the screen is turned off.
  • Control whether TalkBack speaks when ringer volume is set to 0, i.e., phone is in silent mode.
  • Control whether the proximity sensor is used to shut off speech.

Over time, we'll add more settings here as appropriate --- but expect us to be conservative with respect to how many settings show up.

Updates To The Eyes-Free Shell

Here is a summary of updates to the Eyes-Free Shell:

  • Changes the proximity sensor logic so that it is only active when the shell is active; this should be more battery efficient
  • Fixes a race condition bug that can trigger when the shell is being exited as an application is being installed/removed

TalkBack

Here is a summary of changes to TalkBack:

  • TalkBack now includes application-specific speech strategies for some popular applications. This provides context-sensitive spoken feedback.
  • Applications that have such speech strategies defined include Facebook, Stitcher and Google Voice, amongst others.
  • Implements a settings screen that can be used with Accessibility Preferences
  • Available settings:
       
    1. Ringer Volume (Speak at all ringer volumes, No speech in Silent Mode, No speech in Vibrate and Silent Mode)
    2. Screen Status (Allow speech when screen is off, No speech when screen is off)
    3. Speak Caller ID (checked/not checked)
    4. Proximity Sensor (checked/not checked)

In addition, TalkBack introduces the ability to add application-specific plugins --- expect to see more advancement here in future releases.

AccessibilityPreferences Hints For Developers

If you're a developer of an AccessibilityService, you need to:

  • Implement a preferences screen for your application.
  • Register that screen with the following intent filter:

    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.accessibilityservice.SERVICE_SETTINGS" />
    </intent-filter>

Share And Enjoy,

Tuesday, August 24, 2010

Eyes-Free Review: Droid2 From MOT

Here is a quick eyes-free access overview of the MOT Droid2.

Hardware

  1. The device has a pull-out keyboard, and the buttons are much more tactile than the original Droid.
  2. The device also has dropped the hard-to-use D-Pad from the original Droid in favor of PC-style arrow keys.
  3. There is once again no dedicated number row at the top.
  4. The capacitive buttons on the front of the device appear in a different order from the original Droid --- with the device in portrait mode, reading left to right you have: Menu, home, back, and search.
  5. In addition, MOT ships a voice search application on the device, triggered by pressing a special microphone button --- it's worth learning the position of this key, since voice search can be useful --- and more importantly, if you're relying on spoken feedback, hitting this button accidentally leads to the phone falling inexplicably silent.

Software

If you look under Accessibility, you'll find an application called Voice Readouts from MOT. This appears to be a screenreader analogous to TalkBack, though in my experience, it did not produce spoken feedback in many instances. That said, this application collaborates well with TalkBack --- and after installing TalkBack from the Android Market (note: the Droid2 does not come with TalkBack bundled), you can activate both TalkBack and Voice Readouts for an optimal experience.

Voice Readouts appears to have a preliminary version of touch exploration. With Voice Readouts active, a single tap speaks the item under the finger; a double tap activates that item. Note that moving the finger around on the display does not appear to trigger touch exploration; also, touch exploration appears to be available in only some contexts.
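The single-tap/double-tap split comes down to timing between successive taps. Here is a hypothetical Java sketch of such a classifier (the timeout value and all names are my own, not Voice Readouts internals):

```java
// Hypothetical sketch: classify taps as "single" or "double" by the gap
// between successive taps. A real implementation would also delay the
// single-tap action (speaking) until the double-tap window has expired.
class TapClassifier {
    static final long DOUBLE_TAP_TIMEOUT_MS = 300;  // illustrative value
    private long lastTapTime = -1;

    // timeMs: timestamp of the tap in milliseconds.
    String onTap(long timeMs) {
        boolean isDouble = lastTapTime >= 0
                && (timeMs - lastTapTime) <= DOUBLE_TAP_TIMEOUT_MS;
        lastTapTime = isDouble ? -1 : timeMs;  // a double tap resets the state
        return isDouble ? "double" : "single";
    }
}
```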

Instances where touch exploration appears to be active

  1. Settings application.
  2. Portions of Android Market.

In general, touch exploration appears to be available in ListView.

In addition, the Droid2 also includes a low-vision accessibility tool called Zoom Mode (look for it under Settings -> Accessibility); this tool provides a magnification lens.

Summary

All in all, the Droid2 appears to be one of the better choices for eyes-free use from among the presently available crop of Android phones. Touch exploration, though preliminary, is nice to see on the platform, and the bundled low-vision magnification aid is a nice touch. Voice Readouts is also a great example of an Android accessibility service done right, in that it co-exists peacefully with other screenreaders like TalkBack to provide an optimal end-user experience. To users not familiar with adaptive technologies in general, this might not sound like a big deal --- but users of PC screenreaders have long been familiar with the need to have only one screenreader turned on. As we transition to modern platforms like Android, it's useful to remind ourselves that screenreaders can in fact co-exist, with each tool providing something useful to create an overall experience that is greater than the sum of its parts.

Monday, July 12, 2010

Welcoming Loquendo Susan To Android (FroYo)

Android 2.2 (AKA FroYo) introduces many platform enhancements, and one that I find particularly relevant is the ability to plug-in additional Text-To-Speech engines. What this means from an end-user point of view:

  • Android comes with a set of built-in voices since Android 1.6 --- these are the Pico voices for English, French, Italian, German and Spanish.
  • With the Text-To-Speech plug-in mechanism in place, we can now add new engines to the platform.
  • The first such add-on was ESpeak, which brings support for many of the world's languages.
  • And now, vendors are able to sell high-quality add-on voices via the Android Market.
  • Loquendo Susan is the first commercially available voice for Android. Users running FroYo can buy this voice on the Android Market. Thanks to the plug-in mechanism, once you buy a new voice, you can switch all your talking applications to use the newly installed voice --- see instructions below.

Activating And Using Newly Installed Voices

Go to Settings → Voice Input And Output → Text To Speech Settings. First, activate the newly installed voice by clicking the corresponding checkbox item for that voice. Next, go to Default Engine in the Text To Speech Settings menu, and make the newly installed voice your default engine. Finally, if you want all applications to use the new voice, check the option Always use my settings.

With this in place, my Nexus and Droid both speak using Loquendo Susan --- thus turning my Android into a truly pleasant eyes-free device.

Tuesday, May 25, 2010

Stitcher And TalkBack: The World In My Ears

Shortwave radio --- and DXing --- was one of my hobbies growing up --- I spent many hours listening to far-off radio stations, and in the process developed a love for languages. Fast-forward to the late 90's, and one could now listen to radio stations from all around the world on the Internet --- but this time without the hiss and static of shortwave propagation. But there was a catch --- you needed to be at your computer to listen to these stations. At home, I solved this problem by setting up a set of living room speakers connected to the computer in my office-bedroom; with a wireless keyboard, this brought Internet radio to my living room.

Fast-forward to the next decade, and I now have the Internet in my pocket in the form of a smart phone. I recently discovered Stitcher on the Android Market --- and it got me the final mile to having ubiquitous access to Internet Radio!

Using Stitcher With TalkBack

There is little more to say other than: try it out! Stitcher on Android is a simple application that works out of the box with TalkBack. Once you install Stitcher from the Market, use the arrow keys or trackball on your phone to browse through the various categories. Clicking on a station launches playback immediately. Note that for now, the stop button in the player is not navigable by the trackball --- I have gotten used to hitting it by dead reckoning, since it always appears in a fixed position. In the last few weeks, Stitcher has replaced StreamFuriously, my previous Internet radio solution on Android.

So here's to happy listening! A brief note on the title of this post --- The World In My Ears was also the title of a book on DXing by Arthur Cushen from New Zealand --- I remember hearing his voice in the 80's on the BBC's World Service.

Thursday, May 20, 2010

An Eyes-Free View Of Android At The Google IO Sandbox

Google IO 2010 is playing host to over 5,000 attendees in San Francisco this week. A number of Google Access engineers are at the conference consuming and producing information --- here is a brief view of some of the exciting bits seen on the Android show floor from an eyes-free perspective.

Hardware And New Devices From An Eyes Free Perspective

Many of the phone manufacturers were showing off their latest devices on the show floor --- visit the Android Sandbox at Google IO to see these devices first-hand. Charles and I walked through the various displays Wednesday (May 19) afternoon to test drive them --- given the large number of Android devices coming out every week, this was a unique opportunity to see many of these devices for the first time. Here are some highlights:

  • All devices were running Android 1.6 or later, and consequently, Settings/Accessibility was available on every device. Having worked on this for the last 2 years, it's extremely gratifying to see phone manufacturers including accessibility in their devices.
  • We found one device from Motorola where we couldn't find the accessibility setting --- the booth representative promised to check after we pointed this out --- waiting to hear back.
  • My favorite device was the LG Ally --- check this device out if you get a chance.
    • Device to be sold by Verizon.
    • Device has an elegant tactual feel.
    • Front of the device sports hardware answer/hangup buttons.
    • The pull-out qwerty keyboard is a pleasure to use --- I would rate this one of the best designed cell phone keyboards I've seen.
  • Android devices continue to show up in many shapes and sizes --- re-emphasizing that there is a device for everyone. This makes it even more important to choose a device that meets your particular needs.

Software --- Android Applications Galore

We also visited the various vendors showing off their latest Android applications. What was gratifying was that even though most of these developers had given little thought to eyes-free use --- and were blissfully unaware of the existence of an Android Accessibility API --- their applications worked for the most part with accessibility enabled. Where there were gaps, we were able to show developers what they needed to do --- everyone was extremely receptive. Below is a brief summary of what we saw --- and a shout-out to all the friendly developers we met:

Where

This is a very accessible application I have been using for a while --- the developers were thrilled to hear that it was accessible since they had made no special effort.

Aloqua

A competing application to Where with a very slick visual UI. This application doesn't raise the appropriate accessibility events at present because it uses a custom UI. When we first talked to their lead developer, he was extremely hesitant, saying "I don't want to change my custom UI". However, I could hear his face light up when we said "You don't need to change your look and feel --- you just need to set a couple of custom Java properties (specifically, the ContentDescription property)".
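On Android, the property in question is set with View.setContentDescription; a screenreader then prefers that label over anything it can infer from the widget. The stand-in classes below sketch that lookup so the logic can run anywhere (the Control class is illustrative, not the Android View API):

```java
// Sketch of how a screenreader picks an utterance for a control. On Android
// the real property is android.view.View#setContentDescription; Control here
// is a plain stand-in so the selection logic is self-contained.
class SpokenLabel {
    static class Control {
        String text;                // visible text, if any
        String contentDescription;  // developer-supplied spoken label
    }

    static String utteranceFor(Control c) {
        if (c.contentDescription != null && !c.contentDescription.isEmpty()) {
            return c.contentDescription;  // the explicit label wins
        }
        if (c.text != null && !c.text.isEmpty()) {
            return c.text;                // fall back to visible text
        }
        return "unlabeled control";       // what an eyes-free user hears today
    }
}
```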

Pandora

Another favorite of mine that works well with access --- except --- the player controls are unlabeled. I showed them the application in action on my Droid --- looking forward to seeing this application become even more usable.

NPR News

There are many NPR tools on the Android Market --- NPR News is the official application. The application was originally written by a Googler and Open Sourced --- I have been using it for about 4 months and it's completely accessible. It could do with some power-user shortcut keys to make it even more efficient.

MLB At Bat

I had originally played with this application during last year's World Series; at the time, the application was quite usable with TalkBack. I'm happy to report that nothing has regressed --- the application continues to work well, except for a couple of glitches with unlabeled player controls. The booth representatives had actually heard of accessibility --- and were receptive to fixing the remaining issues.

Summary: The light-weight design of the Android Access layer has proven valuable in making sure that it makes it on to every device. The minimal set of responsibilities the API places on developers has meant that a large number of Android applications are accessible out of the box.

Tuesday, May 18, 2010

Audio Books On Android --- Thanks Librivox!

In my previous article, I alluded to an Audio Books application for Android. I did not go into much detail on the application itself because I felt it deserved an article of its own. So here goes!

In Praise Of Librivox

If you aren't familiar with the Librivox project, please visit Librivox.org to see the wonderful work that project is doing. The Android application AudioBooks brings the wonders of Librivox to Android --- now, you can carry all 30,000 audio books and counting in your pocket and access them anywhere. Here are some highlights:

  • Browse, and quickly play, available audio books. You can browse by several criteria.
  • Books you listen to get downloaded to your device and are available for offline listening.
  • All books provide a table of contents, allowing you to jump to a specific portion of a book.
  • 90% of the application user interface is completely accessible with TalkBack --- see below for missing access features.

The only glitch with using the AudioBooks application with the Android Access API is that the player controls within the audio-book player are presently missing content descriptions --- this is Android-API speak to say that the controls are images with missing labels. So the first time you use this app, you'll need someone to tell you the buttons --- alternatively, just experiment to discover their functions. There are pause, play, rewind and forward buttons --- if the friendly folk who developed this application stumble upon this post, please get in touch, and I can show you what you need to add to your code to make the eyes-free experience even smoother.

Happy Listening --- And Share And Enjoy!

Monday, May 17, 2010

Using Android Market Eyes-Free

The Android Market is a treasure-trove of applications --- many of which work out of the box with Android's Access API, and as a result, with the freely available screenreaders on the platform. Working with Market can be initially daunting, given the large collection of applications; additionally, there are a couple of spots in the workflow that need access improvements. While we get those fixes pushed, here is a step-by-step overview of using Android Market with TalkBack, including the work-arounds for getting over some of the afore-mentioned hurdles.

Android Market: A Brief Overview

Rather than giving a detailed explanation of all of Android Market's user interface, I'll sketch my day-to-day mode of using Market --- personally, I find task-oriented help guides far more usable.

Task: Find Application
  • I typically launch Android Market from within the Applications list in the Eyes-Free shell. On my Droid, I typically do this with the keyboard already opened since I know I'll be typing very soon.
  • I press the Search capacitive button on the bottom right of the display to bring up the search tool. Note that Market can sometimes take a few seconds to launch depending on your network --- TalkBack should announce Market when it's ready.
  • Type a search query --- as an example, try audio books
  • Use the D-Pad arrow keys (up/down) to navigate the list of results. TalkBack speaks each entry as you move through the list.
  • Find one you like; for this example, we'll use one of my favorite Market applications --- AudioBooks from project Librivox.
  • Press the Enter key on the keyboard to open this application
  • This takes you to a screen that lists a short description, and comments from various users on the application. The install button is on the bottom of this screen.
  • And here comes the sticking point in the Market UI that we're working on fixing: when you cursor through this list, you don't always get to the install button. But never fear, you can still install the application!
  • While we work on creating and pushing the fix for the above, I typically install applications by tapping the screen where the install button appears. The bad news is that I presently do this by dead reckoning; the good news is that the install button always appears at a consistent spot. The easiest way to learn to do this is to have someone put your finger on the button the first time, and then learn its position relative to the pull-out keyboard. While we know that this is not an ideal eyes-free experience, this little trick opens up a treasure-trove of applications.
  • Tap the install button, and you come to the permissions screen. Cursor to the OK button, and press Enter. Depending on the layout of that screen, you may once again need to use dead reckoning. At this point, I routinely click these on-screen buttons by touch, rather than wasting time attempting to cursor to the button.
  • And voila, the AudioBooks application should download and install!
Task: Browse Market

In addition to searching, you can also browse the Market for available applications; use the cursor keys on the D-Pad for browsing. Once you have selected an application, installing it follows the same workflow as above.

And The Best Is Yet To Come

Once installed, you can try out the application by pulling down the status bar. Look for the next posting in this series for details on using application AudioBooks --- it is one of my all time Market favorites.

Thursday, February 25, 2010

Eyes-Free: TalkBack And Shell Improvements

Here is a brief summary of updates to Android's eyes-free tools --- including TalkBack and the Eyes-Free Shell --- from the last two weeks.

TalkBack

  • Speech during a phone call is now re-enabled.
  • Turning the screen on/off is spoken. This announcement includes the ringer mode/volume.
  • Changes in the ringer mode --- silent, vibrate, and normal --- are now announced.
  • Unlocking the phone is announced.
  • Other Android applications can programmatically discover if TalkBack is enabled.

Eyes-Free Shell

Now that applications can programmatically discover whether TalkBack has been enabled, configuring the Eyes-Free Shell to become your default home screen has become a lot easier. In a nutshell, if you are a TalkBack user and install the Eyes-Free Shell, hitting the Home button will bring up the Eyes-Free Shell --- no configuration needed. Note that you can always get to the default Android home screen by long-pressing the Back button.

Share And Enjoy

Friday, February 12, 2010

Eyes-Free Updates: Marvin And TalkBack Simplified

We routinely push updates to our access tools on Android; users get these updates automatically via Android Market updates. We just pushed out updated versions of TalkBack, our Open Source screenreader for Android, and Marvin, the Eyes-Free shell. Here is a brief summary of these updates:

  • Android applications can now programmatically discover if TalkBack is running, thanks to the latest changes in TalkBack. From an end-user perspective, this means that you no longer need to configure Eyes-Free shell via EyesFreeConfig to be the default home. If you run TalkBack, and have EyesFree Shell installed, then pressing Home automatically gives you the EyesFree Shell. Remember, you can always get to the default Android Home by long-pressing Back.
  • EyesFree Shell now includes a touch-based shortcuts manager. Until now, shortcuts needed to be explicitly configured by editing an XML file on the SDCard. With the recent EyesFree update, you can interactively define shortcuts via a touch-based ShortCuts manager. By default, we have assigned shortcut 1 to the ShortCuts manager; so to invoke this new feature, do:
    1. Stroke left (4 using stroke dialer notation) to enter the shortcuts screen.
    2. Stroke up and to the left (1 using stroke-dialer notation) to invoke application ShortCuts Manager.
    3. Use the trackball/D-Pad to configure each of the 8 available shortcuts.
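Stroke-dialer notation maps the eight stroke directions onto a telephone keypad centered on 5, which is why a stroke left is 4 and a stroke up and to the left is 1. A small Java sketch of the mapping (illustrative, not the shell's actual code):

```java
// Sketch: stroke-dialer notation. Strokes from the screen center map onto a
// telephone keypad laid out around the digit 5, so a stroke up and to the
// left is 1, straight left is 4, and no stroke at all is 5.
class StrokeDialer {
    static final int[][] KEYPAD = {
        {1, 2, 3},
        {4, 5, 6},
        {7, 8, 9}
    };

    // dx, dy in {-1, 0, 1}: dominant stroke components, y growing downward.
    static int digitFor(int dx, int dy) {
        return KEYPAD[dy + 1][dx + 1];
    }
}
```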

Marvin: We hope this gives some minimal relief to the pain in all the diodes on your left side.

Monday, February 8, 2010

Silencing Speech With A Wave Of Your Hand On Android 2.0

Update To Android Access: TalkBack

Smart phones tend to be short on physical buttons --- even devices like the G1 or Motorola Droid have very few buttons when the physical keyboard is not open. This poses interesting challenges when designing an efficient eyes-free interface --- especially given the old maxim "Speech is silvern, but silence is golden!" Said differently, once you have built a system that talks back, the first thing you want to build is an efficient means of silencing spoken feedback.

Early versions of TalkBack on Android skimmed by without a stop-speech button --- you basically moved from one activity to another, and the speech produced by the new activity effectively stopped ongoing spoken output. However, as we make more and more applications work seamlessly with our Access APIs, it has always been clear to us that we need a global stop-speech gesture! Notice that I said gesture --- not key --- stopping speech is a critical function that we'd like to enable without having to pull out the physical keyboard, and something we'd like to have on devices without a physical keyboard.

In the spirit of "the dual to every access challenge is an opportunity to innovate", we recently launched a new experimental TalkBack feature on devices running Android 2.0. Devices on the Android 2.0 platform have a proximity sensor on the top front left corner of the phone --- this is typically used to lock the screen when you're holding the phone up to your ear on a phone call. As the name implies, the proximity sensor fires when you get close to it --- you can activate it by waving your hand close to the top left corner of the phone. As an experimental feature, we have configured the latest version of TalkBack to silence ongoing speech if you wave your hand in front of the proximity sensor.
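On Android the readings come from a listener registered for the proximity sensor; the silencing decision itself is a simple edge trigger. Here is a hypothetical sketch of that logic (the threshold value and all names are mine, not TalkBack's source):

```java
// Hypothetical sketch: fire "stop speech" only on the far-to-near transition
// of proximity readings, so holding a hand over the sensor silences speech
// once rather than continuously. On Android the readings would come from
// Sensor.TYPE_PROXIMITY via a SensorEventListener.
class ProximitySilencer {
    private final float nearThresholdCm;
    private boolean wasNear = false;

    ProximitySilencer(float nearThresholdCm) {
        this.nearThresholdCm = nearThresholdCm;
    }

    boolean shouldStopSpeech(float distanceCm) {
        boolean near = distanceCm < nearThresholdCm;
        boolean fire = near && !wasNear;  // trigger only when newly near
        wasNear = near;
        return fire;
    }
}
```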

Note that this is a new, experimental feature --- it's something we welcome feedback on via our public Eyes-Free Google Group. We'd like to know if you accidentally trigger stop-speech because of this new feature. Having used it for a few weeks, I find that I am not triggering it accidentally --- but that might well be a function of how I hold the phone.

What Devices Is This Available On?

Note that at the time of writing, the proximity-sensor-equipped devices on which I have used this feature include:

  • Motorola Droid from Verizon
  • Google Nexus One

Note that the G1 and other older Android devices did not have a proximity sensor.

Friday, January 22, 2010

1Vox --- Your Query Is Our Command

Video: 1Vox --- Your Query Is Our Command

Device Used: Motorola Droid on Verizon

Speech interface designers often express surprise at the fact that the average blind user rarely, if ever, uses spoken input. But when you come down to it, this is not too surprising --- given that the eyes-free user has speech output active, the overall system would end up talking to itself!

To show that these conflicts can be avoided by careful user-interface design, we demonstrate 1Vox --- our voice-search wizard for the Marvin Shell.

  1. You activate 1Vox by stroke 9 on the Marvin screen.
  2. You hear a spoken prompt: Search.
  3. You hear a little auditory icon when the system is ready for you.
  4. You speak oft-used queries, e.g., Weather Mountain View.
  5. You hear a short spoken snippet in response.

We called this widget 1Vox --- in honor of the Google onebox found on the Google Results page.

Author: T.V. Raman <raman@google.com>

YouTube And TalkBack --- Entertainment On The Go

Video: TalkBack And YouTube

Device: Motorola Droid on Verizon

This video demonstrates searching for and playing YouTube videos with TalkBack providing spoken feedback at each step in the interaction.

  1. Launch YouTube from the Marvin Application launcher.
  2. The trackball can be used here to move through the list of videos.
  3. Pressing down on the trackball launches the selected video.
  4. Press menu key to enter the YouTube application menu.
  5. Click on Search with the trackball.
  6. Type a query into the edit field. TalkBack speaks as you type.
  7. Press Enter to perform the search.
  8. Scroll the results list with the track-ball.
  9. Click a desired result to start playing the video.

Author: T.V. Raman <raman@google.com>

Using TalkBack With Google Maps

Video: TalkBack And Google Maps

Device Used: Motorola Droid On Verizon

TalkBack provides spoken feedback as you use Google Maps. In this video, we will demonstrate typical maps tasks such as:

  1. Launch Google Maps using the Marvin application launcher.
  2. From within the Maps application, press the menu key.
  3. Select Search and type a query into the search field.
  4. Notice that I can type a partial query and have auto-completion based on previous searches.
  5. Press Enter to perform the search.
  6. Bring up the result list in ListView by touching the bottom left of the screen.
  7. Scroll through this list using the D-Pad.
  8. Click with the D-Pad (or enter) to select a business.
  9. Scroll through available options, and click Get Directions.

  10. Click the Go button to get directions.
  11. Scroll with the trackball to hear the directions spoken.

In addition, you can use Google Latitude to locate your friends; Latitude and other Maps tools are accessible from the set of options that appear when you press the Menu key.

Author: T.V. Raman <raman@google.com>

TalkBack: An Open Source Android Screenreader

Video: Introducing TalkBack, An Open Source Screenreader

Device Used: Motorola Droid On Verizon

We briefly introduced TalkBack in the previous video while enabling Accessibility from the settings menu. Here, we show off some of this screenreader's features.

TalkBack is designed to be a simple, non-obtrusive screenreader. What this means in practice is that you interact directly with your applications, and not with TalkBack. TalkBack's job is to remain in the background and provide the spoken feedback that you need.

TalkBack works with all of Android's native user interface controls. This means you can configure all aspects of the Android user interface with TalkBack providing appropriate spoken feedback. What is more, you can use most native Android applications --- including those downloaded from the Android Market, with TalkBack providing spoken feedback.

Here are some examples of Android applications (both from Google as well as third-party applications available on the Android Market) that work with TalkBack:

  • Google Maps: Perform searches, and listen to directions.
  • YouTube: Search, browse categories and play.
  • Simple Weather: Listen to local weather forecasts.
  • Facebook: Move around on the social Web.

But in this video, we'll demonstrate the use of a very simple but useful Android application --- the Android Alarm clock.

  • Launch: I launch the alarm clock from Marvin's eyes-free application launcher.
  • TalkBack: TalkBack takes over and starts speaking.
  • Navigate: Navigating with the trackball speaks the alarm under focus.
  • Activate: Activating with the trackball produces appropriate feedback.
  • Navigate: The selected alarm displays its settings in a list view, which speaks as we navigate.

Author: T.V Raman <raman@google.com>

Introducing The Android Access Framework

Video: Introducing The Android Accessibility Framework

1 Video: Introducing The Android Accessibility Framework

Device Used: Motorola Droid on Verizon

Starting with Android 1.6 --- fondly known as Donut --- the platform includes an Accessibility API that makes it easy to implement adaptive technology such as screenreaders. Android 1.6 comes with a built-in screenreader called TalkBack that provides spoken feedback when using Android applications written in Java.

The next few videos will progressively introduce TalkBack, SoundBack and KickBack, a suite of programs that augment the Android user interface with alternative output.

All of these special utilities are available through the Accessibility option in the Android Settings menu. Once activated, the accessibility settings persist across reboots, i.e., you need to enable these tools only once.

Notice that because I have accessibility enabled on my phone, all user actions produce relevant auditory feedback. Thus, each item is spoken as I move through the various options in the settings menu. The spoken feedback also indicates the state of an item as appropriate.

Activating SoundBack produces non-spoken auditory feedback; KickBack produces haptic feedback.

Author: T.V Raman <raman@google.com>

Connecting The Dots: Marvin And Android Access

Video: Connecting The Dots: Marvin And Android Access

1 Video: Connecting The Dots: Marvin And Android Access

When we first launched project eyes-free in early spring 2009, we promised to post frequent video updates to the eyes-free channel. Well, sadly, we have been remiss in keeping that promise --- but all in a good cause --- we were busy building out the needed accessibility APIs in the core Android framework.

We're now returning with a fresh set of video updates that demonstrate the new accessibility framework in Android, and how these access related tools mesh with the Eyes-Free shell shown earlier.

To summarize:

  1. All of the eyes-free utilities from project Marvin continue to be developed in order to provide fluent eyes-free interaction.
  2. The Marvin shell that we demonstrated last time continues to be my default home screen.
  3. We have added an application launcher on the Marvin screen that can be launched by stroking 8.
  4. This launcher uses stroke dialing to quickly navigate and launch applications.
  5. With the launch of the Accessibility API in Android 1.6, and the accompanying Open Source TalkBack screenreader, I can now launch any Android application, e.g., Google Maps or YouTube.
  6. TalkBack provides spoken feedback for native Android applications, including the settings menu.
  7. You can use Android Market to install third-party applications; many of these work out of the box with TalkBack.

We'll demonstrate these, and a variety of other cool new enhancements, in the forthcoming videos. Stay tuned!

Author: T.V Raman <raman@google.com>

Date: 2009-03-30 Mon

HTML generated by org-mode 6.08c in emacs 23

Thursday, January 21, 2010

Eyes-Free Home: The Marvin Shell

Video: Eyes-Free Home: The Marvin Shell

1 Video: Eyes-Free Home: The Marvin Shell

Device Used: T-Mobile G1 from HTC

The Marvin shell pulls together available eyes-free applications to provide an integrated user experience. Note that talking applications can come from many sources, with project Eyes-Free being but one such source. For other exciting talking applications that use our open Text To Speech (TTS) APIs, see the Android Marketplace, where you will find many useful tools that integrate seamlessly with Marvin.

When you install the Eyes-Free Shell, you can choose to make Marvin your default home screen --- this means that pressing the home button always brings up the Marvin shell. To return to the default Android home screen, hold down the back button for 3 seconds or more. Here is a brief description of the Marvin user interface.

1.1 Single Touch Access To Useful Tools

The Marvin shell uses the Stroke Dialer to provide single-touch access to useful tools right from the home screen. You can explore this interface by moving your finger around the screen --- as you move over the buttons, Marvin speaks the associated action. Lifting up the finger executes the current action. As an example, the top row of the keypad, i.e., 1, 2, and 3, provides status information. Stroking to 4 brings up your favorite shortcuts, and 6 speaks your current location using geo-location information obtained from Google Maps. Pressing 7 connects to your voice mailbox, and pressing 9 invokes Voice Search to obtain quick spoken answers from Google, e.g., the current weather for your location. Finally, the applications that appear on the shortcuts screen can be customized by editing the XML file

/sdcard/eyesfree/shortcuts.xml
on your SD card --- as is apparent, this is a power-user feature :-)!

2 Talking Mini-Applications For Single Touch Access

Here, we demonstrate some of the talking mini-applications that can be accessed from the Marvin screen. All of these mini-applications speak useful information without requiring the user to perform a context switch.

2.1 Device State

Available from 1 on the Marvin screen, this mini-application announces useful information such as signal strength and the availability of WiFi networks.

2.2 Date And Time

Available on 2 on the Marvin screen, this mini-application provides single-touch access to current date and time.

2.3 Battery State And Power

Pressing 3 on the Marvin screen speaks the current battery level and announces if the phone is presently being charged.

2.4 Knowing Your Location

Available as 6 from the Marvin home screen, this mini-application announces your present location based on information acquired via GPS and the cell network. It speaks your current heading using the built-in magnetic compass, looks up the current location on Google Maps, and announces the location in terms of a nearby address and street intersection.

Author: T.V Raman <raman@google.com>

Date: 2009-03-30 Mon


Talking PhoneBook: Eyes-Free Communication Device

Video: Talking Phonebook: Eyes-Free Communication Device

1 Video: Talking Phonebook: Eyes-Free Communication Device

Device Used: T-Mobile G1 from HTC

Pressing the menu button while in the Talking Dialer toggles between dialing mode and phonebook. When in phonebook, you get eyes-free access to your contacts with the ability to quickly move to the contact that you wish to call.

When in the phonebook, you can scroll through your contacts and press the call button to call the current contact. In addition, you can use stroke dialing as explained below to quickly move to a specific contact.

1.1 Entering Letters Using Stroke dialing

We covered eyes-free input with the touch screen in the earlier video on stroke dialing --- in that video, we illustrated the concept via a traditional phone keypad. Here, we extend that technique to enable textual input. In the explanation below, we will use compass directions to help with orientation. As before, we will use relative positioning, i.e., for the rest of this explanation, you can start anywhere on the touch-screen --- though we recommend (for reasons that will become evident) that you start somewhere close to the middle of the screen.

1.2 The Eight Compass Directions

Defining the center as where you first touch down on the screen, notice that you can stroke in any one of the 8 compass directions, and that these directions form opposite pairs, e.g., North and South. So we get 4 pairs. We enumerate these below, associate them with the 4 Google colors, and equate them to their equivalent strokes from the stroke dialer:

  • Red: North-West and South-East --- 1 and 9.
  • Blue: North and South --- 2 and 8.
  • Green: North-East and South-West --- 3 and 7.
  • Yellow: East and West --- 4 and 6.

Now, let's place the letters of the alphabet on these 4 colored circles as follows:

  • Red: A ... H
  • Blue: I ... P
  • Green: Q ... X
  • Yellow: Y ... Z.

To input a given letter, we stroke to the circle containing the desired letter, trace along the circle till we hear the letter we want, and lift up the finger to make the selection. Letters are spoken in a female voice while moving along the selected circle; lifting up the finger speaks the selected letter in a male voice.

Notice that conceptually, we have defined a fairly simple mapping from strokes to letters of the alphabet!

1.3 Skimming The Contact List

So to cut a long story short, you don't need to scroll through the contact list. To quickly jump to a contact, use the technique described above to input the first letter of the contact's name --- the application jumps to contacts starting with that letter. At that point, you can either scroll, or enter additional letters to further filter the contact list.
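The jump-and-filter behavior described above amounts to simple prefix filtering. Here is a minimal sketch of the idea --- purely illustrative, with the function name and data shape assumed, not taken from the application's actual code:

```python
def filter_contacts(contacts, typed):
    """Keep only contacts whose name starts with the letters entered so
    far; each additional letter narrows the list further (illustrative)."""
    prefix = typed.lower()
    return [name for name in contacts if name.lower().startswith(prefix)]
```

Entering "h" jumps to the contacts starting with H; entering further letters keeps narrowing the list until scrolling becomes trivial.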

1.4 Examples Of Using Strokes For Letters

Notice from the mapping shown earlier that we can enter each circle either at the top or bottom. Thus, entering the red circle at the top gets us to A, while entering it at the bottom gets us to E. This means that the 8 letters on any given circle are no more than 3 steps away --- for example, to enter C, one traces clockwise from A, or counter-clockwise from E. As an example, H is only 1 step from A on the red circle. Similarly, P is only 1 step from I on the blue circle.
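The examples above can be captured in a few lines. This sketch (illustrative only --- the names and the "steps" convention are my own, not the application's code) models each colored circle as a ring of eight letters with two entry points:

```python
# Each circle holds consecutive letters; the Yellow circle holds only
# Y and Z. Entering at the "top" lands on the first letter; entering
# at the "bottom" lands half-way around the ring.
CIRCLES = {
    "red":    "ABCDEFGH",   # North-West / South-East strokes
    "blue":   "IJKLMNOP",   # North / South strokes
    "green":  "QRSTUVWX",   # North-East / South-West strokes
    "yellow": "YZ",         # East / West strokes
}

def select_letter(circle, entry, steps=0):
    """entry is 'top' or 'bottom'; positive steps trace clockwise,
    negative steps trace counter-clockwise along the circle."""
    ring = CIRCLES[circle]
    start = 0 if entry == "top" else len(ring) // 2
    return ring[(start + steps) % len(ring)]
```

With this model, tracing 2 clockwise steps from the top of the red circle yields C, and 1 counter-clockwise step yields H, matching the examples in the text.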

Author: T.V Raman <raman@google.com>

Date: 2009-03-30 Mon


Talking Dialer: Eyes-Free Communication Device

Video: Talking Dialer: Eyes-Free Communication Device

1 Video: Talking Dialer: Eyes-Free Communication Device

Device Used: T-Mobile G1 from HTC

Note: The Motorola Droid does not have a call button. Instead, use the capacitive Search button, i.e., the button on the extreme right, in place of the call button.

So now, let's use the stroke dialer for something practical --- let's make phone calls with our smart phone! Well, we know Marvin would disapprove if we just made phone calls, so rest assured, we'll do a lot more later!

Pressing the call button on Android phones launches the built-in dialing application. When using the Marvin shell, pressing this button launches the Talking Dialer application --- if you are not using Marvin as your home screen, you can launch this dialer as you would launch any Android application.

The Talking Dialer announces dialing mode upon startup. You can start dialing using the technique described in the previous video on the stroke dialer --- if you make a mistake, simply shake the phone to erase. Once you have finished dialing, press the call button to initiate the call. The application speaks the number you're about to dial, and makes the call once you press the call button to confirm. But you say

Dialing phone numbers is so passé!

--- well, there is still hope for the Talking Dialer. In addition to dialing mode, the Talking Dialer provides an easy-to-use Talking Phonebook that gives eyes-free access to your contact list --- we will cover this in our video on the talking phonebook.

Author: T.V Raman <raman@google.com>

Date: 2009-03-30 Mon


Stroke Dialler For Android

Video: Stroke Dialer For Android

1 Video: Stroke Dialer For Eyes-Free Keypad Input

Device Used: T-Mobile G1 from HTC

The stroke dialer enables one-handed keypad input using the touch-screen --- without even having to look at the screen. Here is how it works --- we start with a brief description of the problem, framed to ask the right question. The answer becomes self-evident as you follow this video.

1.1 The Problem

On-screen keyboards typically show some buttons on the screen that you activate by touching them. To activate such buttons, one needs to look at the screen, because the buttons are placed at specific points on the screen, i.e., they are absolutely positioned. So what if you want to activate such buttons without looking at the screen? From the foregoing description, it's clear that the only reason one is forced to look at an on-screen keyboard is that the buttons are absolutely positioned. So let's relax that constraint and use relative positioning to place the buttons.

We'll start with a keyboard we're all familiar with, the telephone keypad. Since we're using relative positioning, let's place the center of the keypad wherever you first touch the screen. So, to dial a 5, you just touch the screen.

Now, you know where 5 is --- it's where you first touch down. But look, since you know the layout of a phone keypad, you can now find all the other digits relative to the 5. So for example, 2 is directly above 5 --- so to press 2, you touch down on the screen, and stroke up before lifting your finger. Similarly, you stroke down for an 8, or diagonally up for a 1.
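The relative-positioning idea can be sketched in a few lines: classify the stroke's direction into one of eight compass sectors, or treat a near-stationary touch as the center key. This is an illustrative model of the concept only --- the threshold, names, and coordinate conventions are assumptions, not the dialer's actual code:

```python
import math

# Digits by compass sector, starting at East and sweeping counter-clockwise
# in 45-degree steps: E, NE, N, NW, W, SW, S, SE.
SECTORS = ["6", "3", "2", "1", "4", "7", "8", "9"]

def digit_for_stroke(dx, dy, threshold=20):
    """dx, dy: finger displacement from touch-down, in pixels.
    Screen coordinates grow downward, so dy < 0 is an upward stroke."""
    if math.hypot(dx, dy) < threshold:
        return "5"                      # a simple touch selects the center key
    angle = math.degrees(math.atan2(-dy, dx)) % 360
    return SECTORS[int(((angle + 22.5) % 360) // 45)]
```

An upward stroke lands in the North sector and yields 2, stroking down yields 8, and a diagonal up-left stroke yields 1 --- exactly the phone-keypad layout centered wherever the finger first touches.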

In real life, we both hear and feel as we press physical buttons. This form of synchronized auditory and tactile feedback is essential for creating user interfaces that feel realistic. The stroke dialer achieves this effect by producing a slight vibration, synchronized with an auditory tick, as the finger moves over the various buttons. It also produces spoken feedback to indicate the button that was pressed.

To conclude this video, let's dial a few numbers.

Author: T.V Raman <raman@google.com>

Introducing Marvin --- Eyes-Free Interaction On Android

Android Eyes-Free Introduction

1 Video: Introducing Project Eyes-Free For Android

Device Used: T-Mobile G1 from HTC

Project Eyes-Free turns your Android into an eyes-free communication device with one-handed, single-touch access to common tasks. Applications from this project can be used stand-alone; they can also be used together through the Eyes-Free shell. This collection of videos will cover the latter scenario.

We will refer to the eyes-free shell as Marvin in honor of Douglas Adams' famous paranoid android --- our Marvin says

Brain the size of a planet and they expect me to make phone calls?

The Marvin home screen provides single-touch access to useful information via a collection of talking mini-applications. In addition, commonly used applications can be placed under shortcuts for quick access. Finally, the call button automatically launches the eyes-free Talking Dialer --- all of these applications are covered in detail in subsequent videos.

Author: T.V Raman <raman@google.com>

An Introduction To YouTube Channel EyesFreeAndroid

The next set of articles on this blog cover the videos we have posted to the channel EyesFreeAndroid on YouTube. Each article links to a particular video that highlights a given aspect of eyes-free interaction on Android using the built-in screenreader and related access tools. In the future, I'll make sure to post such descriptions as soon as the videos are uploaded, so watch this space! (At the time the videos were posted last year, I did not have this blog.)

Tuesday, January 19, 2010

Eyes-Free G1 --- My First Talking Android!

In the first article in this series, I'll cover the T-Mobile G1 from HTC, my first accessible Android. Note: I've since moved on to the Motorola Droid, but that is for a future article in this series.

I'll try to use a consistent outline for these articles where possible --- in general, you can expect articles covering a particular Android device to have separate sections that address the hardware and software. Note that the software bits --- the Eyes-Free Marvin Shell and our free screenreader TalkBack --- are common across all Android devices.

The G1 Device And Eyes-Free Use

Here is a brief summary of my experience with the G1 hardware:

  • The G1's keyboard is easy to use once you get used to the layout; you can effectively touch-type with two thumbs.
  • It is possible to perform many functions without having to pull out the keyboard, thanks to the track-ball and buttons on the front panel.
  • The front panel has 5 buttons and a trackball: left-to-right, these are Call, Home, Menu, Back, and Hangup.
  • The menu button is something you will use very often with Android applications. When you try out a new application, pressing menu lets you explore the application via the track-ball.
  • The track-ball takes some getting used to; it can move over multiple items in lists if one isn't careful.
  • This was the first time I used a touch-screen, and the G1 opened up many user-interface innovations.

Eyes-Free: Marvin Shell And TalkBack On G1

The Marvin Shell is my default home shell on all my Android devices. Note that TalkBack works fluently with the default home-shell that comes with Android; however, the Marvin Shell has some nice touches that make it ideal for efficient eyes-free use --- for examples, see the YouTube channel EyesFreeAndroid. Here is a brief summary of my G1 setup, along with examples of performing some sample tasks. A word of caution first on what doesn't work yet: the browser is not yet TalkBack-enabled, and as a consequence, browser-based applications such as GMail will not work (yet).

  • I have the Accessibility option checked (see the Android Settings menu). Within that same menu, I have TalkBack, SoundBack and KickBack enabled.
  • I also have the Eyes-Free Shell (available on the Android Market) installed, along with the suite of Eyes-Free applications that accompany it.
  • Pressing the Home button on the front panel switches to or restarts the Eyes-Free Shell.
  • Many common actions can be performed by touch-gestures on the Eyes-Free Shell; see the relevant YouTube Video.
  • You can enter Marvin's application launcher by stroking down on the home screen. Once in that launcher, you can use the circle dialer to quickly jump to a particular application; you can scroll the list with the track ball. Once you've found an application, you press the call button on the front panel to launch the application.
  • Here is the Stroke Dialer for keypad input in action. As an example, I stroke right to get a Y, and that selects the YouTube application. Launch it by pressing call on the front panel.
  • When you launch the YouTube application, TalkBack takes over --- as the end-user, you continue to get spoken feedback and typically are never aware of the transition.
  • Note that many Android applications use the touch screen for rapid interaction. Taking a few minutes to get oriented with the touch controls for an application you plan to use often can make task completion more efficient. Caveat: we don't yet have an exploration widget to aid in this --- typically, I've had the user interface described to me. Notice that once you know that the YouTube UI uses a landscape orientation and that the bar for controlling playback appears on the bottom, you can easily use your finger to slide along the bottom of the screen to control playback.
  • TalkBack provides fluent spoken feedback for many common tasks, such as Instant Messaging using Google Talk, or for SMS using the built-in Messaging application.
  • Another useful Android feature to leverage is the Status Bar --- here is where applications post notifications, e.g., a missed call, or an upcoming calendar appointment.
  • You open up the status bar by bringing it down --- think of it as pulling down a screen. Place your finger at the top of the screen and stroke all the way down.
  • You can now use the track-ball to scroll through any available notifications and hear them spoken. This is particularly useful with Google Calendar.

And of course, there is much more to say than will fit in a single blog article.

Sunday, January 17, 2010

Welcome To Eyes-Free Android

I'll blog about my use of Android phones here. The tools I use are being developed as part of project Eyes-Free, and you can meet up with other users on the Google Group Eyes-Free. All code developed as part of project Eyes-Free is Open Source, and the core access API and associated adaptive technology are part of the Android platform starting with Android 1.6.

What You Can Expect To See On This Blog

Android runs on a variety of mobile phones, and devices vary with respect to their hardware features, e.g., keyboards, trackballs, etc. This blog will focus on tips and tricks for getting the most out of various Android devices, based on my personal experience.