Eyes-Free Android
This blog details my use of various Android devices with Android
Accessibility turned on. In combination with the eyes-free shell,
this turns Android into a personal communication device that aids in independent living.
T. V. Raman

2013-12-19: Just Speak Updated With More Hands-Free Functionality
<div xmlns='http://www.w3.org/1999/xhtml'>
<div id='content'>
<h1 class='title'>Just Speak: Controlling Android By Voice</h1>
<div class='outline-2' id='outline-container-sec-1'>
<h2 id='sec-1'><span class='section-number-2'>1</span> JustSpeak: Controlling Android By Voice</h2>
<div id='text-1' class='outline-text-2'>
<p>
<b>JustSpeak</b> is an Android <i>Accessibility Service</i> that enables
voice control of your Android device. Once enabled, you can
activate on-screen controls, launch installed applications, and
trigger other commonly used Android actions using spoken
commands.
</p>
</div>
<div class='outline-3' id='outline-container-sec-1-1'>
<h3 id='sec-1-1'><span class='section-number-3'>1.1</span> Enabling <b>Just Speak</b></h3>
<div id='text-1-1' class='outline-text-3'>
<p>
Once installed, you can enable the <b>Just Speak</b> service under
<i>Settings->Accessibility</i> on your Android device. Enabling the
service is an action that only needs to be performed once as
<b>Just Speak</b> will be automatically restarted when the phone is
rebooted.
</p>
<p>
<b>Just Speak</b> is an Accessibility Service that uses Accessibility
APIs on the platform to augment Android's user interface. <b>Just
Speak</b> augments the Android user interface with voice-input
control; other Accessibility Services like TalkBack provide
spoken feedback. Note that <b>Just Speak</b> can be used either by
itself, or in conjunction with other platform Accessibility
Services such as TalkBack.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-2'>
<h3 id='sec-1-2'><span class='section-number-3'>1.2</span> Initiating Voice Recognition</h3>
<div id='text-1-2' class='outline-text-3'>
<p>
Once <b>Just Speak</b> is installed and enabled on your device, you
can initiate voice commands by either performing an up-swipe from
the Home button (if your device has soft keys) or by performing
multiple taps on the Home button (if your device has hard
keys). Note that this is the same gesture that activates Google
Now on devices running the stock version of Android. When using
<b>Just Speak</b>, you can get to Google Now by saying
<i>Launch Google Now</i>.
</p>
<p>
Successful invocation of <b>Just Speak</b> starts voice recognition;
this is indicated by an auditory icon (accompanied by
vibration if available) and a visual overlay. Depending on the
settings enabled in <b>Just Speak</b>, as well as other Accessibility
Services, received voice input may be spoken and/or displayed
after voice recognition has completed. As an example, TalkBack
can work in conjunction with <b>Just Speak</b> to speak the recognized
commands. <b>Just Speak</b> supports both <i>local</i> and <i>global</i>
commands as described in later sections of this document.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-3'>
<h3 id='sec-1-3'><span class='section-number-3'>1.3</span> Cancelling Voice Recognition</h3>
<div id='text-1-3' class='outline-text-3'>
<p>
Voice recognition can be stopped by performing the same action
used to initiate voice recognition (either performing an up-swipe
from the Home button or by performing multiple clicks of the Home
button). <b>Just Speak</b> is also programmed to stop listening if no
voice input is received after a specific amount of time. Finally,
an overlay that covers the entire screen is displayed whenever
voice recognition is active; clicking anywhere on this overlay
will dismiss the overlay and terminate voice recognition.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-4'>
<h3 id='sec-1-4'><span class='section-number-3'>1.4</span> Global Voice Commands</h3>
<div id='text-1-4' class='outline-text-3'>
<p>
Just Speak supports a set of global commands that are available
on any screen. These global commands include:
</p>
<table frame='hsides' rules='groups' cellpadding='6' cellspacing='0' border='2'>
<colgroup>
<col class='left'/>
<col class='left'/>
<col class='left'/>
<col class='left'/>
</colgroup>
<tbody>
<tr>
<td class='left'>Command </td>
<td class='left'>Utterance </td>
<td class='left'>Synonym</td>
<td class='left'>Action </td>
</tr>
<tr>
<td class='left'>Open </td>
<td class='left'>Open &lt;Installed Application&gt;</td>
<td class='left'>Launch, Run</td>
<td class='left'>Launch Application </td>
</tr>
<tr>
<td class='left'>Recent </td>
<td class='left'>Recent Apps </td>
<td class='left'>Recent Applications</td>
<td class='left'>Recent Applications </td>
</tr>
<tr>
<td class='left'>Quick Settings</td>
<td class='left'>Quick Settings </td>
<td class='left'>Open</td>
<td class='left'>Quick Settings </td>
</tr>
<tr>
<td class='left'>Toggle WiFi</td>
<td class='left'>Switch WiFi (On/Off)</td>
<td class='left'> </td>
<td class='left'>Toggle WiFi</td>
</tr>
<tr>
<td class='left'>Toggle Bluetooth</td>
<td class='left'>Switch Bluetooth (On/Off)</td>
<td class='left'> </td>
<td class='left'>Toggle Bluetooth</td>
</tr>
<tr>
<td class='left'>Toggle Tethering</td>
<td class='left'>Switch Tethering (On/Off)</td>
<td class='left'> </td>
<td class='left'>Toggle Tethering</td>
</tr>
<tr>
<td class='left'>Home </td>
<td class='left'>Go Home </td>
<td class='left'> </td>
<td class='left'>Return to home screen </td>
</tr>
<tr>
<td class='left'>Back </td>
<td class='left'>Go Back </td>
<td class='left'> </td>
<td class='left'>Return to previous screen </td>
</tr>
<tr>
<td class='left'>Notifications </td>
<td class='left'>Open Notifications </td>
<td class='left'> </td>
<td class='left'>Open notifications shade</td>
</tr>
<tr>
<td class='left'>Easy Labels</td>
<td class='left'>Easy Labels</td>
<td class='left'> </td>
<td class='left'>Display Easy Labels</td>
</tr>
</tbody>
</table>
<p>
Note that voice commands are flexible in that action words may be
substituted with synonymous verbs (e.g., “launch” in place of
“open”). In addition, voice commands can be formulated as
complete sentences, e.g., "Please open GMail".
</p>
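<p>
The verb flexibility described above can be pictured as a simple
synonym lookup table. The sketch below is purely illustrative (the
class and names are hypothetical, not <b>Just Speak</b>'s actual
implementation): it normalizes the leading verb of an utterance to
a canonical command word.
</p>

```java
import java.util.Locale;
import java.util.Map;

// Hypothetical sketch: map spoken verbs to a canonical command verb,
// so that "Launch" and "Run" behave the same as "Open".
public class VerbNormalizer {
    private static final Map<String, String> SYNONYMS = Map.of(
            "open", "open",
            "launch", "open",
            "run", "open");

    // Canonicalizes the leading verb, e.g. "Launch GMail" -> "open gmail".
    public static String normalize(String utterance) {
        String[] parts = utterance.toLowerCase(Locale.ROOT).trim().split("\\s+", 2);
        String verb = SYNONYMS.getOrDefault(parts[0], parts[0]);
        return parts.length == 1 ? verb : verb + " " + parts[1];
    }
}
```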
<p>
In addition, <b>Just Speak</b> provides the following spoken aliases
as a means of triggering commonly used applications:
</p>
<table frame='hsides' rules='groups' cellpadding='6' cellspacing='0' border='2'>
<colgroup>
<col class='left'/>
<col class='left'/>
</colgroup>
<tbody>
<tr>
<td class='left'>Utterance </td>
<td class='left'>Action </td>
</tr>
<tr>
<td class='left'>Browser </td>
<td class='left'>Launch default Web browser</td>
</tr>
<tr>
<td class='left'>Web </td>
<td class='left'>Launch default Web Browser</td>
</tr>
<tr>
<td class='left'>OK Google Now</td>
<td class='left'>Launch Google Now </td>
</tr>
<tr>
<td class='left'>Search </td>
<td class='left'>Launch Voice Search </td>
</tr>
<tr>
<td class='left'>Voice Search </td>
<td class='left'>Launch Voice Search </td>
</tr>
</tbody>
</table>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-5'>
<h3 id='sec-1-5'><span class='section-number-3'>1.5</span> Local Voice Commands</h3>
<div id='text-1-5' class='outline-text-3'>
<p>
In addition to the global commands that are available on all
screens, <b>Just Speak</b> allows you to interact with on-screen
controls in a variety of ways.
</p>
<table frame='hsides' rules='groups' cellpadding='6' cellspacing='0' border='2'>
<colgroup>
<col class='left'/>
<col class='left'/>
<col class='left'/>
<col class='left'/>
</colgroup>
<tbody>
<tr>
<td class='left'>Command </td>
<td class='left'>Utterance </td>
<td class='left'>Synonym</td>
<td class='left'>Action </td>
</tr>
<tr>
<td class='left'>Activate</td>
<td class='left'>Click &lt;control name&gt;</td>
<td class='left'>Click, Tap</td>
<td class='left'>Activate control by its on-screen name</td>
</tr>
<tr>
<td class='left'>Scroll </td>
<td class='left'>Scroll Up/Down </td>
<td class='left'>Forward, Backward</td>
<td class='left'>Scroll scrollable content, e.g., lists</td>
</tr>
<tr>
<td class='left'>Switch</td>
<td class='left'> </td>
<td class='left'>Switch On/Off Toggle</td>
<td class='left'>Toggle Switches</td>
</tr>
<tr>
<td class='left'>Long Press</td>
<td class='left'>Long Press</td>
<td class='left'>Long Click, Long Tap</td>
<td class='left'>Long Press on on-screen controls</td>
</tr>
<tr>
<td class='left'>Check</td>
<td class='left'>Check </td>
<td class='left'>Check, Uncheck</td>
<td class='left'>Toggle CheckBox values</td>
</tr>
</tbody>
</table>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-6'>
<h3 id='sec-1-6'><span class='section-number-3'>1.6</span> Labeling On-Screen Controls</h3>
<div id='text-1-6' class='outline-text-3'>
<p>
When using <b>Just Speak</b>, the text labels that appear next to
on-screen controls determine the set of available local
commands. For many controls, such as images, checkboxes, and
switches, there may be no visible text to associate with the
control. In these instances, <b>Just Speak</b> uses underlying
Accessibility Metadata provided by the application developer to
construct relevant labels; note that this metadata is also used
by Accessibility Services such as TalkBack to meaningfully speak
on-screen controls.
</p>
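<p>
The label-selection fallback described above can be sketched as
follows; this is a hypothetical model of the behavior with invented
names, not the actual <b>Just Speak</b> source:
</p>

```java
// Hypothetical sketch of the labeling fallback: prefer a control's
// visible text; when there is none (images, checkboxes, switches),
// fall back to the accessibility metadata (e.g. a content
// description) supplied by the application developer.
public class ControlLabeler {
    public static String labelFor(String visibleText, String contentDescription) {
        if (visibleText != null && !visibleText.trim().isEmpty()) {
            return visibleText.trim();
        }
        if (contentDescription != null && !contentDescription.trim().isEmpty()) {
            return contentDescription.trim();
        }
        return "unlabeled"; // nothing meaningful for the user to say
    }
}
```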
<p>
We leverage the visual overlay that indicates that <b>Just Speak</b>
is active to visually display this additional metadata — this
serves as a hint as to what you can say to activate the available
controls. These overlay labels take one of two forms:
</p>
<ol class='org-ol'>
<li>In the simple case, where there are no actionable child
controls that need additional labeling, the control receives a
centered label.
</li>
<li>Where the control itself has actionable children, <b>Just
Speak</b> displays a labeled frame around the actionable
children.
</li>
</ol>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-7'>
<h3 id='sec-1-7'><span class='section-number-3'>1.7</span> Chaining Commands</h3>
<div id='text-1-7' class='outline-text-3'>
<p>
<b>Just Speak</b> can be configured to take multiple voice commands at
once and perform them sequentially. This chaining works with both
local and global commands, performing each subsequent action after
the previous action has completed. Commands are chained via
simple connectives such as “and” and “then”. An example of this
would be
</p>
<pre class='example'>
“Click confirm and then go home”.
</pre>
<p>
In this case, <b>Just Speak</b> would
click the “Confirm” control (assuming it’s present) and upon
completion of that task, return to the Home screen.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-8'>
<h3 id='sec-1-8'><span class='section-number-3'>1.8</span> Easy Labels</h3>
<div id='text-1-8' class='outline-text-3'>
<p>
Another configurable setting in <b>Just Speak</b> is the ability to
replace the labels associated with on-screen controls with <i>easy
labels</i>. These are phonetic labels that are designed to be
unambiguous and are displayed in the overlay over their
respective controls, making it easy to identify what to say to interact
with a specific control. Note that <i>phonetic labels</i> can be
temporarily activated or deactivated via the <b>Just Speak</b> global
command <i>Easy Labels</i>.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-9'>
<h3 id='sec-1-9'><span class='section-number-3'>1.9</span> Persistent Overlay</h3>
<div id='text-1-9' class='outline-text-3'>
<p>
To aid users with limited dexterity, <b>Just Speak</b> also provides
the option to make the overlay persistent, capturing all touch
events received by the Android device. This effectively removes
all traditional interactions a user can have with their device,
replacing them with <b>Just Speak</b> functionality. The benefit of this
is that the entire device essentially becomes a button, toggling
between initiating and terminating voice recognition. In addition
to this, overlay labeling is always present, allowing you to
constantly be aware of what you can say to <b>Just Speak</b>.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-sec-1-10'>
<h3 id='sec-1-10'><span class='section-number-3'>1.10</span> Alternative Means Of Initiating Voice Recognition</h3>
<div id='text-1-10' class='outline-text-3'>
<p>
We are continuing to experiment with alternative ways of
initiating voice recognition. Toward this end, <b>Just Speak</b> enables
you to configure an NFC tag that can then be tapped to initiate
voice control; see “Just Speak -> Settings” for configuring an
NFC tag.
</p>
</div>
</div>
</div>
</div>
<div class='status' id='postamble'>
<p class='date'>Date: 2013-08-06 Tue</p>
<p class='author'>Author: T.V Raman</p>
<p class='date'>Created: 2013-12-18 Wed 13:50</p>
<p class='creator'><a href='http://www.gnu.org/software/emacs/'>Emacs</a> 24.3.50.1 (<a href='http://orgmode.org'>Org</a> mode 8.2.3a)</p>
<p class='validation'><a href='http://validator.w3.org/check?uri=referer'>Validate</a></p>
</div>
</div>
T. V. Raman

2013-09-26: JustSpeak: Control All Aspects Of Android By Voice
<div xmlns='http://www.w3.org/1999/xhtml'>
<div id='preamble'>
</div>
<div id='content'>
<h1 class='title'>JustSpeak: Controlling Android By Voice </h1>
<div id='table-of-contents'>
<h2>Table of Contents</h2>
<div id='text-table-of-contents'>
<ul>
<li><a href='#sec-1'>1 JustSpeak: Controlling Android By Voice</a>
<ul>
<li><a href='#sec-1-1'>1.1 Starting JustSpeak</a></li>
<li><a href='#sec-1-2'>1.2 Initiating Voice Commands</a></li>
<li><a href='#sec-1-3'>1.3 Global Voice Commands</a></li>
<li><a href='#sec-1-4'>1.4 Local Voice Commands</a></li>
<li><a href='#sec-1-5'>1.5 Chaining Commands</a></li>
<li><a href='#sec-1-6'>1.6 Alternative Means Of Triggering JustSpeak</a></li>
</ul>
</li>
</ul>
</div>
</div>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'><span class='section-number-2'>1</span> JustSpeak: Controlling Android By Voice</h2>
<div id='text-1' class='outline-text-2'>
<p>
<b>JustSpeak</b> is an Android <i>Accessibility Service</i> that enables
voice control of your Android device. Once enabled, you can
activate on-screen controls, launch installed applications, and
trigger other commonly used Android actions using spoken
commands.
</p>
</div>
<div class='outline-3' id='outline-container-1-1'>
<h3 id='sec-1-1'><span class='section-number-3'>1.1</span> Starting JustSpeak</h3>
<div id='text-1-1' class='outline-text-3'>
<p>
Once installed, you can start <b>JustSpeak</b> under
<i>Settings->Accessibility</i> on your Android device — you only need
to do this once, i.e., <b>JustSpeak</b> will be automatically
restarted when you reboot your phone.
</p>
<p>
<b>JustSpeak</b> is an <i>Accessibility Service</i>, i.e., once started, it
uses Accessibility APIs on the platform to augment Android's user
interface. <b>JustSpeak</b> augments the Android user interface with
voice-input control; other Accessibility Services like TalkBack
provide spoken feedback. Note that <b>JustSpeak</b> can be used either
by itself, or in conjunction with other platform Accessibility
Services such as TalkBack.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1-2'>
<h3 id='sec-1-2'><span class='section-number-3'>1.2</span> Initiating Voice Commands</h3>
<div id='text-1-2' class='outline-text-3'>
<p>
Once started, <b>JustSpeak</b> registers itself as an <i>Assistant</i> on
your device; you can initiate voice commands by swiping up from
the bottom of the screen. Note that this is the same gesture that
activates <i>Google Now</i> on devices running the stock version of
Android. When using JustSpeak, you can get to Google Now by
saying <i>Google</i>. Successful invocation of voice control produces
an auditory icon, indicating that your device is ready to
listen. This tone, along with an optional vibration (if
available), is your cue to start speaking; <i>JustSpeak</i>
displays a visual overlay on the right edge of the screen to
indicate that it is listening. The recognized utterance is
displayed visually; additionally, it is spoken if TalkBack
is active. <b>JustSpeak</b> supports two types of voice-control
commands, as described below.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1-3'>
<h3 id='sec-1-3'><span class='section-number-3'>1.3</span> Global Voice Commands</h3>
<div id='text-1-3' class='outline-text-3'>
<p>
<b>JustSpeak</b> supports a set of <i>global commands</i> — these
are <i>global</i> in that they are available on <i>any</i> screen. Global
commands include:
</p>
<table frame='hsides' rules='groups' cellpadding='6' cellspacing='0' border='2'>
<colgroup><col class='left'/><col class='left'/><col class='left'/>
</colgroup>
<tbody>
<tr><td class='left'>Command</td><td class='left'>Utterance</td><td class='left'>Action</td></tr>
<tr><td class='left'>Open</td><td class='left'>Open &lt;Installed Application&gt;</td><td class='left'>Launch Application</td></tr>
<tr><td class='left'>Recent</td><td class='left'>Recent</td><td class='left'>Recent Applications</td></tr>
<tr><td class='left'>Quick Settings</td><td class='left'>Quick Settings</td><td class='left'>Launch Quick Settings</td></tr>
<tr><td class='left'>Toggle WiFi</td><td class='left'>Switch WiFi (On/Off)</td><td class='left'>Toggle WiFi</td></tr>
<tr><td class='left'>Toggle Bluetooth</td><td class='left'>Switch Bluetooth (On/Off)</td><td class='left'>Toggle Bluetooth</td></tr>
<tr><td class='left'>Toggle Tethering</td><td class='left'>Switch Tethering (On/Off)</td><td class='left'>Toggle Tethering</td></tr>
<tr><td class='left'>Home</td><td class='left'>Go Home</td><td class='left'>Return to home screen</td></tr>
<tr><td class='left'>Back</td><td class='left'>Go Back</td><td class='left'>Return to previous screen</td></tr>
<tr><td class='left'>Notifications</td><td class='left'>Open Notifications</td><td class='left'>Pull down notifications shade</td></tr>
</tbody>
</table>
Note that voice commands are flexible in that you can use a
number of synonyms for verbs such as <i>open</i>, e.g., <i>launch</i>. In
addition, voice control commands can be formulated as full
sentences, e.g., "Please open GMail".
<p>
List Of Synonyms For <i>Open</i>:
</p>
<ul>
<li>open
</li>
<li>go to
</li>
<li>show
</li>
<li>display
</li>
</ul>
<p>
In addition, <i>JustSpeak</i> provides the following spoken aliases
as a means of triggering commonly used applications:
</p>
<table frame='hsides' rules='groups' cellpadding='6' cellspacing='0' border='2'>
<colgroup><col class='left'/><col class='left'/>
</colgroup>
<tbody>
<tr><td class='left'>Utterance</td><td class='left'>Action</td></tr>
<tr><td class='left'>Browser</td><td class='left'>Launch default Web browser</td></tr>
<tr><td class='left'>Web</td><td class='left'>Launch default Web Browser</td></tr>
<tr><td class='left'>OK Google Now</td><td class='left'>Launch Google Now</td></tr>
<tr><td class='left'>Search</td><td class='left'>Launch Voice Search</td></tr>
<tr><td class='left'>Voice Search</td><td class='left'>Launch Voice Search</td></tr>
</tbody>
</table>
</div>
</div>
<div class='outline-3' id='outline-container-1-4'>
<h3 id='sec-1-4'><span class='section-number-3'>1.4</span> Local Voice Commands</h3>
<div id='text-1-4' class='outline-text-3'>
<p>
In addition to the global voice commands that are available on
any screen, <b>JustSpeak</b> lets you activate on-screen controls in
a variety of ways.
</p>
<table frame='hsides' rules='groups' cellpadding='6' cellspacing='0' border='2'>
<colgroup><col class='left'/><col class='left'/><col class='left'/>
</colgroup>
<tbody>
<tr><td class='left'>Command</td><td class='left'>Utterance</td><td class='left'>Action</td></tr>
<tr><td class='left'>Activate</td><td class='left'>Click &lt;control name&gt;</td><td class='left'>Activate control by its on-screen name</td></tr>
<tr><td class='left'>Toggle</td><td class='left'>Turn on/off &lt;toggle&gt;</td><td class='left'>Toggle on/off switch</td></tr>
<tr><td class='left'>Scroll</td><td class='left'>Scroll Up/Down</td><td class='left'>Scroll scrollable content, e.g., lists</td></tr>
</tbody>
</table>
List Of Synonyms For <i>Activate</i>:
<ul>
<li>touch
</li>
<li>click on
</li>
<li>press
</li>
<li>activate
</li>
<li>open
</li>
<li>push
</li>
<li>tap
</li>
</ul>
</div>
</div>
<div class='outline-3' id='outline-container-1-5'>
<h3 id='sec-1-5'><span class='section-number-3'>1.5</span> Chaining Commands</h3>
<div id='text-1-5' class='outline-text-3'>
<p>
Note that you can issue a sequence of commands via a single
utterance by using simple connectives such as <i>then</i> or <i>and</i>
when speaking; as an example, you can say:
</p><pre class='example'>
Quick Settings then WiFi
</pre>
<p>to open "Quick Settings" and then click
the "WiFi" item that appears within quick settings.
This is an example
of chaining together a global command followed by a local
command that becomes available <i>after</i> the global command has
executed successfully. Similarly, you can say:
</p><pre class='example'>
Open GMail, then compose
</pre>
<p>to launch GMail, and immediately start composing a new
message.
</p></div>
</div>
<div class='outline-3' id='outline-container-1-6'>
<h3 id='sec-1-6'><span class='section-number-3'>1.6</span> Alternative Means Of Triggering JustSpeak</h3>
<div id='text-1-6' class='outline-text-3'>
<p>
We are continuing to experiment with alternative ways of
initiating voice control. Toward this end, <b>JustSpeak</b>
enables you to configure an NFC tag that can then be tapped
to initiate voice control; see <i>JustSpeak->Settings</i> for
configuring an NFC tag.
On Android 4.3 and above, you can also trigger <i>JustSpeak</i> by
long-pressing both volume buttons at the same time.
</p>
</div>
</div>
</div>
</div>
<div id='postamble'>
<p class='date'>Date: 2013-08-06 Tue</p>
<p class='author'>Author: T.V Raman</p>
<p class='creator'><a href='http://orgmode.org'>Org</a> version 7.9.3f with <a href='http://www.gnu.org/software/emacs/'>Emacs</a> version 24</p>
<a href='http://validator.w3.org/check?uri=referer'>Validate XHTML 1.0</a>
</div>
</div>
T. V. Raman

2012-07-25: Jelly Bean: Accessibility Gestures Explained
<div xmlns='http://www.w3.org/1999/xhtml'>
<div id='content'>
<h1 class='title'>Jelly Bean: Accessibility Gestures Explained</h1>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'><span class='section-number-2'>1</span> Jelly Bean: Accessibility Gestures Explained</h2>
<div id='text-1' class='outline-text-2'>
<p>
This article details accessibility gestures in Jelly Bean and is
a follow-up to <a href='http://eyes-free.blogspot.com/2012/07/jelly-bean-accessibility-explainedtouch.html'>Jelly Bean Accessibility Explained</a>. It gives a
conceptual overview of the 16 possible gestures and describes how they are
used. The interaction behavior described here holds for all
aspects of the Android user interface, including interactive Web
pages within Chrome and the Android Web Browser.
</p>
</div>
<div class='outline-3' id='outline-container-1-1'>
<h3 id='sec-1-1'><span class='section-number-3'>1.1</span> Conceptual Overview Of The Gestures</h3>
<div id='text-1-1' class='outline-text-3'>
<p>
Placing a finger on the screen speaks the item under the finger
by first placing Accessibility Focus on that item.
Moving the finger triggers touch exploration which moves
Accessibility Focus.
</p>
<p>
To generate any of the Accessibility Gestures discussed below,
one moves the finger <i>much faster</i> — how much faster is
something we will tune over time, and if needed, make user
customizable.
</p>
<p>
To remember the gestures, think of the four directions, <i>Up</i>,
<i>Down</i>, <i>Left</i> and <i>Right</i>. In addition to these four basic
navigational gestures, we defined an additional 12 gestures by
picking pairwise combinations of these directional flicks, e.g.,
<i>Left then Down</i> — this gives a total of 16 possible gestures. In what follows, <i>left then down</i> means swipe
left, and continue with a down flick. Note that in performing
these additional gestures, speed matters most. For example, it is
not essential that you make a perfect <i>Capital L</i> when
performing the <i>Down then Right</i> gesture; maintaining speed
throughout the gesture, and ensuring that the finger moves some
distance in each direction, is key to avoiding these being
misinterpreted as basic navigational gestures.
</p>
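<p>
The gesture space described above can be enumerated directly: four
basic flicks, plus the twelve ordered pairs of two distinct
flicks. The sketch below is illustrative only; the class and names
are hypothetical, not the actual platform source:
</p>

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: enumerate the 16-gesture space, i.e. the 4
// single directional flicks plus the 12 ordered pairs of distinct
// flicks (e.g. "LEFT then DOWN").
public class GestureSpace {
    enum Direction { UP, DOWN, LEFT, RIGHT }

    public static List<String> allGestures() {
        List<String> gestures = new ArrayList<>();
        for (Direction d : Direction.values()) {
            gestures.add(d.name()); // the 4 basic navigational flicks
        }
        for (Direction first : Direction.values()) {
            for (Direction second : Direction.values()) {
                if (first != second) { // 4 * 3 = 12 combinations
                    gestures.add(first.name() + " then " + second.name());
                }
            }
        }
        return gestures;
    }
}
```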
</div>
</div>
<div class='outline-3' id='outline-container-1-2'>
<h3 id='sec-1-2'><span class='section-number-3'>1.2</span> Accessibility Focus And Accessibility Gestures</h3>
<div id='text-1-2' class='outline-text-3'>
<p>
Accessibility Focus is moved using the four basic directional
gestures. For now we have aliased <i>Left</i> with <i>Up</i>, and <i>Down</i>
with <i>Right</i>; i.e., both <i>Left</i> and <i>Up</i> move to the previous
item, whereas <i>Down</i> and <i>Right</i> move to the next item. Note that
this is <i>not</i> the same as moving with a physical D-Pad or
keyboard on Android; the Android platform moves System Focus in
response to the D-Pad. Thus, moving with a D-Pad or trackball
moves you through the various interactive controls on the screen;
moving Accessibility Focus via the Accessibility Gestures moves
you through <i>everything</i> on the screen.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1-3'>
<h3 id='sec-1-3'><span class='section-number-3'>1.3</span> Accessibility Gestures For Common Actions</h3>
<div id='text-1-3' class='outline-text-3'>
<p>
In addition to the basic navigation described above, we define the
following gestures for common actions:
</p>
<dl>
<dt>Navigation Granularity</dt><dd>You can increase or decrease
navigation granularity by rapidly stroking <i>Up then Down</i> or <i>Down then Up</i>.
</dd>
<dt>Scrolling Lists</dt><dd>You can scroll a list forward by
rapidly stroking <i>Right then Left</i>; the reverse, i.e.,
<i>Left then Right</i> scrolls the list backward by a screenful.
</dd>
</dl>
</div>
</div>
<div class='outline-3' id='outline-container-1-4'>
<h3 id='sec-1-4'><span class='section-number-3'>1.4</span> User Configurable Gestures</h3>
<div id='text-1-4' class='outline-text-3'>
<p>
Gestures <i>Down then Left</i>, <i>Up then Left</i>, <i>Down then Right</i> and <i>Up then Right</i>
are user configurable; their default assignments are shown below.
</p>
<dl>
<dt>Back</dt><dd>Gesture <i>Down then Left</i> is the same as pressing the
<i>Back</i> button.
</dd>
<dt>Home</dt><dd><i>Up then Left</i> has the same effect as pressing the <i>Home</i>
button.
</dd>
<dt>Status Bar</dt><dd>Gesture <i>Up then Right</i> opens Status Notifications.
</dd>
<dt>Recent </dt><dd><i>Down then Right</i> has the same effect as pressing the <i>Recent Applications</i> button.
</dd>
</dl>
</div>
</div>
<div class='outline-3' id='outline-container-1-5'>
<h3 id='sec-1-5'><span class='section-number-3'>1.5</span> Summary</h3>
<div id='text-1-5' class='outline-text-3'>
<p>
Gestures for manipulating and working with Accessibility Focus
are an evolving part of the Android Accessibility; we will
continue to refine these based on user experience.
At this point, you are probably saying:
</p><pre class='example'>
But wait, you said 16 gestures, but only told us the meanings of 12 of them
</pre>
<p>You are correct — we have left ourselves some gestures to use
for future features.
</p></div>
</div>
</div>
</div>
</div>
T. V. Raman

2012-07-24: Jelly Bean Accessibility Explained: Touch Exploration Augmented By Gestures
<div xmlns='http://www.w3.org/1999/xhtml'>
<div id='content'>
<h1 class='title'>Jelly Bean Accessibility Explained:Touch Exploration Augmented By Gestures</h1>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'><span class='section-number-2'>1</span> Jelly Bean Accessibility Explained:Touch Exploration Augmented By Gestures</h2>
<div id='text-1' class='outline-text-2'>
<p>
We announced a number of accessibility enhancements in Android
Jelly Bean — see our <a href='http://eyes-free.blogspot.com/2012/06/what-new-in-google-accessibility-from.html'>Google IO 2012</a> announcements and our
<a href='http://www.youtube.com/watch?v=q3HliaMjL38'>Android Accessibility</a> talk from I/O 2012. This article gives a
user-centric overview of the Jelly Bean interaction model as
enabled by touch exploration and navigational gestures. Note that
as with every release, Android Accessibility continues to
evolve, and so as before, what we have today is by no means the
final word.
</p>
</div>
<div class='outline-3' id='outline-container-1-1'>
<h3 id='sec-1-1'><span class='section-number-3'>1.1</span> High-Level Concepts</h3>
<div id='text-1-1' class='outline-text-3'>
<p>
First, here's some shared vocabulary to ensure that we're all
talking about the same thing when explaining Jelly Bean access:
</p>
<dl>
<dt>Random Access</dt><dd>Enable user to reach any part of the
on-screen UI with equal ease. We enabled
this as of ICS with touch exploration.
</dd>
<dt>Deterministic Access </dt><dd>Enable user to reliably land on a
desired item on the screen. We enable this in Jelly Bean
with <i>linear navigation</i>.
</dd>
<dt>Accessibility Focus</dt><dd>The item that the user most recently
interacted with — either via touch exploration or
linear navigation receives <i>accessibility focus</i>.
</dd>
<dt>Activation</dt><dd>User can activate item having <i>accessibility focus</i> by double-tapping <b>anywhere</b> on the
screen.
</dd>
</dl>
</div>
</div>
<div class='outline-3' id='outline-container-1-2'>
<h3 id='sec-1-2'><span class='section-number-3'>1.2</span> Sample Interaction Scenarios We Enable</h3>
<div id='text-1-2' class='outline-text-3'>
<p>
The combination of random access via touch exploration, backed up
by linear navigation starting from the point the user just
explored enables users to:
</p>
<ul>
<li>Touch explore an application to understand its screen layout,
</li>
<li>Use muscle memory to quickly touch parts of the display to
access familiar application screens,
</li>
<li>Use linear navigation to reach the desired item when muscle
memory is wrong by a small amount.
</li>
</ul>
<p>
As an example, when using the Google Play Store, I can use muscle
memory with touch exploration to find the <i>Search</i> button in the
top action bar. Having found an application to install, I can
once again use muscle memory to roughly touch in the vicinity of
the <i>Install</i> button; if what I touch is not the <i>Install</i>
button, I can typically find it with one or two linear navigation
steps. Having found the <i>Install</i> button, I can double-tap
anywhere on the screen.
</p>
<p>
The same use case in ICS where we lacked <i>Accessibility Focus</i>
and linear navigation would have forced me to use touch
exploration exclusively. In instances where muscle memory worked
perfectly, this form of interaction was highly effective; but in
our experience, it also tended to lead to breakdowns and
consequent user frustration in instances where users almost found
the control they were looking for but missed by a small amount.
</p>
<p>
Having introduced accessibility focus and linear navigation in
Jelly Bean, we decided to eliminate the ICS requirement that the
user tap on or near a control to activate it — we now enable
users to activate the item with accessibility focus by tapping
<b>anywhere</b> on the screen. To eliminate spurious taps,
especially on tablets, we made this a double-tap rather than a
single tap. Note: based on user experience, we may bring back
single tap at some point in the future as an end-user
customization.
</p></div>
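<p>A minimal sketch of the double-tap logic described above, assuming a simple time-window check --- the threshold and names are illustrative, not the platform implementation:</p>

```java
// Hypothetical sketch of the double-tap requirement: a tap only activates
// when it follows a previous tap within a short window, so isolated
// (spurious) taps are ignored. The threshold is illustrative.
class DoubleTapDetector {
    static final long DOUBLE_TAP_WINDOW_MS = 300;
    private long lastTapTime = -1; // -1: no pending first tap

    // Returns true when this tap completes a double tap.
    boolean onTap(long timeMs) {
        boolean isDouble = lastTapTime >= 0
                && (timeMs - lastTapTime) <= DOUBLE_TAP_WINDOW_MS;
        lastTapTime = isDouble ? -1 : timeMs; // a double tap resets the state
        return isDouble;
    }
}
```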
</div>
<div class='outline-3' id='outline-container-1-3'>
<h3 id='sec-1-3'><span class='section-number-3'>1.3</span> Summary</h3>
<div id='text-1-3' class='outline-text-3'>
<p>
Android Accessibility continues to move forward with Jelly Bean,
and will continue to evolve rapidly along with the
platform. Please use the <a href='http://groups.google.com/group/eyes-free'>Eyes-Free Google Group</a> to provide
constructive feedback on what works or doesn't work for you;
the most effective feedback objectively describes a given use
case and your particular experience.
</p>
</div>
</div>
</div>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com306tag:blogger.com,1999:blog-9106705776667990876.post-86499263432203885302012-06-29T17:37:00.001-07:002012-07-02T19:14:20.459-07:00What's New In Google Accessibility From Google I/O 2012
<div xmlns='http://www.w3.org/1999/xhtml'>
<div id='preamble'>
</div>
<div id='content'>
<h1 class='title'>Google IO 2012: What's New With Google Access
</h1>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'>
<span class='section-number-2'>1
</span> Google IO 2012: What's New From Google Access
</h2>
<div id='text-1' class='outline-text-2'>
<p>We showcased a number of exciting advances in accessibility on Android and Chrome during IO 2012. With these advances, blind and low-vision users can leverage Google applications and services on Android and Chrome to collaborate effectively with their peers and to obtain on-the-go access. Advances include out-of-the-box access on Android (JellyBean), a new set of gestures that enable fluent interaction on touch-screen devices, Braille support on Android, an extension framework for ChromeVox, and a new, high-quality voice for use with Web applications on Chrome.
</p>
</div>
<div class='outline-3' id='outline-container-1-1'>
<h3 id='sec-1-1'>
<span class='section-number-3'>1.1
</span> Enhanced Android Accessibility In JellyBean:
</h3>
<div id='text-1-1' class='outline-text-3'>
<ul>
<li>Accessibility can be enabled during initial setup by
long-pressing on the setup screen with two fingers for about four seconds, providing out-of-the-box access for blind users.
</li>
<li>Touch exploration has been enhanced with simple gestures that enable users to navigate on-screen content.
</li>
<li>JellyBean provides a set of Accessibility Actions that can be called from any AccessibilityService such as TalkBack; it also provides early support for Braille displays.
</li>
<li>Touch exploration and gesture navigation both set
<i>AccessibilityFocus</i>; double-tapping anywhere on the screen
activates the item with <i>AccessibilityFocus</i>.
</li>
<li>TalkBack now has a sister service
<i>BrailleBack</i> for providing Braille support on Android.
</li>
<li>Chrome on Android is now accessible and supports the latest Web accessibility standards.
</li>
</ul>
<p>With these enhancements in Android access, blind users can use a combination of touch exploration and navigational gestures to access any part of the Android user interface.
</p>
<p>As an example, I typically use the Android Play Store by touching the screen around the area where I expect a specific control; quick flicks of the finger then immediately get me to the item I want. With these touch gestures in place, I now use touch exploration to learn the layout of an application; with applications that I use often, I use a combination of
<i>muscle memory
</i> and gesture navigation for rapid task completion.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1-2'>
<h3 id='sec-1-2'>
<span class='section-number-3'>1.2
</span> Chrome OS On Chrome Books And Chrome Box
</h3>
<div id='text-1-2' class='outline-text-3'>
<p>Chrome OS comes with ChromeVox pre-configured — ChromeVox is our Web Accessibility solution for blind users. With the new high-quality voice that is being released on the Chrome Webstore, ChromeVox now provides smooth spoken feedback in Chrome on all desktop environments. Finally, we announced a flexible extension framework that enables Web developers to leverage ChromeVox from within and outside of their own Web applications to provide rich, contextual spoken feedback.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1-3'>
<h3 id='sec-1-3'>
<span class='section-number-3'>1.3
</span> Developer Tools For Ensuring Accessibility
</h3>
<div id='text-1-3' class='outline-text-3'>
<p>To help developers better leverage Web Accessibility, we are releasing a new Accessibility Audit tool that enables Web developers to detect and fix commonly occurring accessibility errors. This tool has been integrated into Chrome's Developer Tools and helps Web developers ensure accessibility while working within their normal workflow.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1-4'>
<h3 id='sec-1-4'>
<span class='section-number-3'>1.4
</span> Accessibility Related Presentations At Google I/O 2012
</h3>
<div id='text-1-4' class='outline-text-3'>
<p>Catch these on YouTube in the next week if you weren't able to attend I/O this week.
</p>
<ul>
<li>Android Accessibility (T. V. Raman, Peter Lundblad, Alan Viverette and Charles Chen).
</li>
<li>Advanced Web Accessibility (Rachel Shearer, Dominic Mazzoni and Charles Chen).
</li>
<li>What's New In JellyBean: Android Team.
</li>
<li>JellyBean announcement in the Wednesday keynote.
</li>
</ul>
</div>
</div>
</div>
</div>
<div id='postamble'>
<p class='date'>Date: 2012-06-21 Thu
</p>
<p class='author'>Author: T.V Raman
</p>
<p class='creator'>Org version 7.8.11 with Emacs version 24
</p>
<a href='http://validator.w3.org/check?uri=referer'>Validate XHTML 1.0
</a>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com233tag:blogger.com,1999:blog-9106705776667990876.post-47707534690164662642011-08-10T12:01:00.001-07:002011-08-10T12:06:27.998-07:00Accessible GMail On Android ---- Eyes-Free Email On The Go!
<div id='content'>
<h1 class='title'>Accessible GMail On Android — Eyes-Free Email On The Go</h1>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'><span class='section-number-2'>1</span> Accessible GMail On Android — Eyes-Free Email On The Go </h2>
<div id='text-1' class='outline-text-2'>
<p>
I've been using Android as my primary smart phone since late
2008, and the level of email access on Android has long been a
source of frustration. About a year ago, I first
started accessing my email on Android with K9-Mail — that
helped me bridge some of the accessibility gaps on the platform.
</p>
<p>
Over the last few months, our friends over in GMail Mobile have
been adding accessibility support to the GMail client on
Android. What is truly exciting is that this support is being
added to existing releases of Android including Froyo (Android
2.2) and GingerBread (Android 2.3). This means that GMail on
Android is now accessible on existing devices — get the update
from Market and give it a spin.
</p>
</div>
<div class='outline-3' id='outline-container-1.1'>
<h3 id='sec-1.1'><span class='section-number-3'>1.1</span> Typical Usage Pattern </h3>
<div id='text-1.1' class='outline-text-3'>
<p>
Here is my typical usage pattern when accessing my corporate
email at Google. Note that the volume of email I receive on this
account is extremely high, and includes many mailing lists that
I typically do not read while on a mobile device. To limit how
much email I download to the mobile device, and to ensure that I
attend to the most pressing email messages while on the go I do
the following:
</p>
<ul>
<li>
I have defined a GMail filter that assigns label <span style='text-decoration:underline;'>to-mobile</span>
to messages I want to access when on the go.
</li>
<li>
Typically, this includes email addressed directly to me, and
other priority items.
</li>
<li>
I launch GMail to open to this label.
</li>
<li>
I quickly skim through the list of displayed messages to hear
the subject and a quick overview of the message.
</li>
<li>
If I decide to read the complete message, I select that
message via the trackball on my Nexus One to hear the
message in its entirety.
</li>
<li>
And finding an email thread I am looking for is just one
click away — press the search button, and use up/down to
navigate your search history.
</li>
</ul>
<p>See our <a href='http://www.google.com/support/mobile/bin/answer.py?hl=en&answer=1350419&topic=21233'>help center documentation</a> for additional details.
</p>
</div>
</div>
</div>
<div id='postamble'>
<p class='author'> Author: T.V Raman
<a href='mailto:raman@google.com'><raman@google.com></a>
</p>
<p class='date'> Date: 2011-08-10 Wed</p>
<p class='creator'>HTML generated by org-mode 6.30c in emacs 23</p>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com379tag:blogger.com,1999:blog-9106705776667990876.post-11597475185450185922011-05-16T17:20:00.001-07:002011-05-16T17:20:37.253-07:00Leveraging Android Access From Google IO 2011
<div xmlns='http://www.w3.org/1999/xhtml'>
<div>
<p>You can watch our <a href='http://www.youtube.com/watch?v=BPXqsPeCneA'>Google IO
2011</a> talk on Leveraging Android Access APIs. The main take-aways
from the talk:</p>
<ul>
<li>Android Access is easy --- the framework does most of the
heavy-lifting.</li>
<li>Implementing Android Access does not mean you take a
performance hit.</li>
<li>Accessibility is really about expanding the reach of your
application.</li>
</ul>
<p>Implementing accessibility within your application and thereby
ensuring that it is usable in a wide variety of end-user
scenarios will benefit your application --- both in terms of
the number of users you gain, as well as how often your users
use your application.</p>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com148tag:blogger.com,1999:blog-9106705776667990876.post-87826768398365452442011-03-21T17:42:00.001-07:002011-03-21T17:42:28.035-07:00TalkBack Refreshed: Accessible On-Screen Keyboard And More ...
<div xmlns='http://www.w3.org/1999/xhtml'>
<div id='content'>
<h1 class='title'>Android Access: TalkBack Refreshed</h1>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'><span class='section-number-2'>1</span> Android Access: TalkBack Refreshed </h2>
<div id='text-1' class='outline-text-2'>
<p>
The latest enhancements to TalkBack bring Android
Accessibility to devices without a physical keyboard. Many of
these enhancements also improve the overall TalkBack experience
on all devices.
</p>
</div>
<div class='outline-3' id='outline-container-1_1'>
<h3 id='sec-1_1'><span class='section-number-3'>1.1</span> Highlights </h3>
<div id='text-1_1' class='outline-text-3'>
<ul>
<li>
New <i>TalkBack Keyboard</i>.
</li>
<li>
On-screen talking keyboard enables text entry via the touch screen.
</li>
<li>
Text review provides spoken feedback when moving the cursor by character, word, sentence, or paragraph.
</li>
<li>
Virtual D-Pad for navigating the Android user interface.
</li>
<li>
Global TalkBack commands enable one-click access to oft-used commands.
</li>
</ul>
</div>
</div>
<div class='outline-3' id='outline-container-1_2'>
<h3 id='sec-1_2'><span class='section-number-3'>1.2</span> TalkBack Keyboard </h3>
<div id='text-1_2' class='outline-text-3'>
<p>
The <i>TalkBack Keyboard</i> is an Accessible Input Method (Accessible
IME) that when activated enables you to enter and review text via
the touch screen. To use this feature, you need to first
<i>activate</i> the TalkBack keyboard via the <i>Language and Keyboard</i>
option in the <i>Settings</i> menu. Next, customize the <i>TalkBack Keyboard</i> to taste via the <i>TalkBack Keyboard Settings</i> option
--- here, you can customize additional features including
auditory feedback as you type. Finally, open your favorite
editing application, long-press on an edit field, and select
<i>TalkBack keyboard</i> as your default IME. Note that you need to do
this only once; once the TalkBack keyboard has been made the
default, it persists across reboots.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1_3'>
<h3 id='sec-1_3'><span class='section-number-3'>1.3</span> Entering Text On The Touch Screen </h3>
<div id='text-1_3' class='outline-text-3'>
<p>
<i>TalkBack keyboard</i> is an on-screen keyboard that supports touch
exploration along with synchronized spoken and auditory
feedback. This means you can now enter text when using devices
that don't sport a physical keyboard.
</p>
<p>
But wait, there's more here than meets the finger at first touch.
Once you have activated the <i>TalkBack Keyboard</i>, you can switch
the keyboard among three states by long-pressing the volume
up/down buttons:
</p>
<dl>
<dt>Hidden</dt><dd>
The <i>TalkBack</i> keyboard is not displayed.
</dd>
<dt>Navigating</dt><dd>
You get access to an on-screen virtual D-Pad, along
with <span style='text-decoration:underline;'>Back</span>, <span style='text-decoration:underline;'>Home</span>, <span style='text-decoration:underline;'>Search</span>, and <span style='text-decoration:underline;'>Menu</span> buttons.
</dd>
<dt>Typing</dt><dd>
An on-screen <span style='text-decoration:underline;'>qwerty</span> keyboard.
</dd>
</dl>
<p>
My preferred means of using the keyboard is to turn on auditory
feedback from within <i>TalkBack Keyboard Settings</i>, as well as
having SoundBack active. In this mode, you hear keys as you
explore the keyboard along with an auditory icon; picking up your
finger types the last key you explored. Typing produces a
distinctive key-click.
</p>
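<p>The explore-then-lift typing behavior can be sketched in plain Java --- a hypothetical model, not the TalkBack source:</p>

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative plain-Java model (not the TalkBack source) of explore-then-
// lift typing: keys are spoken as the finger moves over them, and lifting
// the finger enters the last key that was explored.
class ExploreByTouchKeyboard {
    private final StringBuilder typed = new StringBuilder();
    private char lastExplored = 0;                    // 0: no key under the finger
    final List<Character> spoken = new ArrayList<>(); // stand-in for TTS output

    void onFingerOver(char key) {  // exploring speaks the key without typing
        lastExplored = key;
        spoken.add(key);
    }

    void onFingerUp() {            // lifting types the last explored key
        if (lastExplored != 0) typed.append(lastExplored);
        lastExplored = 0;
    }

    String typedText() { return typed.toString(); }
}
```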
<p>
The on-screen keyboard occupies the bottom 1/3 of your
screen. While entering text, explore and find the top row, then
move above it to hear what you have typed so far.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1_4'>
<h3 id='sec-1_4'><span class='section-number-3'>1.4</span> Reviewing Text By Character, Word, Sentence Or Paragraph </h3>
<div id='text-1_4' class='outline-text-3'>
<p>
You can now navigate and review text by character, word, sentence
or paragraph. Use a two-finger tap to move forward through these
navigation levels; a two-finger double tap moves in the reverse
direction. Once you have selected your preferred mode of
navigation, you can use Up/Down on the physical track-ball/D-Pad,
or alternatively, flick up or down on the virtual D-Pad to move
forward or backward through the text being reviewed.
</p>
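<p>Cycling through the navigation levels with two-finger taps can be modeled as a simple sketch --- the names here are illustrative, not the TalkBack implementation:</p>

```java
// Illustrative sketch (not the TalkBack implementation) of cycling the
// text-review granularity: a two-finger tap moves forward through the
// levels, a two-finger double tap moves backward.
class ReviewGranularity {
    private static final String[] LEVELS =
        {"character", "word", "sentence", "paragraph"};
    private int level = 0;

    String current() { return LEVELS[level]; }

    String twoFingerTap() {        // forward through navigation levels
        level = (level + 1) % LEVELS.length;
        return current();
    }

    String twoFingerDoubleTap() {  // reverse direction
        level = (level + LEVELS.length - 1) % LEVELS.length;
        return current();
    }
}
```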
<p>
Note that text review works when the <i>TalkBack keyboard</i> is in
either <i>navigating</i> or <i>typing</i> mode; personally, I find it less
error-prone on keyboard-less devices to first switch to
<i>navigating mode</i> when reviewing text, since it is easy to
inadvertently enter spurious text otherwise.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1_5'>
<h3 id='sec-1_5'><span class='section-number-3'>1.5</span> Using The On-Screen Virtual D-Pad </h3>
<div id='text-1_5' class='outline-text-3'>
<p>
Placing the TalkBack keyboard in <i>navigating mode</i> provides an
on-screen virtual D-Pad --- this is especially useful on devices
that do not have a physical D-Pad or track-ball on the front of the
device. When active, the virtual D-Pad occupies the bottom
one-third of the screen, and fast flicks in that area have the
same effect as moving with a D-Pad or track-ball. Tapping
anywhere within the virtual D-Pad is the same as clicking with
the track-ball.
</p>
<p>
The corners of the virtual D-Pad also provide <span style='text-decoration:underline;'>Back</span>, <span style='text-decoration:underline;'>Home</span>,
<span style='text-decoration:underline;'>Search</span> and <span style='text-decoration:underline;'>Menu</span> buttons --- these are especially useful on
devices that lack explicit physical or capacitive buttons for
these common Android actions. You can explore the virtual D-pad
by moving your finger around the D-Pad area; crossing the
top-edge of this area provides haptic and auditory feedback that
can be used as an orientation aid in finding the virtual buttons
on the corners.
</p>
</div>
</div>
<div class='outline-3' id='outline-container-1_6'>
<h3 id='sec-1_6'><span class='section-number-3'>1.6</span> Global Commands </h3>
<div id='text-1_6' class='outline-text-3'>
<p>
In addition, selecting the <i>TalkBack Keyboard</i> as your default
input method enables a set of <i>global commands</i> that can be
accessed from your physical keyboard --- eventually, we will make
these available via the soft keyboard as well. Here is a list of
the current commands:
</p>
<table frame='hsides' rules='groups' cellpadding='6' cellspacing='0' border='2'>
<caption/>
<colgroup><col class='left'/><col class='left'/><col class='left'/>
</colgroup>
<thead>
<tr><th class='left' scope='col'>Command</th><th class='left' scope='col'>Description</th><th class='left' scope='col'>Key</th></tr>
</thead>
<tbody>
<tr><td class='left'>Battery</td><td class='left'>Speaks the current battery level</td><td class='left'>menu + B</td></tr>
<tr><td class='left'>Time</td><td class='left'>Speaks the current date and time</td><td class='left'>menu + T</td></tr>
<tr><td class='left'>Connectivity</td><td class='left'>Speaks the connectivity state of each connection: WiFi, 3G, etc</td><td class='left'>menu + O</td></tr>
<tr><td class='left'>Repeat</td><td class='left'>Repeats the last TalkBack utterance</td><td class='left'>menu + R</td></tr>
<tr><td class='left'>Spell</td><td class='left'>Spells the last TalkBack utterance</td><td class='left'>menu + S</td></tr>
</tbody>
</table>
<p>
These shortcuts are listed in the <i>Accessibility Preferences</i>
application where they can be edited. You can choose between
menu and search for the modifier, and any letter on the keyboard
for the letter.
</p>
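<p>The command table above could be modeled as a simple dispatch map --- a hypothetical sketch, not the TalkBack source:</p>

```java
import java.util.Map;

// Hypothetical dispatch table mirroring the command list above; the
// class, method, and map names are illustrative, not the TalkBack source.
class GlobalCommands {
    private static final Map<String, String> COMMANDS = Map.of(
        "menu+B", "Speaks the current battery level",
        "menu+T", "Speaks the current date and time",
        "menu+O", "Speaks the connectivity state of each connection",
        "menu+R", "Repeats the last TalkBack utterance",
        "menu+S", "Spells the last TalkBack utterance");

    // The modifier (menu or search) and letter are both user-editable.
    static String dispatch(String modifier, char letter) {
        String key = modifier + "+" + Character.toUpperCase(letter);
        return COMMANDS.getOrDefault(key, "Unhandled");
    }
}
```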
</div>
</div>
<div class='outline-3' id='outline-container-1_7'>
<h3 id='sec-1_7'><span class='section-number-3'>1.7</span> Summary </h3>
<div id='text-1_7' class='outline-text-3'>
<p>
All of these features work on Android 2.2 and above. In addition,
TalkBack makes WebView accessible in Honeycomb --- look for a
separate announcement about accessibility enhancements that are
exclusive to the Honeycomb release in the coming weeks.
</p>
</div>
</div>
</div>
<div id='postamble'>
<p class='author'> Author: T.V Raman
</p>
<p class='date'> Date: 2011-03-16 Wed</p>
<p class='creator'>HTML generated by org-mode 7.4 in emacs 24</p>
</div>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com161tag:blogger.com,1999:blog-9106705776667990876.post-4988508466810345372011-01-20T14:35:00.001-08:002011-01-20T14:44:31.087-08:00Eyes-Free Shell Refreshed <div xmlns='http://www.w3.org/1999/xhtml'>
<p>We just refreshed
<em>Eyes-Free Shell
</em> on Android Market with a long-overdue set of improvements that have been waiting to launch. Here is a brief summary of user-visible changes:
</p>
<dl>
<dt>User Customizable Home Screen
</dt>
<dd>
<p>You can now add additional pages of short-cuts to the home screen. You can flip through these pages of short-cuts by tapping the left or right edge of the screen. Pressing the
<code>menu
</code> key within a page of shortcuts allows you to customize the short-cuts on that page; it also provides controls for inserting new short-cut pages.
</p>
</dd>
<dt>One-Click Uninstall
</dt>
<dd>
<p>The default way of managing applications in Android requires many clicks through nested menus --- this is especially true when uninstalling applications. The Eyes-Free Shell now lets you uninstall applications by pressing
<code>menu
</code> while in the
<em>applications list
</em>.
</p>
</dd>
<dt>I18N
</dt>
<dd>
<p> Added Spanish and Chinese strings for the Eyes-Free Shell.
</p>
</dd>
</dl>
<p>And many more underlying changes too numerous to fit in this margin. Speak, Listen, And Enjoy!
</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com110tag:blogger.com,1999:blog-9106705776667990876.post-41191376362586472272011-01-13T14:48:00.001-08:002011-01-13T14:48:40.868-08:00Intersection Explorer --- Now Intersections Sound Even Better
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>We just updated <em>Intersection Explorer</em> on Android
Market.
This version improves on the initial launch by providing more
intuitive descriptions of intersections. Here are some
examples:</p>
<dl>
<dt>T-Intersection</dt>
<dd>
<p><em>Minor Street</em> ends in <em>Main Street</em> to
form a T-intersection. Depending on where you explore from, you hear:</p>
<ul>
<li>Currently at <em>Minor Street</em> ends in <em>Main
Street</em></li>
<li>Currently at <em>right on to Minor Street</em> from <em>Main
Street</em>.</li>
<li>Currently at <em>left on to Minor Street</em> from <em>Main
Street</em>.</li>
</ul>
</dd>
<dt>Plus-Intersection</dt>
<dd><p>Given the 4-way intersection of <em>Castro Street</em>
and <em>El Camino</em>, you hear one of the following depending
on the direction you're exploring:</p>
<ul>
<li>Currently at <em>Castro Street</em> crosses <em>El
Camino</em></li>
<li>Currently at <em>El Camino</em> crosses <em>Castro
Street</em>.</li>
</ul>
</dd>
</dl>
<p>And a lot more than will fit this margin --- explore, share
and enjoy!</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com38tag:blogger.com,1999:blog-9106705776667990876.post-8114944106308424762010-10-08T14:53:00.001-07:002010-10-14T14:49:19.486-07:00Walking About With A Talking Android<div xmlns='http://www.w3.org/1999/xhtml'><div id='content'><h1 class='title'>Walking About With A Talking Android </h1><div class='outline-2' id='outline-container-1'><h2 id='sec-1'><span class='section-number-2'>1</span> Walking About With a Talking Android </h2><div id='text-1' class='outline-text-2'><p>I have long relied on spoken directions from <a href='http://googleblog.blogspot.com/2006/12/speech-friendly-textual-directions.html'>Google Maps</a> on the desktop. As I access more and more of my online world through my Android phone, Google's recent announcement of <a href='http://googlemobile.blogspot.com/2010/09/walk-this-way.html'>GMM4.5</a> enhanced with walking directions means that I now have superior functionality to what I have enjoyed at my desk --- but now with the added benefit of having it all in my pocket!</p><p>Inclusion of step-by-step walking directions on Android now allows me to specify a destination on my <a href='http://google-opensource.blogspot.com/2009/10/talkback-open-source-screenreader-for.html'>TalkBack</a> enabled<a href='http://eyes-free.googlecode.com'>eyes-free</a> Android device, and have these spoken to me as I walk. But wait, there's more!</p><p>We're launching a new member of our Eyes-Free family of programs for Android --- WalkyTalky that goes hand-in-hand with spoken walking directions from Google Maps to better navigate the physical world. 
In addition, the Intersection Explorer application allows me to explore the layout of streets using touch before venturing out with WalkyTalky.</p></div><div class='outline-3' id='outline-container-1_1'><h3 id='sec-1_1'><span class='section-number-3'>1.1</span> WalkyTalky </h3><div id='text-1_1' class='outline-text-3'><p>WalkyTalky is an Android application that speaks the address of nearby locations as you pass them. It also provides more direct access to the walking directions component of Google Maps. With WalkyTalky installed, you can:</p><ul><li>Launch WalkyTalky to specify a destination,</li><li>Specify the destination by address, or pick from favorites or recently visited locations,</li><li>In addition to spoken walking directions, hear street addresses as you walk by.</li></ul><p>These spoken updates, in conjunction with the walking directions spoken by Google Maps, help me navigate the physical world as efficiently as I navigate the Internet. </p></div></div><div class='outline-3' id='outline-container-1_2'><h3 id='sec-1_2'><span class='section-number-3'>1.2</span> Intersection Explorer </h3><div id='text-1_2' class='outline-text-3'><p>Often, I like exploring a neighborhood to learn the layout of the streets before actually venturing out with my trusty companion, <a href='http://emacspeak.sourceforge.net/raman/hubbell-labrador/hubbell.jpg'>Hubbell Labrador</a>, and this is where Intersection Explorer comes into its own. 
Using this application, I can explore any neighborhood on Google Maps via touch exploration.</p></div><div class='outline-4' id='outline-container-1_2_1'><h4 id='sec-1_2_1'><span class='section-number-4'>1.2.1</span> How It Works </h4><div id='text-1_2_1' class='outline-text-4'><ul><li>Intersection Explorer starts off at the user's current location.</li><li>One can change the start position by entering an address; to do this, press <i>menu</i> and click on <i>new location</i>.</li><li>Once the map has loaded, touching the screen speaks the streets at the nearest intersection.</li><li>Moving one's finger along a compass direction, and then tracing a circle, speaks each street at that intersection along with the associated compass direction.</li><li>The presence of streets is cued by a slight vibration as one traces the circle.</li><li>Lifting the finger while on a street moves in that direction to the next intersection, speaks the distance moved, and finally speaks the newly arrived-at intersection.</li></ul></div></div></div><div class='outline-3' id='outline-container-1_3'><h3 id='sec-1_3'><span class='section-number-3'>1.3</span> Summary </h3><div id='text-1_3' class='outline-text-3'><p>Together, Intersection Explorer and WalkyTalky, in conjunction with Walking Directions from Google Maps, bring a new level of access to my physical world. 
I use these tools in conjunction with other Maps-based applications such as the Places Directory on Android --- this is another application from the Google Maps team that works fluently with TalkBack on Android to help me find nearby attractions or other locations of interest.</p><p>So next time you take your trusty Android out for a walk, make sure to give these new tools a spin --- you can report back on your experience via our <a href='http://eyes-free.googlegroups.com'>Eyes-Free Group</a>.</p><p><blockquote>Applications WalkyTalky and Intersection Explorer can be downloaded from the Android Market.Share And Enjoy, and as usual, remember, The Best Is Yet To Come!</blockquote></p></div></div></div><div id='postamble'><p class='author'> Author: T.V Raman</p><p class='date'> Date: 2010-09-09 Thu</p><p class='creator'>HTML generated by org-mode 7.01 in emacs 24</p></div></div> </div><br /><br /><p><br />QR Code for WalkyTalky:<br /><img src="http://chart.apis.google.com/chart?cht=qr&chs=135x135&chl=market%3a%2f%2fdetails%3fid%3dcom.googlecode.eyesfree.walkytalky" alt="QR code for WalkyTalky"></img><br /><br><br />QR Code for Intersection Explorer:<br /><img src="http://chart.apis.google.com/chart?cht=qr&chs=135x135&chl=market%3a%2f%2fdetails%3fid%3dcom.google.android.marvin.intersectionexplorer" alt="QR code for Intersection Explorer"></img><br /></p>T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com346tag:blogger.com,1999:blog-9106705776667990876.post-77466028830059042082010-09-09T08:45:00.001-07:002010-09-09T17:15:31.285-07:00TalkBack, Eyes-Free Shell Refreshed --- Now With End-User Settings $
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>We are pushing out a series of updates via Android Market for
TalkBack and the Eyes-Free Shell. Here is a brief overview of
end-user visible changes.
</p>
<h2>Accessibility Preferences
</h2>
<p>Going by the principle of
<em>things should just work as expected
</em>, we have long resisted having a complex set of user preference settings for TalkBack and friends --- in my experience, if a settings menu is introduced early on, software engineers tend to punt on complex decisions by turning each question into a user-facing dialog. That said, it is now time to gradually introduce end-user settings for some aspects of the various accessibility tools.
</p>
<p>Welcome new application
<em>AccessibilityPreferences
</em> to Android. What this application does:
</p>
<ul>
<li>From an end-user perspective, it provides a single place where you will find preference settings corresponding to each accessibility tool you have installed on your phone.
</li>
<li>For developers of accessibility tools, it provides a simple means of registering a custom program for managing end-user preferences for that tool.
</li>
</ul>
<p>TalkBack installs its user preferences under this tool. You can tweak a number of settings that affect TalkBack behavior including:
</p>
<ul>
<li>Control whether TalkBack speaks when the screen is off --- useful for silencing status messages while the display is off.
</li>
<li>Control whether TalkBack speaks when ringer volume is set to 0, i.e., phone is in silent mode.
</li>
<li>Control whether the proximity sensor is used to shut off speech.
</li>
</ul>
<p>Over time, we'll add more settings here as appropriate --- but expect us to be conservative with respect to how many settings show up.
</p>
<h2>Updates To The Eyes-Free Shell
</h2>
<p>Here is a summary of updates to the Eyes-Free Shell:
</p>
<ul>
<li> Changes the proximity sensor logic so that it is only active when the shell is active; this should be more battery efficient
</li>
<li>Fixes a race condition bug that can trigger when the shell is being exited as an application is being installed/removed
</li>
</ul>
<h2>TalkBack
</h2>
<p>Here is a summary of changes to TalkBack:
</p>
<ul>
<li>TalkBack now includes application-specific speech strategies for some popular applications. This provides context-sensitive spoken feedback.
</li>
<li>Applications that have such speech strategies defined include Facebook, Stitcher and GoogleVoice amongst others.
</li>
<li>Implements a settings screen that can be used with Accessibility Preferences
</li>
<li>Available settings:
<ol>
<li>Ringer Volume (Speak at all ringer volumes, No speech in Silent Mode, No speech in Vibrate and Silent Mode)
</li>
<li>Screen Status (Allow speech when screen is off, No speech when screen is off)
</li>
<li>Speak Caller ID (checked/not checked)
</li>
<li>Proximity Sensor (checked/not checked)
</li>
</ol>
</li>
</ul>
<p>In addition, TalkBack introduces the ability to add application-specific plugins --- expect to see more advancement here in future releases.
</p>
<h2>AccessibilityPreferences Hints For Developers
</h2>
<p>If you're a developer of an AccessibilityService, you need to:
</p>
<ul>
<li>Implement a preferences screen for your application.
</li>
<li> Declare it in your manifest with this intent filter:
<pre>
&lt;intent-filter&gt;
  &lt;action android:name="android.intent.action.MAIN" /&gt;
  &lt;category android:name="android.accessibilityservice.SERVICE_SETTINGS" /&gt;
&lt;/intent-filter&gt;
</pre>
</li>
</ul>
<p>Share And Enjoy,
</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com46tag:blogger.com,1999:blog-9106705776667990876.post-62248021567483829342010-08-24T14:47:00.001-07:002010-08-24T14:47:16.707-07:00Eyes-Free Review: Droid2 From MOT
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>
Here is a quick eyes-free access overview of the MOT Droid2.
</p>
<h3>Hardware</h3>
<ol>
<li> The device has a pull-out keyboard, and the buttons are much
more tactile than the original Droid. </li>
<li>The device also has
dropped the hard-to-use D-Pad from the original Droid in favor of
PC-style arrow keys. </li>
<li>There is once again no dedicated number
row at the top. </li>
<li>The capacitive buttons on the front of the device appear in a
different order from the original Droid --- with the device in
portrait mode, reading left to right you have: Menu, home, back,
and search.</li>
<li>In addition, MOT ships a voice search application on the device
that is triggered by pressing a special <em>microphone button</em> ---
it's worth learning the position of this key: voice search
can be useful --- and more importantly, if you're relying on
spoken feedback, accidentally hitting this button leads to the phone falling
inexplicably silent.</li>
</ol>
<h3>Software</h3>
<p>
If you look under accessibility, you'll find an application
called Voice Readouts from MOT. This appears to be a screenreader
analogous to TalkBack, though in my experience, it did not
produce spoken feedback in many instances. That said, this
application collaborates well with TalkBack --- after
installing TalkBack from the Android Market (note: the Droid2
does not come with TalkBack bundled), you can activate both
TalkBack and Voice Readouts for an optimal experience.</p>
<p>
Voice Readouts appears to have a preliminary version of
touch exploration. With Voice Readouts active, a single tap speaks
the item under the finger; a double-tap activates that
item. Note that moving the finger around on the display does not
appear to trigger touch exploration; also, touch exploration
appears to be available in only some contexts.
</p>
<h4>Instances where touch exploration appears to be active</h4>
<ol>
<li> Settings application.</li>
<li> Portions of Android Market.</li>
</ol>
<p>In general, touch exploration appears to be available in
ListView.</p>
<p>
In addition, the Droid2 includes a low-vision accessibility
tool called Zoom Mode (look for it under Settings ->
Accessibility); this tool provides a magnification lens.
<h3>Summary</h3>
<p>All in all, the Droid2 appears to be one of the better choices
for eyes-free use from among the presently available crop of
Android phones. Touch exploration, though preliminary, is nice to
see on the platform, and the bundled low-vision magnification aid
is a nice touch. Voice Readouts is also a great example of an
Android accessibility service done right in that it co-exists
peacefully with other screenreaders like TalkBack to provide an
optimal end-user experience. To users not familiar with adaptive
technologies in general, this might not sound like a big deal ---
but users of PC screenreaders have long been familiar with the
need to have only one screenreader turned on. As we transition to
modern platforms like Android, it's useful to remind ourselves
that screenreaders can in fact co-exist, with each tool providing
something useful to create an overall experience that is greater
than the sum of the parts.</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com18tag:blogger.com,1999:blog-9106705776667990876.post-43746236964780398142010-07-12T09:55:00.001-07:002010-07-12T09:55:27.788-07:00Welcoming Loquendo Susan To Android (FroYo)
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>
Android 2.2 (AKA FroYo) introduces many platform enhancements,
and one that I find particularly relevant is the ability to
plug-in additional Text-To-Speech engines. What this means from
an end-user point of view:</p>
<ul>
<li>Android comes with a set of built-in voices since Android 1.6
--- these are the <em>Pico</em> voices for English, French, Italian,
German and Spanish.</li>
<li>With the Text-To-Speech plug-in mechanism in place, we can now
add new engines to the platform.</li>
<li>The first such add-on was ESpeak, which brings support for
many of the world's languages.</li>
<li>And now, vendors are able to sell high-quality add-on voices
via the Android Market.</li>
<li>Loquendo Susan is the first commercially available voice for
Android. Users running FroYo can buy this voice on the Android
Market. Thanks to the plug-in mechanism, once you buy a new
voice, you can switch all your talking applications to use the
newly installed voice --- see instructions below.</li>
</ul>
<h2>Activating And Using Newly Installed Voices</h2>
<p> Go to <code>Settings → Voice Input And Output →
Text To Speech Settings</code>. First, activate the newly
installed voice by clicking the corresponding checkbox item for
that voice. Next, go to <code>Default Engine</code> in the
<code>Text To Speech Settings</code> menu, and make the newly
installed voice your default engine. Finally, if you want all
applications to use the new voice, check the option <code>Always use
my settings</code>.</p>
<p>With this in place, my Nexus and Droid both speak using
Loquendo Susan --- thus turning my Android into a truly
pleasant eyes-free device.</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com14tag:blogger.com,1999:blog-9106705776667990876.post-72587039979822962332010-05-25T08:41:00.001-07:002010-06-10T17:49:05.472-07:00Stitcher And TalkBack: The World In My Ears
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>Shortwave radio --- and DXing --- was one of my hobbies growing
up; I spent many hours listening to far-off radio stations,
and in the process developed a love for languages. Fast
forward to the late 90's, and one could now listen to radio
stations from all around the world on the Internet --- but this
time without
the hiss and static of shortwave propagation. But there was a
catch --- you needed to be at your computer to listen to these
stations. At home, I solved this problem by setting up a set of
living room speakers connected to the computer in my
office-bedroom; with a wireless keyboard, this brought Internet radio to my living room.
</p>
<p>Fast-forward to the next decade, and I now have the Internet
in my pocket in the form of a smart phone. I recently discovered
<em>Stitcher
</em> on the Android Market --- and it got me the final mile to
having
ubiquitous access to Internet Radio!
</p>
<h3>Using Stitcher With TalkBack
</h3>
<p>There is little more to say other than
<em>try it out!</em> Stitcher on Android is a simple application that
works out of the box with TalkBack. Once you install Stitcher
from Market, use the arrow keys or trackball on your phone to
browse through the various categories. Clicking on a station launches
playback immediately. Note that for now, the
<em>stop</em> button in the player is not navigable by the trackball --- I
have gotten used to hitting it by dead-reckoning, since it always
appears in a fixed position. In the last few weeks,
<em>Stitcher</em> has replaced
<em>StreamFuriously</em>, my previous Internet Radio solution on Android.
</p>
<p>So here's to happy listening! A brief note on the title of this
post ---
<a href='http://catalogue.nla.gov.au/Record/1370120'>The World In My Ears
</a> was also the title of a book on DXing by Arthur Cushen from
New Zealand --- I remember hearing his voice in the 80's on the BBC's World Service.
</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com11tag:blogger.com,1999:blog-9106705776667990876.post-35509689994521490092010-05-20T08:10:00.001-07:002010-05-20T08:10:26.594-07:00An Eyes-Free View Of Android At The Google IO Sandbox
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>Google IO 2010 is playing host to over 5,000 attendees in San
Francisco this week. A number of Google Access engineers are at
the conference consuming and producing information --- here is a
brief view of some of the exciting bits seen on the Android show
floor from an eyes-free perspective.</p>
<h3> Hardware And New Devices From An Eyes Free Perspective</h3>
<p>Many of the phone manufacturers were showing off their latest
devices on the show floor --- visit the <em>Android Sandbox</em>
at Google IO to see these first hand. Charles and I walked
through the various displays Wednesday (May 19) afternoon to test
drive these devices first-hand --- given the large number of
Android devices coming out every week, this was a unique
opportunity to see many of these devices for the first
time. Here are some highlights:</p>
<ul>
<li> All devices were running Android 1.6 or later, and
consequently, <em>Settings/Accessibility</em> was available on
<strong>every</strong> device. Having worked on this for the last
2 years, it's extremely gratifying to see phone manufacturers
including <em>accessibility</em> in their devices.</li>
<li>We found one device from Motorola where we couldn't find the
accessibility setting --- the booth representative promised to
check after we pointed this out --- waiting to hear back.</li>
<li>My favorite device was the LG Ally --- check this device out
if you get a chance.
<ul>
<li> Device to be sold by Verizon.</li>
<li>Device has an elegant tactual feel.</li>
<li>Front of the device sports hardware answer/hangup
buttons.</li>
<li>The pull-out qwerty keyboard is a pleasure to use --- I would
rate this one of the best designed cell phone keyboards I've
seen.</li>
</ul>
</li>
<li>Android devices continue to show up in many shapes and sizes
--- re-emphasizing that there is a device for everyone. This
makes it even more important to choose a device that meets your
particular needs.</li>
</ul>
<h3>Software --- Android Applications Galore</h3>
<p> We also visited the various vendors showing off their latest
Android applications. What was gratifying was that even though
most of these developers had given little thought to eyes-free
use --- and were blissfully unaware of the existence of an
Android Accessibility API --- their applications worked for the most
part with Accessibility enabled. Where there were gaps, we were
able to show developers what they needed to do --- everyone was
extremely receptive. Below is a brief summary of what we saw ---
and a shout-out to all the friendly developers we
met:</p>
<dl>
<dt>Where</dt>
<dd><p>This is a very accessible application I have been using
for a while --- the developers were thrilled to hear that it was
accessible since they had made no special effort.</p>
</dd>
<dt>Aloqua</dt>
<dd><p>A competing application to <em>Where</em> with a very
slick visual UI. This application doesn't raise the appropriate
Access Events at present because it uses a custom UI. When we first
talked to their lead developer, he was extremely hesitant, saying
<q>I don't want to change my custom UI</q>. However,
I could hear his face light up when we said <q>You don't need to
change your look and feel --- you just need to set a couple of
custom Java properties</q> (specifically, property <code>ContentDescription</code>).</p>
</dd>
<dt>Pandora</dt>
<dd><p>Another favorite of mine that works well with access ---
except --- the player controls are unlabeled. I showed them the
application in action on my Droid --- looking forward to seeing
this application become even more usable.</p></dd>
<dt>NPR News</dt>
<dd><p>There are many NPR tools on the Android Market --- NPR
News is the <em>official</em> application.
The application was originally written by a Googler and Open
Sourced --- I have been using it for about 4 months and it's
completely accessible. It could do with some power-user shortcut
keys to make it even more efficient.</p>
</dd>
<dt>MLB At Bat</dt>
<dd><p> I had originally played with this application during last
year's World Series; at the time, the application was quite
usable with TalkBack. I'm happy to report that nothing has
regressed --- the application still continues to work well,
except for a couple of glitches with unlabeled player
controls.
The booth representatives had actually heard of accessibility ---
and were receptive to fixing the remaining issues.</p>
</dd>
</dl>
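<p>To make the <code>ContentDescription</code> fix mentioned above concrete, here is a sketch of labeling an image-only control in a layout --- the id, drawable, and label text are hypothetical, not taken from any of the applications above.</p>

```xml
<!-- Sketch: an image button with no visible text. The
     android:contentDescription attribute gives screenreaders
     like TalkBack a label to speak; names here are illustrative. -->
<ImageButton
    android:id="@+id/play_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_play"
    android:contentDescription="Play" />
```

<p>From Java code, the equivalent is a call to <code>setContentDescription</code> on the view --- no change to the look and feel required.</p>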
<p>Summary: The light-weight design of the Android Access layer
has proven valuable in making sure that it makes it on to
<strong>every device</strong>. The minimal set of
responsibilities the API places on developers has meant that a
large number of Android applications are accessible out of the box.</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com17tag:blogger.com,1999:blog-9106705776667990876.post-48057379657633252382010-05-18T17:38:00.001-07:002010-05-18T17:44:54.131-07:00Audio Books On Android --- Thanks Librivox!
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>In my
<a href='http://eyes-free.blogspot.com/2010/05/using-android-market-eyes-free.html'>previous
article
</a>, I alluded to an
<em>Audio Books
</em> application for Android. I did not go into much detail on the application itself
because I felt it deserved an article of its own. So here goes!
</p>
<h2>In Praise Of Librivox
</h2>
<p>If you aren't familiar with the Librivox project, please visit
<a href='http://www.librivox.org'>Librivox.org
</a> to see the wonderful work that project is doing. Android application
<em>AudioBooks
</em> brings the wonders of Librivox to Android --- now, you can carry all 30,000 audio books and counting in your pocket and access them
<strong>anywhere</strong>. Here are some highlights:
</p>
<ul>
<li>Browse, and quickly play available audio books. You can browse by several criteria.
</li>
<li>Books you listen to get downloaded to your device and are available for offline listening.
</li>
<li>All books provide a table of contents, allowing you to jump to a specific portion of a book.
</li>
<li>90% of the application user interface is completely accessible with TalkBack --- see below for missing access features.
</li>
</ul>
<p>The only glitch with using application
<em>AudioBooks
</em> with the Android Access API is that the player controls within
the audio-book player are presently missing
<em>content descriptions
</em> --- this is Android-API speak to say that the
controls are images with missing labels. So the first time you
use this app, you'll need someone to tell you what the buttons are ---
alternatively, just experiment to discover their
functions. There are pause, play, rewind and forward buttons ---
if the friendly folk who developed this application stumble upon
this post, please get in touch, and I can show you what you need
to add to your code to make the eyes-free experience even
smoother.
</p>
<p>Happy Listening --- And Share And Enjoy!
</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com198tag:blogger.com,1999:blog-9106705776667990876.post-6462599193779724092010-05-17T15:12:00.001-07:002010-05-17T15:12:08.565-07:00Using Android Market Eyes-Free
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>The Android Market is a treasure-trove of applications ---
many of which work out of the box with Android's Access API, and
as a result, the freely available screenreaders on the
platform. Working with Market can be initially daunting, given
the large collection of applications; additionally, there are a
couple of spots in the workflow that need access
improvements. While we get those fixes pushed, here is a
step-by-step overview of using Android Market with TalkBack,
including the work-arounds for moving over some of the
afore-mentioned hurdles.</p>
<h2>Android Market: A Brief Overview</h2>
<p>Rather than giving a detailed explanation of all of Android
Market's user interface, I'll sketch my day-to-day mode of using
Market --- personally, I find task-oriented help guides far more
usable.</p>
<dl>
<dt>Task: Find Application</dt>
<dd>
<ul><li>I typically launch Android Market from within the
<em>Applications</em> list in the Eyes-Free shell.
On my Droid, I typically do this with the keyboard
already opened since I know I'll be typing very soon.</li>
<li>I press the <em>Search</em> capacitive button on the
bottom right of the display to bring up the search tool. Note that
Market can sometimes take a few seconds to launch depending on
your network --- TalkBack should announce <em>Market</em> when
it's ready.</li>
<li>Type a search query --- as an example, try <em>audio
books</em></li>
<li> Use the D-Pad arrow keys (up/down) to navigate the list of
results. TalkBack speaks each entry as you move through the
list.</li>
<li>Find one you like; for this example, we'll use one of my
favorite Market applications --- AudioBooks from project
Librivox.</li>
<li>Press the <em>Enter</em> key on the keyboard to open this
application</li>
<li>This takes you to a screen that lists a short description,
and comments from various users on the application. The
<em>install</em> button is on the bottom of this screen.</li>
<li>And here comes the sticking point in the Market UI that
we're working on fixing: when you cursor through this list, you
don't always get to the <em>install</em> button. But never fear --- you
can still install the application!</li>
<li>While we work on creating and pushing the fix for the above,
I typically install applications by tapping the screen where the
<em>install</em> button appears. The bad news is that I presently
do this by dead reckoning; the good news is that the
<em>install</em> button always appears at a consistent spot. The
easiest way to learn to do this is to have someone put your
finger on the button the first time, and then learn its position
relative to the pull-out keyboard. While we know that this is not
an ideal eyes-free experience, this little trick opens up a
treasure-trove of applications.</li>
<li>Tap the <em>install</em> button, and you come to the
<em>permissions</em> screen. Cursor to the <em>OK</em> button,
and press <em>Enter</em>. Depending on the layout of that screen,
you may once again need to use dead-reckoning. At this point, I
routinely click those on-screen buttons, rather than wasting time
attempting to cursor to the button.</li>
<li>And voila, the <em>AudioBooks</em> application should
download and install!</li>
</ul>
</dd>
<dt>Task: Browse Market</dt>
<dd>
<p>In addition to searching, you can also browse the Market for
available applications; use the cursor keys on the D-Pad for
browsing. Once you have selected an application, installing it
follows the same workflow as above.</p>
</dd>
</dl>
<h2>And The Best Is Yet To Come</h2>
<p>
Once the application has installed, you can launch it from the
notification that appears when you pull down the status bar.
Look for the next posting in this series for details on using
application AudioBooks --- it is one of my all-time Market favorites.</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com42tag:blogger.com,1999:blog-9106705776667990876.post-39435244015806497932010-02-25T15:34:00.001-08:002010-02-25T15:43:00.690-08:00Eyes-Free: TalkBack And Shell Improvements
<div xmlns='http://www.w3.org/1999/xhtml'><p>Here is a brief
summary of updates to Android's eyes-free tools --- including
TalkBack, and the Eyes-Free Shell --- from the last two
weeks.</p>
<h3>TalkBack</h3>
<ul>
<li> Speech during a phone call is now re-enabled.</li>
<li> Turning the screen on/off is spoken. This announcement includes the ringer mode/volume.</li>
<li> Changes in the ringer mode --- silent, vibrate, and normal --- are now announced.</li>
<li> Unlocking the phone is announced.</li>
<li> Other Android applications can programmatically discover if TalkBack is enabled.</li>
</ul>
<h3>Eyes-Free Shell</h3>
<p>Now that applications can programmatically discover whether TalkBack has
been enabled, configuring the Eyes-Free shell to become your
default home screen has become a lot easier. In a nutshell, if
you are a TalkBack user and install the Eyes-Free shell, hitting
the <code>Home</code> button will bring up the eyes-free shell ---
no configuration needed. Note that you can always get to the
default Android home screen by long-pressing the <code>Back</code> button.</p><p><strong>Share And Enjoy</strong></p> </div> T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com21tag:blogger.com,1999:blog-9106705776667990876.post-73769731063547567052010-02-12T14:24:00.001-08:002010-02-12T14:24:16.154-08:00Eyes-Free Updates: Marvin And TalkBack Simplified
<div xmlns='http://www.w3.org/1999/xhtml'>
<p>We routinely push updates to our access tools on Android; users
get these updates automatically via Android Market updates. We
just pushed out updated versions of TalkBack, our Open Source
screenreader for Android, and Marvin, the Eyes-Free shell. Here
is a brief summary of these updates:</p>
<ul>
<li>Android applications can now programmatically discover if
TalkBack is running, thanks to the latest changes in
TalkBack. From an end-user perspective, this means that you no
longer need to configure Eyes-Free shell via EyesFreeConfig to
be the default home. If you run TalkBack, and have EyesFree
Shell installed, then pressing <em>Home</em> automatically gives
you the EyesFree Shell.
Remember, you can always get to the default Android Home by
long-pressing <em>Back</em>.</li>
<li>EyesFree Shell now includes a touch-based shortcuts
manager. Until now, shortcuts needed to be explicitly configured
by editing an XML file on the SDCard. With the recent EyesFree
update, you can interactively define shortcuts via a touch-based
ShortCuts manager. By default, we have assigned shortcut
<em>1</em> to the ShortCuts manager; so to invoke this new
feature, do:
<ol>
<li>Stroke left (<em>4</em> using stroke dialer notation) to
enter the shortcuts screen.</li>
<li>Stroke up and to the left (<em>1</em> using stroke-dialer
notation) to invoke application ShortCuts Manager.</li>
<li>Use the trackball/D-Pad to configure each of the 8 available
shortcuts.</li>
</ol>
</li>
</ul>
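<p>For readers new to the stroke-dialer notation used above: a stroke's direction picks a digit as though on a 3x3 phone keypad, with 5 at the center. Here is a minimal plain-Java sketch of that mapping --- the class and method names are mine, not part of the Eyes-Free code.</p>

```java
// Illustrative mapping from stroke direction to keypad digit,
// laid out like a 3x3 phone keypad with 5 at the center:
//   1 2 3
//   4 5 6
//   7 8 9
public class StrokeDialer {
    // Directions listed in keypad order, digits 1 through 9.
    private static final String[] DIRECTIONS = {
        "up-left", "up", "up-right",
        "left", "center", "right",
        "down-left", "down", "down-right"
    };

    /** Returns the digit (1-9) for a direction, or -1 if unrecognized. */
    public static int digitFor(String direction) {
        for (int i = 0; i < DIRECTIONS.length; i++) {
            if (DIRECTIONS[i].equals(direction)) {
                return i + 1;
            }
        }
        return -1;
    }
}
```

<p>So a stroke left maps to <u>4</u>, and a stroke up and to the left maps to <u>1</u> --- exactly the two shortcuts walked through above.</p>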
<p><strong>Marvin: We hope this gives some minimal relief to the pain
in all the diodes on your left side.</strong></p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com3tag:blogger.com,1999:blog-9106705776667990876.post-20298324372538799242010-02-08T14:24:00.001-08:002010-02-08T14:24:55.635-08:00Silencing Speech With A Wave Of Your Hand On Android 2.0
<div xmlns='http://www.w3.org/1999/xhtml'>
<h2>Update To Android Access: TalkBack</h2>
<p>Smart phones tend to be short on physical buttons --- even
devices like the G1 or Motorola Droid have very few buttons when the
physical keyboard is not open. This poses interesting
challenges when designing an efficient eyes-free interface ---
especially given the old maxim <em>Speech is silvern, but silence
is golden!</em> Said differently, once you have built a system
that talks back, the first thing you want to build is an
efficient means of silencing spoken feedback.</p>
<p>Early versions of TalkBack on Android skimmed by without a
stop-speech button --- you basically moved from one activity to
another, and the speech produced by the new activity effectively
stopped ongoing spoken output. However, as we make more and more
applications work seamlessly with our Access APIs, it has always
been clear to us that we need a global <em>stop speech
gesture</em>! Notice that I said <em>gesture</em> --- not
<em>key</em> --- stopping speech is a critical function that we'd
like to enable without having to pull out the physical keyboard,
and something we'd like to have on devices without a physical
keyboard.</p>
<p>In the spirit of <em>the dual to every access challenge is an
opportunity to innovate</em>, we recently launched a new
experimental TalkBack feature on devices running Android
2.0. Devices on the Android 2.0 platform have a <em>proximity
sensor</em> on the top front left corner of the phone --- this is
typically used to lock the screen when you're holding the phone
up to your ear during a phone call. As the name implies, the
<em>proximity sensor</em> fires when you get close to it --- you
can activate it by waving your hand close to the top left corner
of the phone. As an experimental feature, we have configured the
latest version of TalkBack to silence ongoing speech if you wave
your hand in front of the proximity sensor.</p>
<p>Note that this is a new, experimental feature --- we welcome
feedback on it via our public <a href='http://groups.google.com/group/eyes-free/'>Eyes-Free Google
Group</a>. We'd like to know if you accidentally activate
<em>stop speech</em> because of this new feature. Having used
it for a few weeks, I find that I am not triggering it
accidentally --- but that might well be a function of how I hold
the phone.</p>
<h2>What Devices Is This Available On?</h2>
<p>Note that at the time of writing, the devices with a
proximity sensor on which I have used this feature include:</p>
<ul>
<li>Motorola Droid from Verizon</li>
<li>Google NexusOne</li>
</ul>
<p>Note that the G1 and other older Android devices did not have
a proximity sensor.</p>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com8tag:blogger.com,1999:blog-9106705776667990876.post-1102069122335942642010-01-22T11:30:00.001-08:002010-01-22T11:30:34.545-08:001Vox --- Your Query Is Our Command
<div xmlns='http://www.w3.org/1999/xhtml'>
<h1 class='title'>Video: 1Vox --- Your Query Is Our Command </h1>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'>1 Video: 1Vox --- Your Query Is Our Command!</h2>
<div id='text-1'>
<p>Device Used: Motorola Droid on Verizon</p>
<p>
Speech interface designers often express surprise at the fact
that the average blind user rarely, if ever, uses spoken input. But
when you come down to it, this is not too surprising --- given
that the eyes-free user has speech output active, the overall system ends up talking to itself!
</p>
<p>
To show that these conflicts can be avoided by careful user-interface design,
we demonstrate 1Vox --- our voice-search wizard for the Marvin Shell.
</p>
<ol>
<li>
You activate 1Vox by stroking <u>9</u> on the Marvin screen.
</li>
<li>
You hear a spoken prompt: <i>Search</i>.
</li>
<li>
You hear a little auditory icon when the system is ready
for you.
</li>
<li>
You speak oft-used queries e.g., <i>Weather Mountain View</i>.
</li>
<li>
You hear a short spoken snippet in response.
</li>
</ol>
<p>We called this widget 1Vox --- in honor of the Google onebox found
on the Google Results page.
</p>
</div>
</div>
<div id='postamble'><p class='author'> Author: T.V Raman
<a href='mailto:raman@google.com'><raman@google.com></a>
</p>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com16tag:blogger.com,1999:blog-9106705776667990876.post-2935752082214569162010-01-22T11:28:00.001-08:002010-01-22T11:28:42.831-08:00YouTube And TalkBack --- Entertainment On The Go
<div xmlns='http://www.w3.org/1999/xhtml'>
<h1 class='title'>Video: TalkBack And YouTube</h1>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'>1 Video: TalkBack And YouTube</h2>
<div id='text-1'>
<p>Device: Motorola Droid on Verizon</p>
<p>
This video demonstrates searching for and playing YouTube videos with TalkBack providing spoken feedback at each step in the interaction.
</p>
<ol>
<li>
Launch YouTube from the Marvin Application launcher.
</li>
<li>
The trackball can be used here to move through the list of videos.
</li>
<li>
Pressing down on the trackball launches the selected video.
</li>
<li>
Press <u>menu</u> key to enter the YouTube application menu.
</li>
<li>
Click on <i>Search</i> with the trackball.
</li>
<li>
Type a query into the edit field. TalkBack speaks as you type.
</li>
<li>
Press <u>Enter</u> to perform the search.
</li>
<li>
Scroll the results list with the track-ball.
</li>
<li>
Click a desired result to start playing the video.
</li>
</ol>
</div>
</div>
<div id='postamble'><p class='author'> Author: T.V Raman
<a href='mailto:raman@google.com'><raman@google.com></a>
</p>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com3tag:blogger.com,1999:blog-9106705776667990876.post-74016979342286559272010-01-22T11:12:00.001-08:002010-01-22T11:12:14.430-08:00Using TalkBack With Google Maps
<div xmlns='http://www.w3.org/1999/xhtml'>
<h1 class='title'>Video: TalkBack And Google Maps </h1>
<div class='outline-2' id='outline-container-1'>
<h2 id='sec-1'>1 Video: TalkBack And Google Maps </h2>
<div id='text-1'>
<p>Device Used: Motorola Droid On Verizon</p>
<p>
TalkBack provides spoken feedback as you use Google Maps.
In this video, we demonstrate typical Maps tasks by walking
through the following steps:
</p>
<ol>
<li>
Launch Google Maps using the Marvin application launcher.
</li>
<li>
From within the Maps application, press the <u>menu</u> key.
</li>
<li>
Select <i>Search</i> and type a query into the search field.
</li>
<li>
Notice that I can type a partial query and have
auto-completion based on previous searches.
</li>
<li>
Press <u>Enter</u> to perform the search.
</li>
<li>
Bring up the result list in <i>ListView</i> by touching the
bottom left of the screen.
</li>
<li>
Scroll through this list using the D-Pad.
</li>
<li>
Click with the D-Pad (or enter) to select a business.
</li>
<li>
Scroll through available options, and click <i>Get Directions</i>.
</li>
<li>
Click the <i>Go</i> button to get directions.
</li>
<li>
Scroll with the trackball to hear the directions spoken.
</li>
</ol>
<p>In addition, you can also use Google Latitude to locate your
friends.
</p>
<p>
Note that other Map tools such as Google Latitude are accessible
from within the set of options that appear when you press the
<u>menu</u> key.
</p>
</div>
</div>
<div id='postamble'><p class='author'> Author: T.V Raman
<a href='mailto:raman@google.com'><raman@google.com></a>
</p>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com9tag:blogger.com,1999:blog-9106705776667990876.post-79961283141139470862010-01-22T11:04:00.001-08:002010-01-22T11:08:46.426-08:00TalkBack: An Open Source Android Screenreader
<div xmlns='http://www.w3.org/1999/xhtml'>
<h1 class='title'>Video: Introducing TalkBack, An Open Source Screenreader
</h1>
<div id='outline-container-1' class='outline-2'>
<h2 id='sec-1'>1 Video: Introducing TalkBack, An Open Source Screenreader
</h2>
<div id='text-1'>
<p>Device Used: Motorola Droid On Verizon</p>
<p>We briefly introduced TalkBack in the previous video while enabling
<i>Accessibility
</i> from the settings menu. Here, we show off some of this screenreader's features.
</p>
<p>TalkBack is designed to be a simple, non-obtrusive screenreader. What this means in practice is that you interact directly with your applications, and not with TalkBack. TalkBack's job is to remain in the background and provide the spoken feedback that you need.
</p>
<p>TalkBack works with all of Android's native user interface controls. This means you can configure all aspects of the Android user interface with TalkBack providing appropriate spoken feedback. What is more, you can use most native Android applications --- including those downloaded from the Android Market --- with TalkBack providing spoken feedback.
</p>
<p>Here are some examples of Android applications (both from Google as well as third-party applications available on Market) that work with TalkBack:
</p>
<ul>
<li>Google Maps: Perform searches, and listen to directions.
</li>
<li>YouTube: Search, browse categories and play.
</li>
<li>Simple Weather: Listen to local weather forecasts.
</li>
<li>Facebook: Moving around on the social Web.
</li>
</ul>
<p>But in this video, we'll demonstrate the use of a very simple but useful Android application --- the Android Alarm clock.
</p>
<ul>
<li> Launch: I launch the alarm clock from Marvin's eyes-free application launcher.
</li>
<li>TalkBack: TalkBack takes over and starts speaking.
</li>
<li>Navigate: Navigating with the trackball speaks the alarm under focus.
</li>
<li>Activate: Activating with the trackball produces appropriate feedback.
</li>
<li>Navigate: The selected alarm displays its settings in a list view, which speaks as we navigate.
</li>
</ul>
</div>
</div>
<div id='postamble'>
<p class='author'> Author: T.V Raman
<a href='mailto:raman@google.com'><raman@google.com>
</a>
</p>
</div>
</div>
T. V. Ramanhttp://www.blogger.com/profile/03589687652590194428noreply@blogger.com9