Project Eyes-Free aims to enable fluent eyes-free use of mobile devices running Android. Target scenarios range from eyes-busy environments such as driving to users who are unwilling or unable to look at the visual display. You can get a high-level overview of more potential use cases for Eyes-Free from this recent New York Times article. As described in the article, we are releasing components from project Eyes-Free as they become ready for end-user deployment.
Though the underlying source code has been available for some time from our repository on Google Code, we've now posted the first public release of the eyes-free shell to the Android Market. Users of the eyes-free shell can conveniently launch talking applications. Alongside this release, we've also made available a collection of applications that turn Android-powered mobile devices into eyes-free communication devices.
Each of these applications has been written to be useful both to end users and as a means of helping the developer community come up to speed quickly on developing eyes-free applications for Android:
Dialing By Touch
A key innovation is the use of the touch screen for one-handed, eyes-free dialing of phone numbers. The dialer also includes a talking phone book that lets users quickly select a desired contact by touch.
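The shipped dialer's actual gesture handling lives in the project's source repository; the following is only a hypothetical sketch of the relative-positioning idea behind eyes-free touch dialing. It assumes a phone-keypad layout centered on wherever the finger first lands, so no digit has a fixed screen position (the function name, direction table, and dead-zone threshold here are our own illustrative choices, not the project's API):

```python
import math

# Keypad layout relative to the initial touch point: '5' is wherever the
# finger first lands, and the other digits surround it like a phone keypad:
#   1 2 3
#   4 5 6
#   7 8 9
# Direction index 0 = east, counting counter-clockwise in 45-degree sectors.
DIGITS_BY_DIRECTION = ['6', '3', '2', '1', '4', '7', '8', '9']

def digit_for_stroke(dx, dy, dead_zone=20.0):
    """Map a stroke (in screen pixels, y growing downward) to a digit."""
    if math.hypot(dx, dy) < dead_zone:
        return '5'  # a tap, or tiny movement, selects the center digit
    # Negating dy converts screen coordinates (y down) to math coordinates.
    angle = math.degrees(math.atan2(-dy, dx)) % 360.0
    sector = int((angle + 22.5) // 45) % 8
    return DIGITS_BY_DIRECTION[sector]

print(digit_for_stroke(0, -50))  # stroke straight up -> '2'
```

Because every digit is defined relative to the initial touch, the user never needs to find a target on screen, which is what makes the scheme usable without looking.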
Knowing Your Location
This mini-application announces your present location based on information acquired via GPS and the cell network. It speaks your current heading using the built-in magnetic compass, looks up your position on Google Maps, and announces it as a nearby address and street intersection.
Device Status
This mini-application announces useful information such as battery state, signal strength, and the availability of WiFi networks.
Date And Time
This mini-application provides single-touch access to the current date and time.
Over the next few weeks, we will be uploading video tutorials to YouTube that demonstrate how to use these applications. Please see the Eyes-Free project home page for these links as they become available. As always, we welcome your feedback and look forward to hearing from you in our discussion group.
by Charles Chen, Software Engineering Team and T.V. Raman, Research Scientist