"google amazon accessibility" - Google News - Wednesday, October 2, 2019 at
2:18 PM
Google's Action Blocks makes Google Assistant more accessible on Android
devices - ZDNet
Google on Wednesday
announced<https://www.blog.google/outreach-initiatives/accessibility/action-blocks/>
a new tool to help users with cognitive disabilities take advantage of Google
Assistant on Android devices. Called Action Blocks, the new tool lets you
create shortcuts for anything the AI-powered assistant can already do on your
device. With Action Blocks, users can accomplish tasks -- such as calling a
family member or booking a ride with Lyft -- with just one tap.
Users can trigger an Action Block by tapping on the corresponding icon on their
device's home screen. A user could choose a picture of a car, for instance, to
prompt Google Assistant to book a ride with Lyft.
Action Blocks takes care of tasks that may seem simple to the average user but
can present a challenge to people with cognitive disabilities such as autism,
Down syndrome, traumatic brain injury (TBI), or dementia. Citing
research from the journal Inclusion, Google notes that there are an estimated
630
million<https://www.aaai.org/ocs/index.php/FSS/FSS15/paper/download/11724/11475>
people in the world with some form of cognitive disability.
Action Blocks is still in the testing phase, and Google is inviting caregivers
and family members of people with cognitive disabilities to join its trusted
tester
program<https://docs.google.com/forms/d/e/1FAIpQLSfcb-l0mCZ__09SSyFAuI_k2WBLR05URYbR_Stv9N42u7GTiw/viewform>.
Google says Action Blocks is the first of multiple efforts to build tools for
people with cognitive disabilities. The company has already leveraged AI to
improve accessibility for users with other types of disabilities. For instance,
earlier this year Google rolled out a tool called Live
Relay<https://www.zdnet.com/article/google-io-from-ai-first-to-ai-working-for-everyone/>,
which helps deaf people use the phone, even if they prefer not to speak.
Other leaders in AI are building similar tools. Amazon recently rolled out a
new Echo Show feature called Show and
Tell<https://www.zdnet.com/article/amazon-makes-the-echo-show-more-helpful-for-the-blind-and-visually-impaired/>,
which helps blind and visually impaired users ask Alexa what household pantry
items they're holding.
Prior and related coverage:
* Amazon makes the Echo Show more helpful for the blind and visually
impaired<https://www.zdnet.com/article/amazon-makes-the-echo-show-more-helpful-for-the-blind-and-visually-impaired/>
* Google I/O: From 'AI first' to AI working for
everyone<https://www.zdnet.com/article/google-io-from-ai-first-to-ai-working-for-everyone/>
* Google: Why Android developers should consider
accessibility<https://www.zdnet.com/article/google-why-android-developers-should-consider-accessibility/>
* Bringing Alexa to seniors: What can it teach us about
tech?<https://www.zdnet.com/article/bringing-alexa-to-seniors-what-can-it-teach-us-about-tech/>
https://www.zdnet.com/article/googles-action-blocks-makes-google-assistant-more-accessible-on-android-devices/
David Goldfield
Assistive Technology Specialist
Feel free to visit my Web site:
WWW.DavidGoldfield.info<http://WWW.DavidGoldfield.info>