The following post was taken from a Facebook group covering assistive
technology.
Erich Manser
Report: Boston Marathon w/Aira virtual guide
For me, the Boston Marathon has always been about pushing myself to my
own physical limits. This year, it was more about pushing technology.
To be clear, the technology we tested was not developed so that someone
who is blind can go out and run safely on their own. More typically, it
is used to help with everyday tasks, like grocery shopping or reading
the mail.
However, can you imagine if it became something that a person with no
sight could use to safely navigate a steadily moving field of 30,000
marathon runners?
This was a step towards that.
THE TECH
Aira – Visual Interpreter for the Blind and Visually Impaired (website
www.aira.io ) is an emerging technology being used to provide access to
information for blind and visually-impaired people.
Currently, the solution consists of a smartphone app (iOS and Android),
a head-mounted camera and a portable mi-fi (mobile wi-fi) device. Many
people also use headphones while using the technology.
The technology works by allowing a remote, sighted agent to access the
video feed coming from the camera of the Aira user, once the user has
activated the Aira app. When connected, the Aira agent can describe the
user’s environment and surroundings, and assist with a number of
visually-oriented tasks.
Through my work in accessibility and assistive technology at IBM
Research, I became an early beta-tester of Aira in 2016 and found it
especially useful for tasks like wayfinding and air travel.
When the idea began circulating about trying to run the Boston Marathon
while using Aira, I jumped at the chance, inspired by innovative friends
like Richard Hunter and Simon Wheatcroft.
On April 17, 2017, it was time to test. We knew these would be extreme
conditions, so expectations were kept to a minimum. We also held to all
of the usual precautions, such as having a human guide, using a tether
and wearing bright “BLIND” and “GUIDE” bibs.
Testing really began before the gun even went off, as ensuring an
accessible end-to-end experience is critical with any assistive
technology. I reminded myself that situational factors, like race
morning stress or my not being fluent with VoiceOver (the iOS
screen-reader), should not count against Aira.
Overall, Aira performed well. Really well, under the circumstances. It
may appear from below that more went wrong than went right, but I’d
suggest that gathering all experiences was really the point of the
exercise (literally, the exercise).
QUALITATIVE RESULTS:
Things that worked great:
* JESSICA at Aira – she’s #1! The human element, especially during
marathon suffering, brought real empathy & compassion to the situation
(it didn’t hurt that Jessica is a runner herself).
* Supplemental information – Virtual guide Jessica was able to ‘fill in
the gaps’ when physical guide David was otherwise occupied, negotiating
a pass, etc. Example: “runners on your right”, “water stop ahead”.
* Battery Life – At the halfway point, where we arranged to hand over
all devices for recharging, I was pleased to learn the Google Glass and
camera still had 50–55% charge and the AT&T mi-fi had 82% (Jessica also
has access to camera and phone charge status). The iPhone, however, was
under 20%, though adjustments like using the screen curtain or turning
off text / email alerts may have prolonged it. The primary Bluetooth
headset I used gave no audible indication of battery life, but we also
had a backup pair, just in case.
* Connectivity – Overall, we can chalk up the stability of both the
audio and video streams as a win. Though challenged during particularly
congested periods (race start, race finish), I was impressed with how
reliably the AT&T mi-fi kept me linked with Jessica, with only minor
hangs or drops.
* Accessible App – The Aira smartphone app was designed with all users
in mind, and is simple, elegant and intuitive. It is one of the best
examples of accessible app design I have seen.
Some opportunities to improve:
* Number of Devices – The Aira team knows this already, but reducing the
number of devices to be managed would help.
* Inaccessible Mi-Fi Device – As it stands now, once a blind or
visually-impaired person powers on the mi-fi device (with no certainty
it’s actually on), the best option is simply to leave it on until the
battery runs out, because the screen is hard to see and it gives no
audible feedback.
* Delicacy of Glass – In pre-race chaos, I was concerned about the
sturdiness of the glasses, carrying them around closed in my hand (to
preserve the battery until race start) through bumping, jostling crowds.
I was especially worried about losing one of the rubberized nose pieces,
which would have made wearing them while running uncomfortable, or
separating the frame from the unit itself (Aira also works with other
wearable manufacturers).
* Glass Location – For some reason, I’ve noticed the Aira app taking
extended periods to locate the camera and glasses, even before race day.
This added stress when the gun went off, and I was still hearing
“finding glass”. I have performed routine app updates in an effort to
address this.
* Added Mental Load – Admittedly, I have a hard time doing math in my
head after about mile 3 of any race. Using Aira to run is a new
experience, but it’s also another outward interaction you need to
manage, while also maintaining focus. I do believe perceived added
mental load will ease somewhat with time and familiarity.
* Volume Challenges – Though I consider the raucous Boston Marathon
conditions to be extenuating circumstances, my ability to hear the Aira
interaction was consistently challenged in this setting. This was
regardless of iPhone volume, headset volume, or which of my 2 headsets I
used. This made things difficult whenever Jessica was relaying
information concurrently with David, or when we passed cheering crowds
(which is much of the time, in Boston).
* Camera Range – Though surely an improvement over my own peripheral
field, the range of the camera being used is somewhat limited. Some
scanning head movements were required in order to give Jessica a fuller
view of the path ahead, which is a technique I also use when navigating
on my own. This created some scenarios where Jessica needed me to point
my head in ways I otherwise wouldn’t have.
* Slight Lag Time – Though we hadn’t previously noticed any significant
lag between Jess giving information and my responding, our training runs
were on less-demanding country roads, and a lag of under one second
became apparent in this split-second setting.
* Image Quality – Distinguishing certain elements using the video feed,
such as a small puddle vs. a manhole cover, may be as difficult at times
for the Aira agent as it is for me naturally.
Possible Next-Level Enhancements:
* Artificial Intelligence – to allow predictive object avoidance
* Audio Enhancement – for greater volume
* Camera Enhancement – for 360 degree viewing
* Computer Vision – to allow for facial recognition
* Dashboard Enhancement – giving access to fitness tracker data
* Image Recognition – for improved obstacle detection
* Sentiment Analysis – allowing for mood detection in others
* Video Enhancement – for video independent of head movement
* Weather-Resistance – to handle heat, rain, sleet or snow
Reflections:
It was exciting to learn that Aira earned recognition among the coolest
wearables at this year’s Boston Marathon, though I notice the article
also mentions other cool tech, like shoes that give haptic feedback or
smart watches that track whatever you need.
In my mind, evolving the Aira solution to be very well-suited for a task
like marathon running may involve a merging of innovations, for an
immersive, augmented experience.
I can think of several budding technologies that could complement the
Aira experience, but some of the most exciting – in my humble, biased
opinion – include work being done at IBM.
It’s been exciting for me to be involved in pushing the limits of Aira
at this year’s Boston Marathon, but it would be especially exciting if
our work at IBM could somehow help bring it to the next level.
Race Results:
Though a much different experience this year, I’m not losing sight (see
what I did?) of the fact that I was healthy enough to participate in the
121st running of the world’s oldest annual marathon.
Official time: 5:11:34
My 8th Boston, it’s an honor and a thrill every time I get to experience
that course and cross that finish line. It’s a race unlike any other.
And as usual, there are so many who helped make it possible.
My love & gratitude to:
Lisa, Ellie and Grace-Margaret – My Girls
David Wei – my friend and ‘human guide’
Jessica Jakeway – my new friend and ‘virtual guide’
The Aira Team – for the unique opportunity
TEAM WITH A VISION – my favorite Boston sports team
Everyone who donated – THANK YOU all so much!
IBM Accessibility Research – for making people the priority
Meg, Alex and Olivia – I love you guys
Curt Cannata – recharging (tech) support on the course
All who came out to cheer – it really keeps us going
All the race volunteers – 9000 of them!
And last, but not least…
All first responders for KEEPING US SAFE
…now on to Ironman training! (October 2017)
--
David Goldfield, Assistive Technology Specialist
Feel free to visit my Web site WWW.DavidGoldfield.Info