Make Tech Easier - Monday, June 4, 2018 at 11:25 AM
How Smartphones, AI and Crowdsourcing Are Helping the Blind and Visually
Impaired
Smartphones aren’t just glowing rectangles full of visual stimuli – they’re a
portable sensor package and computer, and a growing number of companies are
putting them to work for the blind and visually impaired. Some apps are old
news, like text-to-speech and color detection, but some are using the
connective power of the Internet and artificial intelligence to do everything
from navigation to recognizing emotion on people's faces.
Though there is a growing number of apps available for Android, the most
popular platform for assistive apps hands-down is iOS. Thanks to built-in
capabilities like the VoiceOver screen reader, the iPhone attracted far more of
the accessibility-conscious in the beginning, and most similar apps since then
have been developed for it. In the past few years, apps for counting money,
reading light intensity, and reading barcodes have popped up everywhere, but AI
and fast mobile internet have been game-changers.
The Machines: Artificial Intelligence
Text-to-speech and color recognition are useful, but the next generation of
seeing-eye apps is being powered by artificial intelligence. Apps like
Microsoft’s Seeing AI<https://www.microsoft.com/en-us/seeing-ai> and
Envision<https://www.envision.ai/> are going beyond simple scanning and
recognition and using machine learning and neural networks to unlock a whole
new set of tools for visually-impaired users.
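For context, the "simple" color recognition these newer apps go beyond mostly boils down to mapping a camera pixel's RGB value to the nearest named color. A minimal sketch of that idea in Python (the palette and function name here are illustrative assumptions, not taken from any of the apps mentioned):

```python
# Nearest-named-color lookup: the basic idea behind simple color-detection apps.
# This tiny palette is an illustrative assumption, not any real app's data.
PALETTE = {
    "black": (0, 0, 0),
    "white": (255, 255, 255),
    "red": (255, 0, 0),
    "green": (0, 128, 0),
    "blue": (0, 0, 255),
    "yellow": (255, 255, 0),
    "gray": (128, 128, 128),
}

def name_color(rgb):
    """Return the palette name with the smallest squared distance to `rgb`."""
    return min(
        PALETTE,
        key=lambda name: sum((a - b) ** 2 for a, b in zip(PALETTE[name], rgb)),
    )

print(name_color((250, 10, 5)))  # prints "red"
```

A real app samples `rgb` from the live camera frame and speaks the result aloud; AI-driven apps replace the fixed palette lookup with a learned model that can also recognize objects and scenes.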
Seeing AI, for example, can recognize people, guess their age, and give you an
idea of how they’re feeling. Its development team is working on its ability to
tell you what’s going on in a scene as well. It may only be a matter of time
before smartphones can provide continuous real-time narration about a user’s
surroundings.
There is no shortage of companies incorporating AI and machine learning into
accessibility, with projects like TapTapSee<https://taptapseeapp.com/>, AiPoly
Vision<https://www.aipoly.com/>, and even Google
Lookout<https://www.blog.google/topics/accessibility/lookout-app-help-blind-and-visually-impaired-people-learn-about-their-surroundings/>
releasing apps that use phone cameras and sensors to decode the world.
The Humans: Seeing Eyes
As good as AI currently is, it's not foolproof, and it won't be much help in
more complex situations. When a problem requires human judgment, several apps
make it easy for visually-impaired individuals to get in touch with a sighted
volunteer. The
most popular is Be My Eyes<https://www.bemyeyes.com/>, which crowdsources over
a million volunteers to help out 80,000 users. As long as there is a decent
Internet connection, the volunteer can see through the phone’s camera and relay
information to the user.
Aira<https://aira.io/> is an even more sophisticated service, using
video-camera glasses (similar to Google Glass) to give a video feed to an
assistance agent. It comes with a high monthly fee, but the agents are trained,
and at least Google Glass technology came in handy somewhere.
Getting Around: Navigation Apps
While navigation capabilities are increasingly being built into other apps,
there is still a whole class of apps out there specifically meant to help blind
and visually-impaired people get where they need to go. One of the most popular
is BlindSquare<http://www.blindsquare.com/>, an app that not only gives you
“turn left,” “turn right” directions, but describes your surroundings and
points of interest to you.
Microsoft’s Soundscape app does something similar, as do ViaOpta
Nav<https://www.viaopta-apps.com/Viaopta-navigator.html>, Seeing Eye
GPS<http://www.senderogroup.com/products/shopseeingeyegps.html>, and several
others. They offer features like directions delivered by vibration and memos
you can record that play back automatically at set locations, and as AI becomes
more deeply integrated, they will likely provide even more real-time
information.
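The location-triggered memo feature is essentially a geofence check: compare the user's current position against each memo's saved coordinates and play the memo when the distance drops below a threshold. A minimal sketch in Python, using the haversine great-circle formula (the memo structure, coordinates, and 25-meter radius are illustrative assumptions, not any of these apps' actual designs):

```python
import math

def distance_m(p1, p2):
    """Haversine great-circle distance in meters between (lat, lon) points."""
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))  # Earth radius ~6371 km

# Memos saved at specific coordinates; illustrative data only.
memos = [
    {"text": "Bus stop bench is two steps to the right.", "at": (40.7580, -73.9855)},
    {"text": "Door buzzer is left of the handle.", "at": (40.7614, -73.9776)},
]

def memos_nearby(position, radius_m=25):
    """Return the text of every memo saved within `radius_m` of `position`."""
    return [m["text"] for m in memos if distance_m(position, m["at"]) <= radius_m]
```

A real navigation app would run this check against a GPS position stream and hand any matching memo to the text-to-speech engine.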
Conclusion: The Future of Assistive Tech
As technology continues to get smarter, smaller, and more widespread, options
for visually-impaired and blind users will continue to expand. AI and machine
learning are two of the most promising fields, but the internet of things is
also an interesting development – a network of sensors surrounding and
interacting with a blind user would open up a lot of possibilities. Opening
these doors will not only benefit users, but society as a whole, giving
talented and unique individuals more opportunities to help build the future.
https://www.maketecheasier.com/smartphones-ai-crowdsourcing-helping-blind-visually-impaired/