[blindza] Fw: Two Pasadena researchers seek integrated system to help the blind `see'

  • From: "Jacob Kruger" <jacobk@xxxxxxxxxxxxxx>
  • To: "NAPSA Blind" <blind@xxxxxxxxxxxx>
  • Date: Mon, 8 Nov 2010 22:25:34 +0200

----- Original Message ----- From: "Peter Meijer" <feedback@xxxxxxxxxxxxxxxxxxx>

Hi All,

Appended is an article from yesterday's San Gabriel Valley Tribune, about
work based on use of The vOICe.

Best wishes,

Peter Meijer


Seeing with Sound - The vOICe
http://www.seeingwithsound.com/winvoice.htm


Two Pasadena researchers seek integrated system to help the blind `see'.

By Beige Luciano-Adams, Staff Writer, posted November 7, 2010.

PASADENA - Bruce Benefiel sat facing five white strips arranged on a black wall.
He tried to decipher the shape, tracing the air with his fingers.

"The slanted line cuts across the horizontal," he ventured.

A voice from off-camera asked him if he saw the lines overlapping: "Have
another look," the voice urged.

"Oh, I see," said Benefiel, struggling for a minute before nailing it. "Is that
supposed to be a house?"

Benefiel has been blind since birth.

But with a pair of camera-equipped glasses and a hand-held computer that
translates images into raw sounds - all fed to him via earbuds - he is seeing
the world around him.

For the last 10 months, Pasadena-based partners Luis Goncalves and Enrico Di
Bernardo have been working with trainees like Benefiel in their MetaModal
project, developing a prototype for what they envision will be an integrated
platform for high-tech blindness aids.

Or, as Di Bernardo terms it, an iPhone for the blind.

The two researchers currently are funded by a National Science Foundation grant,
but they note that their mandate is more social than scientific.

"Of course there are a lot of people working in the area of restoring sight," Di Bernardo said, explaining that MetaModal is using mostly extant technologies,
like the sensory substitution behind the "vOICevision technology" glasses
Benefiel wears.

"We're not so much looking at restoring sight. We're working on providing
independent living for people who are blind," he said.

The goal is to help blind people enhance their awareness of the world around
them and give them confidence to enter it.

"A lot of blind people don't leave their bedroom. They just don't take on life,"
said Di Bernardo.

Benefiel, MetaModal's irrepressible first trainee, would be an exception to that
rule.

"I can see you," he declared on a recent morning. "I can see Rico, Luis - I knew his head was bald as soon as I met him. And I can tell if Rico shaved, which he
did today."

Goncalves and Di Bernardo murmured in agreement.

Benefiel's impairment stems from being born three months early.

"In the `50s, that was a death sentence," he said. "I never really had vision.
I faked it like all blind people do - `Oh! I see shadows,"' he mocked.

But the glasses, says Benefiel, have "added so much to my world, because now I
have an idea of what sight is...

"I could never quite grasp the concept. Now it's like having invisible fingers
that come out of my head."

The Goodwill employee says he uses the prototype to try cleaning his house.

"I can do everything but chase my cat around, because my cat is black. It
doesn't show up on this thing," he said.

Lighter objects make louder sounds, objects higher in the frame of vision make
higher sounds, and the sound grows louder in one ear or the other when an
object sits off-center in the frame.
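
For readers curious how such a mapping might look in software, here is a rough
Python sketch of the kind of image-to-sound encoding the article describes. It
is only an illustration of the general idea; the function name, frequency
range, tone length and sample rate are assumptions for the example, not The
vOICe's actual parameters or code.

import numpy as np

SAMPLE_RATE = 22050   # assumed audio sample rate (Hz)
DURATION = 0.05       # assumed length of each tone, in seconds

def pixel_to_tone(brightness, row, col, height, width):
    # Map one pixel to a short stereo tone, following the article's
    # description: brighter -> louder, higher in the frame -> higher pitch,
    # off-center -> louder in the nearer ear.
    t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
    # Pitch: top of the frame maps to a high frequency, bottom to a low one
    # (the 200-2000 Hz range is an illustrative assumption).
    freq = 200.0 + (1.0 - row / max(height - 1, 1)) * 1800.0
    # Loudness: brightness in [0, 1] scales the amplitude directly.
    amplitude = float(np.clip(brightness, 0.0, 1.0))
    tone = amplitude * np.sin(2.0 * np.pi * freq * t)
    # Panning: a pixel left of center ends up louder in the left channel.
    pan = col / max(width - 1, 1)            # 0.0 = far left, 1.0 = far right
    left, right = tone * (1.0 - pan), tone * pan
    return np.stack([left, right], axis=1)   # stereo samples, shape (N, 2)

Summing or sequencing such tones across a whole frame would then produce the
overall soundscape a trainee hears.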

Trainees start off in front of a black felt wall with simple, white felt shapes - circles, horizontal and vertical lines that they can touch as they learn to
recognize the sounds.

The idea is that they'll fine-tune their sense of perception through repeated
exercises, eventually using the device to map spatial environments.

Once they learn how the sound works and they get a sense of centering objects, trainees can start to imagine where things are in space - and reach for them, or walk towards them as they look down at the ground to see where they're going,
explains Goncalves.

He says this virtual hand-eye coordination catches on fairly quickly.

"At the beginning it can even be hard to center things, but they get better at
reaching for it. After three months of trying, they're pretty confident,"
Goncalves said.

Vince Fazzi, an orientation and mobility specialist who works with MetaModal
trainees, compares the company's image-to-sound device to its sound-only
precursors, noting this is the first time a camera has been integrated.

While still complex, Fazzi says he can see the system as a complement to canes
or dogs, especially if the barrage of sounds coming in could be refined.

Sonic guides, he recalls, fell out of favor.

"People realized they could travel using their cane and get to where they wanted to go and really didn't need to know that much about the world around them," he
said.

Goncalves said they've been experimenting with this particular sound-based
sensory substitution, developed by partner Peter Meijer in Holland, because it
was available. But their vision is bigger.

For their next project, which they hope to fund either with another NSF grant or private investment, they want to try something different, maybe another kind of
sensory substitution, or something that taps their backgrounds as
Caltech-educated computer vision experts.

The technology they have in mind would allow people to recognize objects and
faces, read text, detect things moving in space.

In practice, this could mean the glasses scanning bar codes in supermarkets or
on items in your fridge at home, or recognizing faces and landmarks when you're
out walking.

Ultimately, they said, it would have to be "all in the glasses" to make sense.

"When you look at common electronic aides to assist the independent living of people with blindness," Di Bernardo said, "they would have to carry a backpack full of different gadgets and take out one they need at any particular time."

Instead, theirs would be a single, integrated device that functions as an open
platform where third parties could program new apps, explained Goncalves.
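
Purely to illustrate what such an open platform with third-party apps could
look like in software, here is a minimal Python sketch of a plugin-style
interface. The class and method names are invented for this example and are
not MetaModal's actual design.

from typing import List, Optional

class VisionApp:
    # Hypothetical third-party "app": it receives camera frames and may
    # return a short spoken or sonified cue for the user.
    def process_frame(self, frame) -> Optional[str]:
        raise NotImplementedError

class GlassesPlatform:
    # Toy host that runs every registered app on each camera frame and
    # collects whatever cues they produce.
    def __init__(self) -> None:
        self.apps: List[VisionApp] = []

    def register(self, app: VisionApp) -> None:
        self.apps.append(app)

    def on_frame(self, frame) -> List[str]:
        return [cue for app in self.apps
                if (cue := app.process_frame(frame)) is not None]

In such a scheme, a bar-code reader, a face recognizer or a text reader would
each be one app, all sharing the same glasses and audio output.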

He predicts that speech-based commands and other computer interpretations of the environment could mingle with sensory substitution schemes to give the user raw
information, thereby maintaining the experiential component that MetaModal's
trainees are currently struggling to master.

"The key is to make the user interface as seamless, easy-to-use, and functional as possible," said Goncalves said. "No need to carry a bunch of gadgets that do one thing each, no need to fumble with your phone, no need to hold anything in your hands, which is important, since often one hand will already be occupied
holding your cane or guide dog."





