[blindza] Fw: Turn your mobile into an EYEPhone: App lets blind people 'see' their surroundings by converting photos into SOUND

  • From: "Jacob Kruger" <jacob@xxxxxxxxxxxxx>
  • To: "NAPSA Blind" <blind@xxxxxxxxxxxx>
  • Date: Sat, 8 Mar 2014 00:30:38 +0200

Here's quite an involved article mentioning both the vOICe for Android and EyeMusic for the iPhone - see below.


FWIW, in the vOICe for Android there's also an option to turn on musical notes instead of the more standard soundscape sounds, but, as mentioned in another message, I still think it might lower the detail level a bit, even if it sounds slightly less 'odd' to some people...<smile>

Stay well

Jacob Kruger
Blind Biker
Skype: BlindZA
"Roger Wilco wants to welcome you...to the space janitor's closet..."

----- Original Message -----

Hi All,

Today there was also an article in the Daily Mail in the UK. It is
a bit on the sensationalist side, with numerous inaccuracies, but
that is how things sometimes go. The article is appended.

Best wishes,

Peter Meijer


Seeing with Sound - The vOICe
http://www.seeingwithsound.com/winvoice.htm


Turn your mobile into an EYEphone: App lets blind people 'see' their
surroundings by converting photos into SOUND.

- The vOICe technology was developed by Dutch engineer Peter Meijer in 1992

- It turns images into 'soundscapes' where shapes produce different noises

- For example, a diagonal line is converted to a string of rising musical notes

- Professor Amir Amedi has been training blind people to see using vOICe

- The vOICe Android app uses a phone's camera to record views and scenes

- The EyeMusic iOS app uses vOICe to help blind people see colour

- Colours are assigned different musical instruments, much as shapes are assigned different sounds

By Victoria Woollaston, March 7, 2014.

A breakthrough technology that helps blind people 'see' the world around them
using sound is now available in a smartphone app.

The vOICe system converts images taken on a mobile into 'soundscapes' by
assigning different musical notes and pitches to different shapes. For example, a diagonal line - such as a staircase - is converted to a string of rising musical
notes.

The app, available on Android, uses the phone's camera to record scenes and
landscapes, and teaches users how to identify which sounds go with which shapes.

There is even an accompanying app called EyeMusic that adds colour to these shapes.

Figure caption: The vOICe system converts images into 'soundscapes' by assigning different musical pitches to different shapes. For example, a diagonal line is converted to a string of rising musical notes. The app is available on Android, pictured, and explains how users can train themselves to identify the shapes.


The vOICe system was created by Dutch engineer Peter Meijer in 1992.

It uses pitch for height and loudness for brightness, scanning each camera image from left to right.

A rising bright line is converted into a rising tone, a bright spot - such as a lamp - is a beep, a brightly filled rectangle - such as a window during daylight - becomes a noise burst, and a vertical grid - such as a waffle or trellis - is converted into a rhythm.
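The basic recipe is simple enough to sketch in a few lines of Python. The block below is only a toy illustration of the mapping just described - the frequency range, one-second scan time and crude normalisation are assumptions, not the vOICe app's actual settings:

import numpy as np

SAMPLE_RATE = 22050   # audio samples per second (assumed)
SCAN_SECONDS = 1.0    # time to sweep one image from left to right (assumed)

def image_to_soundscape(image):
    """Turn a 2-D numpy array of brightness values in [0, 1] into mono audio.

    Column position becomes time (left before right), row position becomes
    pitch (image[0, :] is the top row, so it gets the highest frequency),
    and brightness becomes loudness.
    """
    rows, cols = image.shape
    samples_per_col = int(SAMPLE_RATE * SCAN_SECONDS / cols)
    t = np.arange(samples_per_col) / SAMPLE_RATE
    # Exponentially spaced frequencies, 500 Hz (bottom row) to 5 kHz (top row).
    freqs = 500.0 * 10.0 ** np.linspace(1.0, 0.0, rows)
    audio = []
    for c in range(cols):                       # the left-to-right scan
        column = np.zeros(samples_per_col)
        for r in range(rows):
            if image[r, c] > 0:                 # brightness -> loudness
                column += image[r, c] * np.sin(2 * np.pi * freqs[r] * t)
        audio.append(column / rows)             # crude normalisation
    return np.concatenate(audio)

Feeding in a blank image with a single bright diagonal produces exactly the rising (or falling) sweep of notes the article describes.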

Since 2007, neuroscientist Professor Amir Amedi from the Hebrew University of
Jerusalem has been training blind people to use this technology.

With around 70 hours of training, Professor Amedi's participants can recognise
the presence of a human form.

They are also able to detect the exact posture of the person in an image, and
imitate it.

Using brain scans, Professor Amedi was able to establish that the outlines and
silhouettes of bodies cause the visual cortex in his participants' brains - an
area that in sighted people is specifically dedicated to processing body shapes
- to light up with activity.

The vOICe for Android app makes this technology widely available. It uses
augmented reality to convert live camera views to soundscapes for the totally
blind through sensory substitution and computer vision.

Figure caption: Using vOICe, researchers have also created EyeMusic for iOS,
pictured. It translates black and white images created by vOICe into colour.
Colours are represented by different musical instruments - higher pixels are
translated into higher notes while lower pixels are translated into lower notes
on the same instrument.

Figure caption: Researchers first taught people to perceive simple dots and
lines. Then those individuals learned to connect the lines with junctions or
curves, gradually working up to more and more complex images, pictured.

The app also features a talking colour identifier, talking compass, talking face
detector and a talking GPS locator, while CamFind visual search and Google
Goggles can be launched from the vOICe for Android app by tapping the left or
right screen edge.

In his most recent work, Professor Amedi has taken this a step further,
creating the EyeMusic app for iOS devices.

It uses an algorithm to translate the original black and white images created by
vOICe into colour.

Colours are represented using different musical instruments - higher pixels of
an image are translated into higher notes on a given musical instrument, for
example, higher pitches on the piano, trumpet or the violin, while lower pixels
of an image are translated into lower notes on the same instrument.
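The colour layer can be sketched in the same spirit. In the toy version below, three waveforms stand in for three 'instruments', one per colour class, while vertical position still sets the pitch; the particular colours, waveforms and frequency range are assumptions, not EyeMusic's published mapping:

import numpy as np

SAMPLE_RATE = 22050

# One waveform per colour class, standing in for 'different musical instruments'.
TIMBRES = {
    'white': lambda phase: np.sin(phase),                            # pure tone
    'blue':  lambda phase: np.sign(np.sin(phase)),                   # square wave
    'red':   lambda phase: 2.0 * (phase / (2 * np.pi) % 1.0) - 1.0,  # sawtooth
}

def colour_note(colour, row, rows, seconds=0.1):
    """One note: the colour picks the timbre, the row picks the pitch (row 0 = top)."""
    freq = 500.0 * 10.0 ** ((rows - 1 - row) / max(rows - 1, 1))  # 500 Hz bottom, 5 kHz top
    t = np.arange(int(SAMPLE_RATE * seconds)) / SAMPLE_RATE
    return TIMBRES[colour](2 * np.pi * freq * t)

So a red pixel near the top of an image would come out as a high sawtooth note, and a blue pixel near the bottom as a low square-wave note - the same pitch rule as before, just a different 'instrument'.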

Video: How the vOICe technology appears to blind people

The researchers from the Hebrew University of Jerusalem first taught people to perceive simple dots and lines. Then those individuals learned to connect the lines with junctions or curves, gradually working up to more and more complex
images.

Professor Amedi's team taught congenitally blind adults, meaning adults who were
born blind, to use what are called sensory substitution devices (SSDs).

For example, when a person uses a visual-to-auditory SSD, images from a video
camera are converted into 'soundscapes' that represent the images.

This allows the user to listen to and then interpret the visual information
coming from the camera, in that way 'seeing' with sounds.

After being taught how to use the SSDs, blind people could use the devices to
learn to read, with sounds representing visual images of letters.

This skill involved a region of the brain called the Visual Word Form Area
(VWFA), which in sighted people is activated by seeing and reading letters.

After only tens of hours of training, blind people's VWFA showed more activation
for letters than for any of the other visual categories tested.

Pixels closer to the left side of the image are heard before pixels closer to
the right side of the picture, letting blind people determine the shapes,
colours and positions of items.
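In other words, horizontal position is simply a time offset within the scan. As a small worked example, assuming a 176-column image swept in one second:

cols, scan_seconds = 176, 1.0

def onset(c):
    # When column c (of an assumed 176) starts sounding in a 1.0 s scan.
    return c * scan_seconds / cols

print(onset(0), onset(88), onset(175))   # 0.0, 0.5, ~0.99 seconds

A pixel in the middle column is heard half a second into the soundscape, whatever its height or colour.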

The research has been reported in Cell Press journal Current Biology.

'The idea is to replace information from a missing sense by using input from a different sense,' explained Professor Amedi. 'It's just like bats and dolphins
use sounds and echolocation to 'see' using their ears.'

'Imagine for instance a diagonal line going down from left to right; if we use a
descending musical scale - going on the piano from right to left - it will
describe it nicely,' continued Professor Amedi's colleague Ella Striem-Amit. 'And
if the diagonal line is going up from left to right, then we use an ascending
musical scale.'
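Her example is easy to play out numerically. In the toy version below - the note values and range are assumptions - each column of the line contributes one note, so a line descending from left to right yields a descending scale and its mirror image an ascending one:

def line_to_scale(rows_hit):
    """rows_hit[c] = row index of the bright pixel in column c (0 = top)."""
    top_note, bottom_note = 84, 60   # assumed two-octave MIDI range
    span = max(rows_hit) or 1
    return [bottom_note + (top_note - bottom_note) * (span - r) // span
            for r in rows_hit]

print(line_to_scale([0, 1, 2, 3]))   # going down left-to-right: [84, 76, 68, 60]
print(line_to_scale([3, 2, 1, 0]))   # going up left-to-right:   [60, 68, 76, 84]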

Source URL:
http://www.dailymail.co.uk/sciencetech/article-2575495/Turn-mobile-EYEphone-App-lets-blind-people-surroundings-converting-photos-SOUND.html

----------
To send a message to the list, send any message to blindza@xxxxxxxxxxxxx
----------
To unsubscribe from this list, send a message to blindza-request@xxxxxxxxxxxxx 
with 'unsubscribe' in the subject line
---
The 'homepage' for this list is at http://www.blindza.co.za
