[asvs] Re: When Would Synthetic Vision be Useful?

  • From: "Jerry Weichbrodt" <gerald.g.weichbrodt@xxxxxxxxxxx>
  • To: <asvs@xxxxxxxxxxxxx>
  • Date: Thu, 30 Sep 2004 13:05:36 -0400

Hi Will,
Granted, you can't adjust the details of sound positioning, in terms of where
sound hits the outer ear, with bone conduction transducers, but you can't do
that with most headsets either, unless you happen to have one of those
quadraphonic headsets from years gone by. Or am I overlooking something?

As for what was displayed: in many cases it was web pages--modern ones with
the nav bars here and there, perhaps a tree display of information, and
suchlike.  Some of the other diagrams were, I gathered, perhaps UML
diagrams.  The speaker said they were similar to flow charts but with
different meanings for diamonds, triangles, and so on.

I still have trouble imagining really listening to stuff like this and
having it make sense, but, assuming it's possible, it seems to me it would
have tremendous potential in business and academic settings.


----- Original Message ----- 
From: "Will Pearson" <will-pearson@xxxxxxxxxxxxx>
To: <asvs@xxxxxxxxxxxxx>
Sent: Thursday, September 30, 2004 12:37 PM
Subject: [asvs] Re: When Would Synthetic Vision be Useful?

> Hi Jerry,
> Nice scenario, and thanks for some great use context.
> Hmmm.  I doubt that bone conduction would provide the spatial positioning
> that we need.  You could easily get the interaural time, intensity, and
> probably even phase differences reproduced through it, but I'm not sure
> about the Head Related Transfer Function (HRTF).  Basically, HRTF refers to
> the fact that sounds from different locations hit the outer ear at slightly
> different places.  The brain maps where the sound hits the ear to the
> location of the sound, and this helps with sound source localisation,
> especially with elevation.
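[An aside from the editor: the interaural time difference mentioned above can be sketched with Woodworth's classic spherical-head approximation. This is an illustrative Python fragment, not anything from DirectSound; the head radius and speed of sound are typical assumed values.]

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound_mps=343.0):
    """Woodworth's spherical-head estimate of interaural time difference:
    ITD = (r / c) * (theta + sin(theta)), azimuth theta in radians."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound_mps) * (theta + math.sin(theta))

# A source directly ahead produces no delay; one 90 degrees to the side
# produces roughly two-thirds of a millisecond.
print(itd_seconds(0))
print(round(itd_seconds(90) * 1e6), "microseconds")
```

Delays this small are what a headset (or bone conduction transducer) can reproduce directly; the HRTF's spectral filtering by the outer ear is the part that is harder to deliver.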
> At the moment, I'm thinking of having a display straight in front of you,
> but thinking about it, maybe the ability to spatially move the grid might
> be good.  If someone was standing in front of you and talking, it's likely
> that signals from the display and the speaker would get intermixed, causing
> some degree of cancellation and loss of signal.  So, if we could move the
> display out of the way, so to speak, we could avoid this problem.
> Technically, DirectSound provides a neat way to do this.  You have a
> listener object representing the listener's location in space.  If we, say,
> wanted to shift the display to the left, we could just move the listener to
> the right, avoiding having to redraw the whole sonic display.  As we're
> using thin air as our drawing canvas, the possibilities are pretty
> limitless *smile*.
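[An aside from the editor: the move-the-listener trick works because 3D audio positioning only depends on source positions *relative* to the listener. This is a geometry sketch in Python, not DirectSound code; in DirectSound itself the single call would go through the 3D listener object rather than through every buffer.]

```python
# Three sonic "pixels" laid out in front of the listener (x, y, z).
sources = [(-1.0, 0.0, 2.0), (0.0, 0.0, 2.0), (1.0, 0.0, 2.0)]

def relative_position(source, listener):
    """Source position as perceived by the listener (listener-relative)."""
    return tuple(s - l for s, l in zip(source, listener))

# Option 1: redraw -- move every source one unit to the left.
redrawn = [relative_position((x - 1.0, y, z), (0.0, 0.0, 0.0))
           for x, y, z in sources]

# Option 2: one update -- move the listener one unit to the right instead.
listener_moved = [relative_position(s, (1.0, 0.0, 0.0)) for s in sources]

# The perceived geometry is identical either way.
assert redrawn == listener_moved
```

The saving grows with the size of the display: shifting N sources is N position updates, while shifting the listener is always one.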
> As for being able to hear both display and environmental sounds, as David
> Poehlman pointed out, you can get environmental headphones that allow
> environmental sounds to pass to the outer ear.  These tend to be used in
> research projects examining auditory interfaces on mobile devices, such as
> GPS units.
> It's a pretty good idea though.  The synchronicity of being able to look at
> diagrams at the same time as everyone else is appealing.  Can you give more
> details on the sort of image?  Also, what sort of things would you like to
> do with it?
> Will
