Re: Admin: Whats OT and What Not (Was: Semi OT: USefulness of Auditory Icons, Mercator)

  • From: Veli-Pekka Tätilä <vtatila@xxxxxxxxxxxxxxxxxxxx>
  • To: programmingblind@xxxxxxxxxxxxx
  • Date: Sat, 24 Nov 2007 15:47:08 +0200

Hi Matthew,
I think you might have a point; I'm sorry. On the upside, at least I do
change my subject lines and tag OT bits as such, even though I don't
always cut where I ought to.

-- 
With kind regards Veli-Pekka Tätilä (vtatila@xxxxxxxxxxxxxxxxxxxx)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila

Matthew2007 wrote:
> 
> You're indulging in the exact same behavior you're criticizing, but I feel
> you can do so just like the rest of us.

> From: "Veli-Pekka Tätilä" <vtatila@xxxxxxxxxxxxxxxxxxxx>
> Subject: Re: Semi OT: USefulness of Auditory Icons, Mercator (Was: Sonified
> Debugger vs. Screenreader Question)
> > Hi Jared et al,
> > To get to the actual topic, skip down to "End fluff."
> >
> > Fluff:
> > This first bit is not addressed to Jared in particular but to almost
> > everyone in this thread:
> > I hate to be a killjoy, but:
> > Once again, although I'm not a moderator here, I think we're getting
> > pretty badly off-topic on this list. Research is fine, but a
> > discussion on the merits of research itself, the McDonald's
> > analogies, or whether someone likes music in the background with
> > speech is neither blind programming nor a way of gathering valuable
> > research data or methodology on blind programming. Screen reader
> > internals, from a programming point of view, certainly are on topic,
> > but please do start another technical rather than end-user thread on
> > them, take most OT things off-list, tag the rest as OT, and change
> > the subject lines accordingly so that I can tell what the last reply
> > in a message is about without reading through the whole thing.
> > Warning: if the subject doesn't catch my interest, I won't read the
> > message unless the author is a regular I like. These are things a
> > lot of people never seem to do here, and I've ranted on this before,
> > so I'll stop.
> >
> > If this were a Usenet programming newsgroup, a lot of people would
> > have complained already; at least that's what the culture is like in
> > comp.lang.perl.misc, which I've been reading for about a year now.
> >
> > Errm, it seems my own message has a lot of OT-ish but interesting
> > stuff, so who am I to talk <grin>. Now here's a guideline: if you're
> > mostly going to reply to the fluff bits, which I've marked as OT,
> > please do reply, but reply off-list, changing subjects and snipping
> > accordingly.
> > End fluff.
> >
> > Regarding auditory icons, I'd like to share an experience which
> > might shed some light on attitudes:
> > Initially, when I read about auditory icons about a year ago, I was
> > not really convinced, though I found the article mightily
> > interesting. The article I'm referring to is:
> >
> > [37] W. Gaver, "The SonicFinder: An Interface That Uses Auditory
> > Icons", Human-Computer Interaction, Vol. 4, No. 1, 1989, pp. 67-94.
> >
> > Another such incident happened a little later, when I read UI guy
> > Alan Cooper stating quite plainly, in About Face 2.0, that computers
> > should make reassuring noises when they work well; we have merely
> > been conditioned by the rude error beeps of current computer
> > systems.
> >
> > Then I started experimenting with Windows sound schemes as a user. I
> > tried fast AT&T sampled speech, the fastest Dolphin Orpheus formant
> > synth setting of 700 (despite not being a native English speaker),
> > and bits of audio borrowed from Mac OS X. The AT&T voice was easy to
> > listen to, but it was neither very fast nor efficient; the very
> > rapid formant synth, on the other hand, took me quite a long time to
> > understand. Now, I never like music in the background personally,
> > despite loving music and doing computer music myself (when I listen
> > to music I generally do nothing else), but when I read a document I
> > concentrate intensely on the speech and can understand it blazingly
> > fast. However, when I get an unexpected, rapid spoken prompt in
> > response to some off-screen event, e.g. the battery running out or a
> > breakpoint being hit, it takes me quite a while to realize that it
> > is not coming from the screen reader, and I grasp its meaning well
> > after having already heard the prompt.
> >
> > In my informal experiments I found that the Mac OS X sound scheme
> > actually works much better than any of the speech synth prompts. I
> > do think it would be interesting to convey some UI attributes with a
> > conventional subtractive analog synth, especially for people who
> > have a mental model of such a synth's operation; however, this
> > experiment of mine used static samples instead. In brief, what I
> > found was that for conveying binary info, like whether event e
> > happened or not, I could recognize a particular sound effect very
> > quickly after the beginning of the sample, whereas a speech prompt
> > used for the same purpose took much more effort to parse in my head.
> > So all this goes to show that your initial attitudes might be wrong,
> > too.
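> >
> > To make that concrete, here is a rough, purely illustrative Python
> > sketch of the two notification styles. The .wav file names and the
> > speak() helper are invented; winsound is the standard Windows sound
> > module.
> >
> > import winsound
> >
> > # Hypothetical auditory icons: one short sample per off-screen event.
> > ICONS = {
> >     "breakpoint_hit": "icons/breakpoint.wav",
> >     "battery_low": "icons/battery.wav",
> > }
> >
> > def notify_with_icon(event):
> >     # A static sample; recognizable from its attack, nothing to parse.
> >     winsound.PlaySound(ICONS[event],
> >                        winsound.SND_FILENAME | winsound.SND_ASYNC)
> >
> > def notify_with_speech(event, speak):
> >     # speak() stands in for whatever synth the screen reader exposes.
> >     speak(event.replace("_", " "))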
> >
> > Fluff:
> > As a historical note: before sample-playback synths were commonplace
> > and while ROM was still expensive, the guys at Roland figured out
> > that the attack portion of a sound is mightily important for
> > recognizing natural, acoustic instruments. Cut the attack off a
> > piano sound and it is hard to tell, from the decay tail alone, that
> > it is in fact a piano. So they sampled the attacks and generated the
> > rest of the sound using synthesis techniques, and it worked out
> > surprisingly well at the time. Personally, I've never been a big fan
> > of the Roland D-series synths myself, but hey, that's just me.
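> >
> > For the curious, a very rough numpy sketch of the idea (my own toy
> > illustration, not Roland's actual method): keep a short sampled
> > attack and splice a synthetic decaying tone onto it.
> >
> > import numpy as np
> >
> > def attack_plus_synth(attack, sr=44100, freq=220.0, decay_s=1.5):
> >     # attack: a short mono float array holding the sampled attack.
> >     t = np.arange(int(sr * decay_s)) / sr
> >     # Synthetic tail: an exponentially decaying sine wave.
> >     tail = np.sin(2 * np.pi * freq * t) * np.exp(-3.0 * t)
> >     # Roughly match the tail's level to the end of the attack.
> >     level = np.sqrt(np.mean(attack[-256:] ** 2)) if len(attack) else 1.0
> >     return np.concatenate([attack, tail * level])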
> > End fluff.
> >
> > The last thing I wanted to ask about is the research screen reader
> > Mercator, mentioned at least in:
> >
> > @article{mercator,
> >  title={{Transforming Graphical Interfaces Into Auditory Interfaces for
> > Blind Users}},
> >  author={Mynatt, E.D.},
> >  journal={Human-Computer Interaction},
> >  volume={12},
> >  number={1 \& 2},
> >  pages={7--45},
> >  year={1997}
> > }
> >
> > Pardon the BibTeX notation. Mercator used both auditory icons and a
> > tree-navigation approach that I haven't seen in any Windows screen
> > reader since, and which I am therefore interested in. I even found
> > the source code online, but I could not find an ancient enough SunOS
> > machine, or the hardware synth, for actually running the reader. Is
> > there any way to run that reader on Linux or in some mainframe VM?
> > In particular, I did not find the sound effects used for the
> > auditory icon UI overviews in the source code package. Neither could
> > I grasp exactly which on-screen elements of the UI generated the
> > different levels of the tree. There are some examples, but ideally
> > I'd like to try out the reader, figure it out myself, and test
> > whether I've got its operation right. Any ideas as to how that might
> > be possible? Are there online recorded demos of Mercator usage, for
> > instance?
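> >
> > To be clear about what I mean by tree navigation, here is a
> > hypothetical Python sketch of the kind of model I imagine: UI
> > objects arranged in a hierarchy that the user walks with
> > parent/child/sibling moves, perhaps with an auditory icon per node
> > type. This is only my guess at the idea, not Mercator's actual
> > design.
> >
> > class Node:
> >     def __init__(self, role, label, children=()):
> >         self.role, self.label = role, label
> >         self.children = list(children)
> >         self.parent = None
> >         for child in self.children:
> >             child.parent = self
> >
> > def move(cursor, direction):
> >     # Returns the new cursor node, or the old one if the move fails.
> >     if direction == "down" and cursor.children:
> >         return cursor.children[0]
> >     if direction == "up" and cursor.parent:
> >         return cursor.parent
> >     if direction in ("next", "prev") and cursor.parent:
> >         siblings = cursor.parent.children
> >         i = siblings.index(cursor) + (1 if direction == "next" else -1)
> >         if 0 <= i < len(siblings):
> >             return siblings[i]
> >     return cursor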
> >
> > Fluff:
> > All in all, I think it is remarkably sad how few of the research
> > prototypes, web sites and other things referenced in accessibility
> > articles are actually linked to and freely available online. After
> > having read a good article, what I would like to do is try out the
> > things the authors created, yet often that is simply not possible.
> > Isn't science supposed to be public and open to everyone? In
> > general, I don't even care about the source; binaries would be good
> > enough.
> > End fluff.
> >
> > Guess that's all.
__________
View the list's information and change your settings at 
//www.freelists.org/list/programmingblind
