Re: Auditory interface ideas, what would help?

  • From: "qubit" <lauraeaves@xxxxxxxxx>
  • To: <programmingblind@xxxxxxxxxxxxx>
  • Date: Tue, 1 Sep 2009 22:09:31 -0500

Great! I'll take a look.
Thanks.
--le

  ----- Original Message ----- 
  From: Andreas Stefik 
  To: programmingblind@xxxxxxxxxxxxx 
  Sent: Tuesday, September 01, 2009 9:11 PM
  Subject: Re: Auditory interface ideas, what would help?


  Howdy,


    Hi Andreas --
    Is your dissertation available online?

  Yes, it is available on my homepage: 
http://www.cs.siue.edu/~astefik/Papers.php

  Do a search for the word "Dissertation" on the page and you should find it 
pretty quickly. Sina (who is on this list) told me that it is a pretty 
accessible PDF (and I really hope so!), but no guarantees.

   
    What languages did you work with?

  Right now our tool integrates a custom programming language, HOP, into 
NetBeans. We are doing this specifically because part of our experimentation 
and testing regimen analyzes whether the language itself affects the 
performance of the programmer (including whether it adversely affects blind 
programmers). I'm honestly not sure whether it does yet, but my intuition 
tells me it does ...

  This inevitably makes designing the tools time-consuming, though, because we 
need to implement our own compilers and debuggers. On the other hand, it makes 
it vastly easier to generate high-quality auditory cues for folks. That's a 
huge help.
   
    I remember many battles with scope management back when I worked on a C++ 
compiler and tools.  There were a number of people working on components of 
early IDEs for C++, which was difficult to manage as the language was still 
evolving and changing.  But scope was one thing that received a lot of 
attention, especially in the front end, where it influenced parsing and typing 
and beyond.

  I'm not surprised; I spent about a year testing scoping cues in a previous 
version of our environment. Specifically, I came up with a technique (called 
artifact encoding) that lets you give someone a swath of code, then have a 
computer analyze what they came up with using something very similar to DNA 
encoding techniques (long -- complicated -- story), and it will pop out some 
numbers that seem to indicate how well the cues told you what was going on. 
There's a long experiment about it in my dissertation if you want the details. 
Nothing's perfect, and my techniques aren't either, but they really helped us 
improve our auditory cues for scoping.
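  To give a rough flavor of what "align two code artifacts and pop out a 
number" could look like, here is a minimal, purely illustrative Python sketch. 
It is NOT the artifact encoding method from the dissertation -- the tokenizer 
and the similarity score are my own assumptions -- but it shows the general 
shape: encode two pieces of code as token sequences, then align them the way 
one might align two DNA base sequences.

```python
# Hypothetical sketch: score a participant's code against a reference
# by aligning their token sequences, loosely analogous to DNA sequence
# alignment. The tokenizer and scoring below are illustrative
# assumptions, not the actual "artifact encoding" technique.
import difflib
import re


def tokenize(code: str) -> list[str]:
    # Crude tokenizer: identifiers, numbers, and single symbols.
    return re.findall(r"[A-Za-z_]\w*|\d+|\S", code)


def artifact_similarity(reference: str, submission: str) -> float:
    # Align the two token sequences and return a similarity in [0, 1],
    # much as an alignment score compares two base sequences.
    matcher = difflib.SequenceMatcher(
        None, tokenize(reference), tokenize(submission)
    )
    return matcher.ratio()


reference = "for i in range(10): total = total + i"
submission = "for i in range(10): total += i"
score = artifact_similarity(reference, submission)
print(round(score, 2))
```

  A real study would of course use a far more careful encoding, but even a 
coarse score like this lets a computer grade many participants' artifacts 
automatically.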
   
    It is interesting now to see how things have evolved.  I worked on it back 
in the stone age (hmm, maybe the stone age through the bronze age).
    I would love to read about your work. Was it on java?

  For my dissertation, I wrote a C compiler and a custom C virtual machine 
that allowed us to run tests. Our new language and new VM make for a much more 
powerful environment. There's a ton of work left to do (e.g., the installer on 
NetBeans is horribly inaccessible), but ... ya know, we're making progress, 
and we're sure trying. We've applied for some grants and such. If we get lucky 
and win one, things will only get better.

  Stefik
