Re: Sodbeans 0.5 Release in Early July

  • From: Jamal Mazrui <empower@xxxxxxxxx>
  • To: Andreas Stefik <stefika@xxxxxxxxx>
  • Date: Tue, 15 Jun 2010 08:27:19 -0400

Unfortunately, and to my surprise, I do not think there is a reliable, programmatic technique for determining when JAWS, System Access, NVDA, or Window-Eyes has stopped speaking, or, put another way, whether speech is currently being output. I think there is a way with SAPI, at least SAPI 4.
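Lacking a cross-reader query, one workaround is to drive the synthesizer asynchronously and poll a completion flag yourself, roughly the way SAPI 5's SpVoice exposes Status.RunningState after an async Speak call. Below is a minimal sketch of that polling pattern; the FakeSpeechEngine class and all of its names are invented stand-ins for illustration, not any real screen reader API:

```python
import threading
import time

class FakeSpeechEngine:
    """Hypothetical stand-in for an asynchronous TTS engine (in the spirit
    of SAPI 5's SpVoice with the async speak flag): speak_async() returns
    immediately, and `speaking` stays True until the utterance is done."""
    def __init__(self):
        self.speaking = False

    def speak_async(self, text, seconds_per_char=0.01):
        self.speaking = True
        def run():
            time.sleep(len(text) * seconds_per_char)  # pretend to speak
            self.speaking = False
        threading.Thread(target=run, daemon=True).start()

def wait_until_done(engine, poll_interval=0.05, timeout=10.0):
    """Poll the engine's status flag until speech finishes or we time out."""
    deadline = time.monotonic() + timeout
    while engine.speaking and time.monotonic() < deadline:
        time.sleep(poll_interval)
    return not engine.speaking

engine = FakeSpeechEngine()
engine.speak_async("How's it going, screen reader?")
finished = wait_until_done(engine)
print(finished)  # True once the utterance completes
```

With real SAPI 5 the loop would check the voice's running state instead of a boolean; the point of the thread above is that JAWS, NVDA, System Access, and Window-Eyes expose no equivalent flag, which is exactly the gap being described.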


Does the Hop language run on top of the Java virtual machine? If so, I guess it may have access to Swing classes for building a UI. That approach would be similar to Jython, JRuby, and Grails.

Jamal


On 6/14/2010 10:59 AM, Andreas Stefik wrote:
Jamal,

Thanks for the SayTools material. We've been wanting to add Window-Eyes
support, pulling it from SayTools, but haven't had a chance to do it
yet. We also have Mac support, so if you want to add that to SayTools
from our implementation, feel free. Actually, I was wondering, have you
figured out any way to get the screen readers to inform you when text
has finished speaking, or when the screen reader decides to start
speaking something else on its own?

As for user interfaces, we won't have UI support before this release.
However, we just finished a way to make native calls down to Java or
C++ from Hop, and as such, creating an API for user interfaces is
definitely possible. If cooking up an API for that sort of thing
interests you at all, I certainly wouldn't complain.

Stefik

On Mon, Jun 14, 2010 at 8:15 AM, Jamal Mazrui <empower@xxxxxxxxx> wrote:
Congrats, Andreas, on the progress your team has made!

Let me make sure you are aware that SayTools includes code for speaking
through the APIs of Window-Eyes and System Access.

One question I have is whether it is currently possible to create graphical
user interfaces with Sodbeans.  Sorry if that has been explained already.

Jamal


On 6/13/2010 4:10 PM, Andreas Stefik wrote:

Hey folks,

I know some of the folks on the list (e.g., Sina, Jamal, Louie Most)
have been involved in the Sodbeans project, which my team and I are
designing as a prototype to show how to make programming languages and
development environments easier for blind users to use. My team, split
between Southern Illinois University Edwardsville and Washington State
University, has made significant progress, and a working alpha build of
our tool is nearly ready for release. As I've been working to develop
this technology for almost five years now, I can't tell you how
personally excited I am to finally get the software out there.

It looks like our final feature set for this first release is going to
include the following:

1. Talking debugger, which tells you aurally what is happening as you
debug. For example, our debugger might tell you the values of variables
as they change, whether you have called a function, created an object,
or done other actions.

2. Talking compiler, which tells you whether there are compiler
errors, and summarizes aurally the problems, if any, in the source
code.

3. A custom programming language called Hop. Hop is a fully
functioning programming language that we've designed through formal
experiments in which we watch people program in audio-only
environments. Besides the typical features you would expect in a modern
language (e.g., control structures, objects), in Hop accessibility is a
first-class citizen. To give you an example of how Hop can help blind
users: writing a program in C++ or Java to make your screen reader
speak is time-consuming and requires some expertise in connecting to
various screen reader architectures. In Hop, you can connect to
whatever screen reader the user has loaded by saying:

say "How's it going, screen reader?"

and you will hear the TTS routed appropriately. Right now we support
JAWS, NVDA, SAPI, and Mac out of the box and we're working on adding
more readers as we go.

4. Full integration into Oracle's NetBeans IDE. The accessibility
support in Sodbeans is built on the Sappy platform, which is built on
the NetBeans platform. We have fixed an enormous number of
accessibility problems and bugs since our Sappy 0.5.3 release and have
added NVDA support (Thanks, Sina!).
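Sodbeans' actual routing code isn't shown in the message, but the behavior described for `say` in item 3, sending output to whichever supported reader the user has loaded, can be sketched as a first-match dispatcher. Everything below (Backend, JAWSBackend, SAPIBackend, and their availability checks) is invented for illustration, not Sodbeans' real API:

```python
# Sketch of a "first available screen reader wins" dispatcher, the
# behavior the Hop `say` statement is described as providing.
# All backend classes here are illustrative stubs, not a real API.

class Backend:
    name = "base"
    def available(self):
        return False
    def speak(self, text):
        raise NotImplementedError

class JAWSBackend(Backend):
    name = "JAWS"
    def available(self):
        return False  # stub: a real check would probe the JAWS COM server

class SAPIBackend(Backend):
    name = "SAPI"
    def available(self):
        return True   # stub: treat the OS synthesizer as always present
    def speak(self, text):
        return f"{self.name}: {text}"

def say(text, backends):
    """Route text to the first backend that reports itself available."""
    for backend in backends:
        if backend.available():
            return backend.speak(text)
    raise RuntimeError("no screen reader or synthesizer found")

result = say("How's it going, screen reader?", [JAWSBackend(), SAPIBackend()])
print(result)  # SAPI: How's it going, screen reader?
```

Ordering the backend list puts a running screen reader ahead of the bare OS synthesizer, so the fallback only fires when no reader is loaded, which matches the "out of the box" behavior described above.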

So, that's what we've been working on. After we release, we would love
to get the community even more involved. We welcome contributions to
the standard library in Hop, like classes for data structures, more
screen reader support, or other features. We would also love to get
feedback on how we can improve the user interface for the blind or
even get just general opinions on where the research should go.

Thanks for listening everyone,

Andreas Stefik, Ph.D.
Department of Computer Science
Southern Illinois University Edwardsville
__________
View the list's information and change your settings at
//www.freelists.org/list/programmingblind


