Can Hop use Java classes dynamically at runtime, or do wrappers need to be created in advance? To what extent is it strongly/weakly typed, statically/dynamically typed, object oriented/procedural/functional, interpreted/compiled?
Jamal

On 6/16/2010 9:39 AM, Andreas Stefik wrote:
Yeah, exactly; desktop layout, especially, is amazing. I'm definitely willing to include a GUI library in Hop, although I don't want to end up writing the entire standard library myself.

Stefik

On Wed, Jun 16, 2010 at 7:34 AM, Sina Bahram <sbahram@xxxxxxxxx> wrote:

Java's built-in layout managers/mechanisms are actually quite powerful. From the simple BorderLayout to the grid and bag layouts, there are several ways to lay out some very nice-looking GUIs. The other advantages include automatic resizing support for when the window is restored or maximized.

Take care,
Sina

-----Original Message-----
From: programmingblind-bounce@xxxxxxxxxxxxx [mailto:programmingblind-bounce@xxxxxxxxxxxxx] On Behalf Of Jamal Mazrui
Sent: Wednesday, June 16, 2010 7:47 AM
To: programmingblind
Subject: Re: Sodbeans 0.5 Release in Early July

I do think a fruit basket example would be useful, as it facilitates comparisons with other development approaches. More importantly, though, I am thinking that developing GUIs is a valuable skill for blind programmers to learn, so I am suggesting that it become part of the curriculum you are preparing whenever technically feasible. I hardly know Java or Swing myself, but I think the GUI-building approach I call Layout by Code (LbC) could probably be adapted to Java-Swing and/or Java-SWT (which may yield more accessible GUIs on some platforms). LbC involves a set of convenient wrapper methods that internally use the auto-layout mechanisms of a GUI library (which Swing and SWT both include). One of the most challenging areas for blind programmers has been laying out GUIs in a visually acceptable manner. Most layout tools are highly mouse- and visually oriented. Making pixel calculations manually is possible, but tedious and error-prone. LbC tries to simplify this for common layout patterns.
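[Editor's note: to make the auto-layout point above concrete, here is a minimal, hypothetical Java sketch; the class and component captions are invented for illustration and are not from SayTools or Sodbeans. It shows BorderLayout placing components by named region, with no pixel arithmetic anywhere, and recomputing every child's bounds on resize.]

```java
import java.awt.BorderLayout;
import javax.swing.JButton;
import javax.swing.JLabel;
import javax.swing.JPanel;
import javax.swing.JScrollPane;
import javax.swing.JTextArea;

public class LayoutDemo {
    public static JPanel buildPanel() {
        // BorderLayout positions each child by region name (NORTH, CENTER, SOUTH);
        // the caller never computes coordinates.
        JPanel panel = new JPanel(new BorderLayout());
        panel.add(new JLabel("Fruit basket"), BorderLayout.NORTH);
        panel.add(new JScrollPane(new JTextArea(5, 20)), BorderLayout.CENTER);
        panel.add(new JButton("Add fruit"), BorderLayout.SOUTH);
        return panel;
    }

    public static void main(String[] args) {
        JPanel panel = buildPanel();
        // Simulate a window resize: the layout manager recomputes all child bounds.
        panel.setSize(400, 300);
        panel.doLayout();
        for (java.awt.Component c : panel.getComponents()) {
            System.out.println(c.getClass().getSimpleName() + " -> " + c.getBounds());
        }
    }
}
```

This also illustrates the automatic-resizing advantage mentioned above: the same `doLayout` pass that ran at 400x300 would run at any other size, so a restored or maximized window lays out correctly for free.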
Good use of auto-layout mechanisms also benefits cross-platform portability, since the GUI can adapt appropriately to the conventions and capabilities of the client platform.

Jamal

On 6/15/2010 8:56 AM, Andreas Stefik wrote:

Jamal, Yes, Hop has direct access to the JVM, and you can access Swing directly. In fact, there is one command in there that uses a Swing class as a popup input window for grabbing some input from the user. There's also a currently very small standard library where you can wrap the VM access, so that users can just make normal Hop calls. Are you thinking about this because you think we should make a fruit basket?

Stefik

On Tue, Jun 15, 2010 at 7:27 AM, Jamal Mazrui <empower@xxxxxxxxx> wrote:

Unfortunately, and to my surprise, I do not think there is a reliable programmatic technique for determining when JAWS, System Access, NVDA, or Window-Eyes has stopped speaking, or, put another way, for determining whether speech is currently being output. I think there is a way with the SAPI API, at least SAPI 4. Does the Hop language run on top of the Java virtual machine? If so, I guess it may have access to Swing classes for building a UI. That approach would be similar to Jython, JRuby, and Grails.

Jamal

On 6/14/2010 10:59 AM, Andreas Stefik wrote:

Jamal, Thanks for the SayTools material. We've been wanting to add Window-Eyes support, pulling from SayTools, but haven't had a chance to add it yet. We also have Mac support, so if you want to add that into SayTools from our implementation, feel free. Actually, I was wondering: have you figured out any way to get the screen readers to inform you when text has finished speaking, or when the screen reader decides to start speaking something else on its own? As for user interfaces, we won't have UI support before this release. However, we just finished a way to make native calls down to Java or C++ from Hop, so creating an API for user interfaces is definitely possible.
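[Editor's note: I don't know the actual API of Jamal's Layout by Code library, so the following is a hypothetical Java sketch of what an LbC-style wrapper over Swing's auto-layout could look like; the class name `LbcForm` and the helper `addLabeledField` are invented for illustration. The idea is that the caller describes controls in reading order and the wrapper delegates all positioning to a layout manager.]

```java
import java.awt.GridLayout;
import javax.swing.JLabel;
import javax.swing.JPanel;
import javax.swing.JTextField;

// Hypothetical Layout-by-Code style helper: callers add controls in order,
// and all positioning is delegated to a Swing auto-layout manager.
public class LbcForm {
    // GridLayout with 0 rows and 2 columns grows one row per label/field pair.
    private final JPanel panel = new JPanel(new GridLayout(0, 2, 4, 4));

    // Adds a caption and a text field as one aligned grid row;
    // no pixel coordinates are ever specified.
    public JTextField addLabeledField(String caption) {
        JLabel label = new JLabel(caption);
        JTextField field = new JTextField();
        label.setLabelFor(field); // lets assistive technology associate the caption with the field
        panel.add(label);
        panel.add(field);
        return field;
    }

    public JPanel getPanel() {
        return panel;
    }
}
```

A caller would just write `form.addLabeledField("Name:")` for each row and let GridLayout keep the columns aligned on resize. Note that `setLabelFor` is real Swing API; wiring it up inside the wrapper is one reason coded layouts can end up more accessible than mouse-drawn ones.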
If cooking up an API for that sort of thing interests you at all, I certainly wouldn't complain.

Stefik

On Mon, Jun 14, 2010 at 8:15 AM, Jamal Mazrui <empower@xxxxxxxxx> wrote:

Congrats, Andreas, on the progress your team has made! Let me make sure you are aware that SayTools includes code for speaking through the APIs of Window-Eyes and System Access. One question I have is whether it is currently possible to create graphical user interfaces with Sodbeans. Sorry if that has been explained already.

Jamal

On 6/13/2010 4:10 PM, Andreas Stefik wrote:

Hey folks, I know some of the folks on the list (e.g., Sina, Jamal, Louie Most) have been involved in the Sodbeans project, which my team and I are designing as a prototype to show how to make programming languages and development environments easier to use for blind users. My team, which is at both Southern Illinois University Edwardsville and Washington State University, has made significant progress, and a working alpha build of our tool is nearly ready for release. As I've been working to develop this technology for almost five years now, I can't tell you how personally excited I am to finally get the software out there. It looks like our final feature set for this first release is going to include the following:

1. A talking debugger, which aurally tells the user what is happening as you debug. For example, our debugger might tell you the values of variables as they change, or whether you have called a function, created an object, or performed other actions.

2. A talking compiler, which tells you whether there are compiler errors and aurally summarizes any problems in the source code.

3. A custom programming language called Hop. Hop is a fully functioning programming language that we've designed through formal experiments in which we watch people program using audio-only environments.
Besides the typical features you would expect in a modern language (e.g., control structures, objects), in Hop accessibility is a first-class citizen. To give you an example of how Hop can help blind users: if you wanted to write a program to make your screen reader speak in C++ or Java, it would be time-consuming, and you would need some expertise in how to connect to the various screen reader architectures. In Hop, you can connect to whatever screen reader the user has loaded by saying:

say "How's it going, screen reader?"

and you will hear the TTS routed appropriately. Right now we support JAWS, NVDA, SAPI, and the Mac out of the box, and we're working on adding more readers as we go.

4. Full integration into Oracle's NetBeans IDE. The accessibility support in Sodbeans is built on the Sappy platform, which is built on the NetBeans platform. We have fixed an enormous number of accessibility problems and bugs since our Sappy 0.5.3 release and have added NVDA support (thanks, Sina!).

So, that's what we've been working on. After we release, we would love to get the community even more involved. We welcome contributions to the standard library in Hop, like classes for data structures, more screen reader support, or other features. We would also love feedback on how we can improve the user interface for the blind, or even just general opinions on where the research should go.

Thanks for listening, everyone,

Andreas Stefik, Ph.D.
Department of Computer Science
Southern Illinois University Edwardsville
__________
View the list's information and change your settings at //www.freelists.org/list/programmingblind