All,

The Window-Eyes script language is interesting. The two key aspects of script-language access for a screen reader are:
* Being able to find out what information is available via the window tree structure, instead of using the mouse to explore the screen. For example:
A dialog appears with four controls: a list box, two buttons, and an information window (a static text control). The static text updates every time you move through the list box.
By walking the dialog's window tree, you can write a script that announces the static text each time it changes as you move through the list box.
* Being able to control your speech and braille devices and to obtain useful information from the screen reader itself.
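The first point can be sketched in plain Python. This is a mock of the idea only: the WindowNode class and control names below stand in for the real dialog hierarchy and are not the Window-Eyes scripting API.

```python
# Mock window tree illustrating how a screen reader script could walk a
# dialog's controls and announce a static text control only when it changes.
# All class and method names here are illustrative assumptions.

class WindowNode:
    """One control in the dialog's window tree."""
    def __init__(self, class_name, text="", children=None):
        self.class_name = class_name
        self.text = text
        self.children = children or []

    def find(self, class_name):
        """Depth-first search for the first control of a given class."""
        if self.class_name == class_name:
            return self
        for child in self.children:
            found = child.find(class_name)
            if found:
                return found
        return None

def watch_static(dialog, last_text):
    """Return the static control's text only when it differs from last_text."""
    static = dialog.find("Static")
    if static and static.text != last_text:
        return static.text  # this is what the script would speak or braille
    return None

# The dialog from the example: a list box, two buttons, and static text.
dialog = WindowNode("Dialog", children=[
    WindowNode("ListBox", "Item 1"),
    WindowNode("Button", "OK"),
    WindowNode("Button", "Cancel"),
    WindowNode("Static", "Details for Item 1"),
])

print(watch_static(dialog, None))                  # changed: announce it
print(watch_static(dialog, "Details for Item 1"))  # unchanged: stay silent
```

A real script would hook the screen reader's update events instead of polling, but the tree walk plus change detection is the same shape.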
Note: COM access to applications depends heavily on what each application exposes. Word exposes much of the underlying document object model, so you can find out a lot about the document, e.g. the location of the cursor. Other applications do not provide this level of information. Most exposed COM is aimed at automating the application or extracting data; none of the COM I have seen so far is designed for obtaining information from the screen of an active dialog or control.
One possible advantage of the Window-Eyes approach is that application developers can use the COM interface Window-Eyes exposes to build self-talking applications.
Sean
__________
View the list's information and change your settings at http://www.freelists.org/list/programmingblind