[bct] Transcription

  • From: "Rose Combs" <rosecombs@xxxxxxxxx>
  • To: blindcooltech@xxxxxxxxxxxxx
  • Date: Tue, 1 Nov 2005 18:49:29 -0700

This is not exactly my career cast; however, the following is an article I
wrote for my office newsletter.  Some people thought it was confusing; some
said the problem was the content of the article, others said it was the
concepts in it.  It has already been published, but I'd like your opinions.  

*************
Written by Rose Combs 
Imagine you have come in to work at your desk, but your computer screen is
not visible, except for one line.  You can use your arrow keys to move
around, and you can tell that you have moved up or down a line.  This is
similar to how a screen reader works.  The two most popular screen readers
in the States are JAWS (Job Access With Speech) and Window Eyes.  The
product I have used since we moved to Windows 95 in 1999 is JAWS.  

JAWS looks at what is going on behind the scenes and then reports what the
screen is showing.  I can see a small portion of the screen at a time, or
rather, I can only hear one portion of the screen at a time.  JAWS is
controlled for the most part from the number pad on the keyboard, and from
there I can tell it to read the whole document, read the line I am working
on, or read from the edge of the screen to my current cursor position.  I
can also adjust the speech rate, change the voices used for various parts of
the program, and tell it how I specifically want each program I use to
work.  

There are various sound schemes that can be created, and some ship with the
product.  The sound scheme I most often use is active when I am on the
internet: as the page is read to me, all the links on the page are spoken in
a female voice.  Quotations are in a different voice.  Headings also come up
in different pitches of the normal JAWS voice.  Where you see colors on the
page, I hear different voices, and I could also add different sounds to
alert me to the various elements on a page.  The catch on the internet is
that all graphical links must have an alt tag so that JAWS knows how to tell
me what they are.  
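To give a rough idea (this bit of markup is my own illustration, and the file
name and label are made up, not from the original article), a graphical link
whose image is coded in HTML as

    <img src="labs.gif" alt="Lab results">

would be announced by JAWS as something like "Lab results link," while the
same image without the alt text might come through as only the file name or
simply "graphic."  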

When I press F5 to enter the interfaces, it takes about 30-45 seconds for
JAWS to read me the information: job number, date of dictation, MR number,
report type and physician.  At first it sounds like a jumble of numbers.  I
can route the JAWS review cursor to the PC cursor and then re-read the line
if necessary.  Then, when I get to the list of admissions/accounts, I listen
to most of each line before I decide whether it is the correct admission,
one line at a time at about 30 seconds per line, and I run the speech quite
fast.  

JAWS comes equipped with scripts to help with spell check but does not
detect the red underlines, so I turn that feature off.  I normally run spell
check at the end of the report.  While in spell check, JAWS will read the
misspelled word and then the first suggestion in the list.  If that is what
I want, I press <Alt><c> to change it; otherwise, I can edit the word using
the cursor, backspace and delete keys, or I press <Tab> to move to the
suggestion list and down arrow, at which point the next word in that list is
spoken and spelled.  I use the typical keyboard commands to work through the
document, including adding words to the dictionary, ignoring a word once or
always, and so forth.  

Generally, after I do a spell check I command JAWS to read me the whole
document continuously.  I run JAWS at a fast rate, approximately 500 words
per minute.  With years of experience, and even with its tendency to
mispronounce many words, I recognize errors; if at any time while listening
to the document I need to correct something, I can stop the reading, return
to where I heard the error, and correct it.  I also set JAWS to indicate
capitalized words by raising the pitch of the voice.  If I find even one
error, to my way of thinking it means better quality work, so I proofread
every report; even so, some errors may be missed, though I hope not many.  

Some other tools I use include the Braille Dorland's Speller, in seven large
volumes on my bookshelf; this particular one was copyrighted in 1965.  The
actual Dorland's dictionary from the same time frame is in 49 volumes like
the ones on my desk, but it includes the definitions.  Obviously, there
isn't enough room for them in the office.  I do have access to things in
braille now that I have bought a Braille Note, a note-taking device with a
braille display.  It can take notes and includes a calculator, word
processor, planner, address book, and more.  Because it uses compact flash
cards, I can store many books on the device in electronic braille and access
them.  

I also bought an optical character recognition (OCR) program for my home
computer, along with a scanner, then scanned some of the Stedman's word
books into my computer and loaded those files onto the compact flash card
for the Braille Note.  I can now call up, for example, the Stedman's
Surgical Word Book.  With this as an electronic file, I can perform searches
to find some of the information I need.  Other blind MTs are loading all of
the Stedman's word books on compact disc onto their computers.  

The last piece of equipment I may use is called an Optacon, which is how I
used to read my computer screen.  Essentially, when I run a small camera
over a page of print, it converts the image to a small tactile array, just
big enough for one of my fingers, and what I feel is a vibrating
representation of the printed letter.  With the Optacon you only see one
letter at a time, and in some books, if the letters are huge, you only see
part of a letter at a time.  I can probably read at about 40 words per
minute using the machine, or could years ago anyway.  My memory is also a
great tool, except that as I get older it sometimes seems a bit faulty, like
on days when words I can normally spell seem to elude me.  

Rose Combs
rosecombs@xxxxxxxxx 


