Hi, Naomi and all other ETNIans --

I am proposing a cloud-based application that communicates via iPhone between two or more people (A, B, ...). In this example, B uses sign language and A does not understand sign language.

1 - A's iPhone records a video clip of B signing and streams the video to the application.
2 - The application analyzes the video, translates the motions into speech or text, and sends the voice or text back to A's iPhone.
3 - A hears or reads the translation and uses his iPhone to send a voice or SMS response back to the application.
4 - The application converts that voice or SMS text into sign language, displayed by an animated human-like avatar on B's iPhone (or on A's iPhone, which B is now watching).
5 - Steps 1 through 4 repeat to carry on a conversation between A and B.

I hope you can suggest modifications or enhancements to this scenario that would enable a similar application to facilitate the instruction of deaf or hard-of-hearing students. For example, the avatar could be displayed on a large TV screen watched by all of the students; the teacher would have to point the iPhone at each student who is signing (or voicing) a response.

Are there any modifications that would make the system more suitable for sign language self-instruction? For teaching "baby sign language" to new parents? For use in hospital emergency rooms or other emergency situations? For converting sign language X to sign language Y, and vice versa, to facilitate communication between people who use different sign languages? Etc.

Thanks in advance for your input.

Israel "Izzy" Cohen
Petah Tikva
cohen.izzy@xxxxxxxxx
972-54-754-2744

**************************************
** Etni homepage - http://www.etni.org
** post to list - etni@xxxxxxxxxxxxx
** help - ask@xxxxxxxx
** David Lloyd: ETNI founder & manager
http://david.greenlloyd.com
***************************************
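P.S. For anyone curious how the round trip in steps 1 through 4 might hang together in software, here is a minimal sketch. Every name in it (recognize_signs, text_to_speech, render_avatar, relay_turn) is a hypothetical placeholder standing in for real sign-recognition, speech-synthesis, and avatar-animation services -- it only illustrates the flow of one conversation turn, not an actual implementation:

```python
# Hypothetical sketch of the conversation relay in steps 1-4.
# All function names are invented placeholders, not a real API.

def recognize_signs(video_frames):
    """Step 2: translate B's signed video clip into text (stubbed)."""
    return "HELLO"  # a real service would run sign-language recognition here


def text_to_speech(text):
    """Step 2: synthesize voice for A (stubbed as a tagged string)."""
    return f"<audio:{text}>"


def render_avatar(text):
    """Step 4: convert A's reply into avatar animation for B (stubbed)."""
    return f"<signing-avatar:{text}>"


def relay_turn(signed_clip, reply_text):
    """One round trip of steps 1-4: B signs, A replies."""
    translated = recognize_signs(signed_clip)  # B's signing -> text
    for_a = text_to_speech(translated)         # text -> voice for A
    for_b = render_avatar(reply_text)          # A's reply -> avatar for B
    return for_a, for_b


if __name__ == "__main__":
    audio, animation = relay_turn(signed_clip=[], reply_text="HOW ARE YOU")
    print(audio)      # what A hears
    print(animation)  # what B watches
```

The point of the sketch is that the cloud application is just a relay with two converters in the middle; each converter (recognition, speech, animation) could be swapped out independently, which is what would make variants like X-to-Y sign-language conversion possible.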