Researchers using Kinect to allow deaf people to communicate
Staff writer | Tuesday, July 23, 2013, 9:20 AM ET

Researchers from Microsoft Asia and the Institute of Computing Technology at the Chinese Academy of Sciences have developed a computer system able to translate sign-language gestures into text.
The teams working together presented the results of their research at the Faculty Summit 2013, a conference organized by Microsoft to promote information technology sharing among the academic community.
People who are deaf or hard of hearing can, of course, type words and sentences on a keyboard and read the replies typed back to them. But hearing-impaired people would like to converse through a computer in their native sign language, just as hearing people do in their spoken one. To date, most efforts at automatic sign-language translation have been less than successful, and none has proven practical. So the researchers turned to Microsoft's Kinect sensor.
Members of the teams demonstrated their system at the DemoFest portion of the conference, showcasing software that has been developed for the Kinect that successfully translates American Sign Language (ASL) into text. The system developed by the team operates in two modes. The first, called simply Translation Mode, translates physical hand or body movements into text or speech.
The second, Communication Mode, lets a person signing in ASL converse with someone typing in English. An on-screen avatar renders the typed text as signs for the deaf participant, while that participant's signed response is converted back to text and sent to the typist. The demonstration showed that the system can translate whole sentences, not just individual words, a significant step forward. ■