    New mobile apps being developed for the deaf

    Researchers at the University of the Western Cape in South Africa are developing mobile applications that will revolutionise the way deaf and hearing people communicate. They are working on several interconnected projects and programmes focusing on cellphone or mobile applications with the ability to translate spoken language into sign language and vice versa.
    Image courtesy of KROMKRATHOG / FreeDigitalPhotos.net

    The researchers are drawn from the Telkom/Cisco/Aria Technologies/THRIP Centre of Excellence for IP and Internet Computing and from two groups in the UWC computer science department: Bridging Application and Network Gaps (BANG) and Integration of Signed and Verbal Communication: South African Sign Language Recognition, Animation and Translation (SASL). The department uses a capital D in Deaf to denote a cultural grouping, one that primarily uses South African Sign Language to communicate.

    Estimates vary, but it is believed that there are up to 1.5 million people with hearing disabilities in South Africa, a significant part of the population that is frequently overlooked as a potentially economically active workforce.

    SignSupport to assist in day-to-day applications

    Head of the Computer Science Department, Associate Professor Isabella Venter, says more than 45 postgraduate students are currently working on projects relating to technology applications. She highlights a video application for cellphones called SignSupport, in which pre-recorded videos for specific situations are loaded onto the phones, enabling deaf people, for example, to interact with a pharmacist and obtain medicine without the help of an interpreter or third person.
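
    The underlying idea can be pictured as a simple lookup: each supported situation maps to an ordered set of pre-recorded sign language clips stored on the phone. The sketch below is only an illustration of that idea; the situation names, file paths and functions are invented for the example and are not the actual SignSupport code.

    # Illustrative sketch only: the data layout, situation names and file
    # paths are assumptions, not the actual SignSupport implementation.
    from dataclasses import dataclass

    @dataclass
    class VideoClip:
        path: str     # pre-recorded SASL video stored locally on the phone
        caption: str  # what the clip communicates

    # Each supported situation maps to an ordered list of clips that walk
    # the deaf user and the hearing party (e.g. a pharmacist) through it.
    SITUATIONS = {
        "pharmacy_visit": [
            VideoClip("videos/pharmacy/greeting.mp4", "Hello, I am deaf and use SASL."),
            VideoClip("videos/pharmacy/symptoms.mp4", "These are my symptoms."),
            VideoClip("videos/pharmacy/dosage.mp4", "How often do I take this medicine?"),
        ],
    }

    def play_situation(situation, play_video):
        # play_video is whatever playback callback the host app provides
        for clip in SITUATIONS[situation]:
            play_video(clip.path)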

    A key reason for the success of these projects, Professor Venter believes, is that they are inclusive: they involve large industry partners and small NGOs, as well as active engagement with ordinary South Africans with hearing impairments.

    BANG leader and UWC Associate Professor in Computer Science William Tucker says that their research and collaboration with a local deaf community have helped uncover the real needs of deaf people. One of the big misconceptions about deaf people is that they can all read and write. In reality, many deaf people are functionally illiterate, unable to read written language and literate only in sign language.

    "For years, scientists have been building things for deaf people that they don't need. Often it was a case of computer geeks looking for problems they could think up cool solutions for, but which weren't necessarily helpful to people in the real world."

    Many deaf people are also under-employed, poor and unable to read lips or afford cochlear implants.

    "Many technology applications for deaf people are based on text, but SignSupport is video-based to make it more accessible. So far, the deaf people we have involved in the research think it's great," says Tucker. SignSupport will be taken to the next level in 2014, when it is implemented at selected pharmacies, following permission from the Department of Health.

    Translating between South African Sign Language and English

    In parallel with this, the SASL group led by James Connan and Mehrdad Ghaziasgar is working on a system to translate between South African Sign Language and English. Research has focused on recognising sign language by breaking gestures down into their component parts, with work on facial expression recognition, hand shape estimation and hand tracking, among others. Work has also been done on rendering sign language using 3D avatars. These components will eventually be integrated into a fully fledged translation system that will allow deaf and hearing users to communicate using mobile phones.

    The hearing user will point the phone's camera at a person using sign language, and the phone will translate the signing and play back English speech. In the reverse direction, the deaf user will use the phone's microphone to record what is being said in English, and a 3D avatar on the phone's display will render the sign language.
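
    The flow can be sketched roughly as two pipelines, one in each direction. The function names below are placeholders standing in for the components mentioned above (hand tracking, hand shape estimation, facial expression recognition, speech recognition, avatar rendering); they are not taken from the UWC system.

    # Placeholder stubs: each would be a substantial research component.
    def track_hands(frames): return ["hand positions per frame"]
    def estimate_hand_shapes(hands): return ["hand shapes"]
    def recognise_facial_expressions(frames): return ["facial expressions"]
    def recognise_signs(shapes, faces): return ["SASL", "GLOSS", "SEQUENCE"]
    def translate_gloss_to_english(glosses): return "English sentence"
    def synthesise_speech(text): return b"audio"
    def recognise_speech(audio): return "English sentence"
    def translate_english_to_gloss(text): return ["SASL", "GLOSS", "SEQUENCE"]
    def render_avatar_animation(glosses): return "3D avatar animation"

    def sign_to_speech(camera_frames):
        # Deaf user signs at the camera; the phone plays back English speech.
        hands = track_hands(camera_frames)
        shapes = estimate_hand_shapes(hands)
        faces = recognise_facial_expressions(camera_frames)
        glosses = recognise_signs(shapes, faces)
        return synthesise_speech(translate_gloss_to_english(glosses))

    def speech_to_sign(microphone_audio):
        # Hearing user speaks English; a 3D avatar signs it on the display.
        glosses = translate_english_to_gloss(recognise_speech(microphone_audio))
        return render_avatar_animation(glosses)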

    Microsoft Research recently made headlines when it announced a similar system for Chinese Sign Language. The primary difference is that the Microsoft system is built around the company's proprietary Kinect sensor hardware, while the local system uses generic mobile phone technology.

    Cellphones offer universal access

    Natural Sciences Faculty Dean Michael Davies-Coleman says these flagship programmes show how scientists can engage directly with society and improve lives. The mobile technology applications have been designed so that they can be used by people in other countries and support oral recordings, making them internationally competitive; but it is right here in the developing world that the technologies stand to have the greatest impact.

    "Cellphones have become ubiquitous, even in poor communities. Therefore, this has implications for especially rural areas, where people have amazing phones but where literacy levels are low. As one goes north on the continent, this will probably only become more pronounced," says Tucker.

    This is where UWC has real potential to make a difference. While considerable work is being done all over the world on technology applications for deaf people, very little of it is aimed at people with disabilities in developing regions. "We have become known world-wide for the work we are doing now," says Tucker.

    Professor Venter says their research has already piqued the interest of American researchers who are keen to use the UWC-developed technology in their work. "Alternative forms of communication as demonstrated by these projects can make a real and measurable difference for individuals and more broadly for economies and countries," she concludes.

    For more information, go to http://repository.uwc.ac.za/xmlui/handle/10566/51.
