Ostend

Providing real-time translation of sign language into text and speech for the deaf, achieved entirely through machine vision.

Background

The App is the culmination of my Design Studio course at university, in which each student was required to come up with a new start-up idea that would be feasible, viable and practical.


The concept was for the App to empower Hong Kong's deaf community by giving its members the freedom to communicate whenever and wherever they need to.

Overview

Time

  • May 2020

  • Junior year University project "Design Studio"

Role

  • UX/UI Design

Tool

  • Adobe XD

  • Microsoft PowerPoint


How might we help the Deaf feel understood (again)?

For the deaf, communicating with hearing people has always been a serious challenge, especially when interpreters are in short supply. In Hong Kong, the interpreter-to-user ratio was 1:10,000 in 2013 and 1:3,000 in 2018.
This difficulty in communication creates further barriers to education and employment. As of 2014, 47.2% of deaf people were unemployed, and those who did find work were largely limited to roles requiring little communication with customers, such as barista and dishwashing jobs.
Access to education was also restricted by the prevalence of the oralist education system; only 6.1% of deaf people in Hong Kong had access to tertiary education as of 2014, perpetuating a vicious cycle of low quality of life for the community.

What's wrong with what we have now?

With a sizable market, the deaf community does have options when it comes to online interpreting. These options are built on a platform model in which users are connected to an interpreter reserved in advance.
However, such models have proven inconvenient and inflexible, mainly because interpreters are specialized and must be booked ahead of time.

How it works

The App's main aim is to provide real-time sign-to-text/speech and speech-to-text communication. It uses AI to track the movements of a deaf user's fingertips and recognize the "shapes" of the user's hands, producing interpreted text on screen. In theory, a similar AI algorithm could also power add-ons for video-conferencing applications on desktops and tablets.
With the App, the deaf can now be free of the communication restraints caused by the lack of interpreters.
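To make the idea above concrete, here is a minimal, purely illustrative sketch of the recognition step: classifying a hand "shape" by nearest-neighbour matching of normalized landmark coordinates. The landmark points, the two-sign vocabulary, and the function names are all invented for illustration; a real app would obtain landmarks from a hand-tracking model and use a far larger, trained vocabulary.

```python
# Illustrative sketch only: classify a hand "shape" by nearest-neighbour
# matching of normalized hand-landmark coordinates. All data here is a
# made-up placeholder; a real pipeline would get landmarks from a
# hand-tracking model frame by frame.
import numpy as np

def normalize(landmarks: np.ndarray) -> np.ndarray:
    """Translate to the wrist and rescale so hand size doesn't matter."""
    centred = landmarks - landmarks[0]  # assume landmark 0 is the wrist
    scale = np.linalg.norm(centred, axis=1).max()
    return centred / scale if scale > 0 else centred

def classify(landmarks: np.ndarray, templates: dict) -> str:
    """Return the sign label whose template pose is closest to the input."""
    pose = normalize(landmarks)
    return min(templates,
               key=lambda label: np.linalg.norm(pose - normalize(templates[label])))

# Toy two-sign vocabulary: five 2-D points per hand (hypothetical values).
templates = {
    "open_hand": np.array([[0, 0], [1, 2], [0, 3], [-1, 2], [-2, 1]], float),
    "fist":      np.array([[0, 0], [0.3, 0.4], [0, 0.5], [-0.3, 0.4], [-0.4, 0.2]], float),
}
sample = np.array([[0, 0], [1.1, 2.1], [0, 2.9], [-0.9, 2.0], [-2.1, 1.0]], float)
print(classify(sample, templates))  # → open_hand
```

In this sketch, normalization makes the match invariant to where the hand is in the frame and how large it appears, which is one reason landmark-based recognition is attractive for a mobile camera app.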

Potential Add-ons

Other features were considered, but were ultimately dropped due to limited viability and feasibility. They include: 

  • Fig. 1-3: music interpretation (visualization): the concept was to identify a piece of music and visualize it with patterns and colours, provoking emotions similar to watching a live music interpretation.

  • Fig. 4: smart home control, for deaf users living in smart homes: by interpreting sign language, the App would let the user control IoT devices such as lights.

  • Fig. 5: a job portal supporting employment for the deaf by letting them view job listings from deaf-friendly companies.

Notes

This individual project was completed during finals week, while five other courses also demanded my attention.
While I am proud of the idea (especially after learning from a TV report that the Government was developing something similar), I regret that I could not spare enough time for further research. In particular, primary research was missing: the secondary research used in this project was not enough to generate powerful insights, which could have led to a better solution. Regardless, I learnt more about a community that is often ignored by society, and came to at least partially understand the hardships that deaf people endure.
