- Abstract
- Motivation
- Problem Statement
- Solution Proposal
- Objective
- Technology Stack
- Use Case Diagram
- Evaluation Metrics
For people with hearing or speech impairments, it is very difficult to communicate with others who aren't familiar with sign languages. Speech- and hearing-impaired people form a significant part of society; they can be our loved ones, friends, family, or colleagues. People who aren't familiar with sign languages cannot understand what a person using sign language wants to convey. This leads to a large communication gap between two equally important communities of society, which in turn leads to various inequalities and a lack of opportunities for such specially-abled people.
To reduce this communication gap, and in turn the inequality, we have tried to come up with a solution: a platform that interprets and translates sign language gestures for its users, bridging the gap between the community that uses sign languages and the community that is not familiar with them.
The United Nations' Department of Economic and Social Affairs – Disability has launched a campaign known as 'Envision 2030', whose tagline is '17 goals to transform the world for persons with disabilities'. In September 2015, the General Assembly adopted the 2030 Agenda for Sustainable Development, which includes 17 Sustainable Development Goals (SDGs). Building on the principle of "leaving no one behind", the new Agenda emphasizes a holistic approach to achieving sustainable development for all.
Although the word "disability" is not cited directly in all of the goals, the goals are indeed relevant to ensuring the inclusion and development of persons with disabilities.
Out of those 17 SDGs, we have chosen to work on SDG 10, 'Reduced Inequality'. In this project, when we refer to inequality, we mean the inequality that people with hearing and speech impairments face in our society. One of the factors behind this inequality is the huge communication gap between such specially-abled communities and the rest of society. This gap exists because the medium of communication for such individuals is sign language, while many people are not familiar with sign languages. This communication gap leads to inequality in various circumstances. For instance, the rights of deaf individuals are regularly violated in places such as movie theaters, classrooms, and even online. There are cases where deaf people have died after a hospital withheld a critical medical diagnosis, or where ambitious students were denied access to medical school because the institution did not fulfill its legal obligation to provide an interpreter.
To reduce this inequality and injustice, we have tried to come up with a solution.
Creating a platform for sign language interpretation and translation.
A web application wherein, when the user points the webcam towards a person performing sign language gestures, the application interprets the gestures and translates them for the user via text as well as audio.
Artificial Intelligence is ever-evolving and has been widely applied in various domains. Our web application 'UNITE MOJO' is itself based on artificial intelligence and computer vision; these technologies form the core of this project. We use Artificial Intelligence to create and train a model that recognizes sign language gestures, and Computer Vision to capture live input via the webcam and feed it to the model. That is the general rationale behind choosing these technologies.
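As a rough illustration of this pipeline, the sketch below captures webcam frames with OpenCV, runs each frame through a TensorFlow model, and overlays the predicted gesture. The model file name, input size, and label list are placeholders for illustration, not the actual project assets.

```python
# Minimal sketch of the recognition loop, assuming a trained Keras model
# saved as "sign_model.h5" and a LABELS list matching its output classes
# (both are illustrative placeholders, not the actual project files).
import cv2
import numpy as np
import tensorflow as tf

LABELS = ["hello", "thanks", "yes", "no"]  # hypothetical gesture classes
model = tf.keras.models.load_model("sign_model.h5")

cap = cv2.VideoCapture(0)  # open the default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Resize and normalize the frame to the model's assumed input shape.
    img = cv2.resize(frame, (224, 224)).astype("float32") / 255.0
    probs = model.predict(img[np.newaxis, ...], verbose=0)[0]
    gesture = LABELS[int(np.argmax(probs))]
    # Overlay the predicted gesture text on the live video feed.
    cv2.putText(frame, gesture, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("UNITE MOJO", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```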
- Programming Languages:
- Python
- JavaScript
- HTML5
- CSS
- Frameworks & Libraries:
- OpenCV (v4.4.0)
- TensorFlow (v2.3.1)
- Flask (v1.1.2)
- pyttsx3 (v2.7)
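As a minimal sketch of how these pieces might fit together on the server side, the Flask route below voices a recognized gesture with pyttsx3. The endpoint name and JSON payload shape are assumptions for illustration, not the project's actual API.

```python
# Minimal sketch wiring the listed stack together: Flask serves an
# endpoint that voices a recognized gesture via pyttsx3.
# The "/speak" route and {"text": ...} payload are assumptions.
from flask import Flask, request, jsonify
import pyttsx3

app = Flask(__name__)
engine = pyttsx3.init()  # text-to-speech engine

@app.route("/speak", methods=["POST"])
def speak():
    text = request.get_json().get("text", "")
    engine.say(text)     # queue the translated gesture text
    engine.runAndWait()  # speak it aloud
    return jsonify({"spoken": text})

if __name__ == "__main__":
    app.run(debug=True)
```

In a setup like this, the browser-side JavaScript would POST each recognized gesture's text to the endpoint, while the text translation is shown directly on the page.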