# SignComm
# App URL
http://localhost:8000/SignCommController.html — this must be localhost for now, because the Leap Motion sensor is connected to the local machine.
## Abstract
### Problem
Natural language channels such as speech, script, body language (hand or head gestures), facial expressions, and lip motion are the main means of interaction for human beings. These channels are easy for most people to use, but they can be difficult for people with speech and hearing disabilities, so special sign languages exist for them. However, most people without such disabilities have difficulty understanding these sign languages.
### Solution
We implement a user interface that takes signs as input from people with speech or hearing disabilities, either by capturing images or through sensors. These inputs are interpreted using existing algorithms and matched against the symbols of a language stored in our database, with American Sign Language (ASL) as the reference. Other users can then understand the signer with the help of these symbols. We name our application "SignComm".
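To illustrate the matching step, below is a minimal sketch of how sensor readings could be mapped to ASL symbols. The feature encodings, the `SYMBOL_DB` dictionary, and the `classify` function are all illustrative assumptions for this sketch, not the project's actual database or algorithm.

```python
# Sketch: map a gesture feature vector (e.g. finger-extension values from a
# hand-tracking sensor such as the Leap Motion) to an ASL letter by
# nearest-neighbor lookup against a small symbol "database".
# The vectors below are illustrative placeholders, not real ASL data.

# Hypothetical database: each ASL letter paired with a 5-element vector of
# finger-extension values (thumb..pinky; 0 = curled, 1 = extended).
SYMBOL_DB = {
    "A": (1, 0, 0, 0, 0),   # fist with thumb alongside
    "B": (0, 1, 1, 1, 1),   # four fingers extended, thumb folded
    "L": (1, 1, 0, 0, 0),   # thumb and index extended
    "Y": (1, 0, 0, 0, 1),   # thumb and pinky extended
}

def classify(features):
    """Return the ASL letter whose stored vector is closest to `features`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(SYMBOL_DB, key=lambda letter: dist(SYMBOL_DB[letter], features))

# Example: a noisy sensor reading close to the stored vector for "L"
print(classify((0.9, 0.8, 0.1, 0.0, 0.1)))  # -> L
```

A real pipeline would replace the hand-coded lookup with a trained classifier over richer sensor features, but the overall flow (capture features, match against the symbol database, display the symbol) is the same.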
## User Stories
- As a doctor, I want to understand my patients' problems correctly through the sign language converter so that I can provide proper medication.
- As a user, I want people around me to understand my sign language by having it translated into a readable note.
- As a user, I want to understand what a person with a speech disability is trying to communicate.
## Personas
- People with speech impairments
- General public
## Architecture Diagram
## Project Report
https://github.com/SJSU272Lab/SignComm/blob/master/FinalProject/SignCommReport-2.pdf
## Team
https://github.com/ramyamariappan
https://github.com/Apoorva-Davu
https://github.com/aaamani
https://github.com/saisahitya