This project uses hand gestures to control a robotic gripper. The gestures are captured by a webcam, classified into basic commands, and then transmitted via USB to an Arduino board, which translates them into action. A deep learning neural network is trained on the gestures, and the resulting model is used to predict the meaning of each gesture.
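As a minimal sketch of the pipeline's last step, the snippet below maps a predicted gesture label to a one-byte command suitable for sending over USB serial. The label names, byte codes, and serial port shown here are assumptions for illustration, not part of the project:

```python
# Hypothetical mapping from predicted gesture labels to single-byte
# commands understood by the Arduino sketch (labels and codes are assumptions).
COMMANDS = {
    "open": b"O",
    "close": b"C",
    "stop": b"S",
}

def encode_command(label: str) -> bytes:
    """Translate a classifier's gesture label into a serial command byte."""
    try:
        return COMMANDS[label]
    except KeyError:
        raise ValueError(f"unknown gesture label: {label!r}")

# On the host, the byte would then be written to the board over USB,
# for example with pyserial (the port name is an assumption):
#   import serial
#   with serial.Serial("/dev/ttyACM0", 9600) as port:
#       port.write(encode_command("open"))
```

On the Arduino side, a matching sketch would read one byte from `Serial` and drive the gripper accordingly.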
To learn more about this project, it is strongly suggested that you visit the Wiki: all of the project's documentation is on its wiki pages.
You can visit the project's main page.
Brazilian readers: visit the Brazilian (Portuguese-language) wiki.