Gesture Controlled Virtual Mouse simplifies human-computer interaction through hand gestures and voice commands, so the computer requires almost no direct contact. All I/O operations can be controlled virtually using static and dynamic hand gestures together with a voice assistant. The project uses state-of-the-art machine learning and computer vision algorithms, built on MediaPipe and OpenCV, to recognize hand gestures and voice commands, and it works smoothly without any additional hardware. It consists of two modules: one that works directly on bare hands using MediaPipe Hand detection, and another that uses gloves of any uniform color. Currently it works on the Windows platform.
Note: Use Python version: 3.8.0
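As a rough illustration of how the hand module obtains landmarks, here is a minimal sketch of hand detection with MediaPipe and OpenCV. The webcam index, confidence thresholds, and Esc-to-quit handling are assumptions for the sketch; the project's actual pipeline adds gesture classification on top of these landmarks.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam (assumption)
with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.7,
                    min_tracking_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror the view so hand movement feels natural
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # 21 landmarks per hand; gestures are classified from these points.
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Gesture Controlled Virtual Mouse - landmarks", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```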
### Gesture Recognition:
#### Move Cursor
The cursor is assigned to the midpoint of the index and middle fingertips. This gesture moves the cursor to the desired location; the speed of the cursor is proportional to the speed of the hand.
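A minimal sketch of the idea, assuming MediaPipe hand landmarks (index fingertip = landmark 8, middle fingertip = landmark 12) and pyautogui for cursor control. The helper name and the direct coordinate mapping are illustrative; a real implementation would typically add smoothing or velocity-based scaling.

```python
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()

def move_cursor(hand_landmarks):
    """Move the OS cursor to the midpoint of the index and middle fingertips."""
    index_tip = hand_landmarks.landmark[8]    # INDEX_FINGER_TIP
    middle_tip = hand_landmarks.landmark[12]  # MIDDLE_FINGER_TIP
    # Landmarks are normalized to [0, 1]; take the midpoint and scale to screen size.
    x = (index_tip.x + middle_tip.x) / 2
    y = (index_tip.y + middle_tip.y) / 2
    pyautogui.moveTo(int(x * SCREEN_W), int(y * SCREEN_H), duration=0)
```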
#### Scrolling (Scroll Up / Scroll Down)
Dynamic gestures for horizontal and vertical scrolling. The scroll speed is proportional to the distance the pinch gesture moves from its start point; vertical and horizontal scrolling are controlled by vertical and horizontal pinch movements respectively.
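A minimal sketch of distance-proportional scrolling with pyautogui. The coordinates are assumed to be normalized landmark positions of the pinch, the scale factor and dead zone are illustrative, and horizontal scrolling is emulated with Shift+wheel, which most Windows applications accept; the project may implement this differently.

```python
import pyautogui

def pinch_scroll(start_x, start_y, current_x, current_y, dead_zone=0.05, scale=1000):
    """Scroll by an amount proportional to the pinch's displacement from its start point."""
    dx = current_x - start_x
    dy = current_y - start_y  # image coordinates: y grows downwards
    if abs(dy) > abs(dx) and abs(dy) > dead_zone:
        # Vertical pinch movement -> vertical scroll; farther from the start point scrolls faster.
        pyautogui.scroll(int(-dy * scale))
    elif abs(dx) > dead_zone:
        # Horizontal pinch movement -> horizontal scroll via Shift+wheel (assumption).
        pyautogui.keyDown('shift')
        pyautogui.scroll(int(dx * scale))
        pyautogui.keyUp('shift')
```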
#### Drag and Drop
Gesture for drag-and-drop functionality. It can be used to move or transfer files from one directory to another.
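A minimal sketch of how drag and drop can be driven from gesture detection, assuming pyautogui: the left button is held while the drag gesture is active and released on drop, with cursor movement handled as above in between. The state flag and function name are illustrative.

```python
import pyautogui

dragging = False

def handle_drag_gesture(drag_detected: bool):
    """Hold the left mouse button while the drag gesture is active; release it on drop."""
    global dragging
    if drag_detected and not dragging:
        pyautogui.mouseDown(button='left')   # grab the item under the cursor
        dragging = True
    elif not drag_detected and dragging:
        pyautogui.mouseUp(button='left')     # drop it at the current cursor position
        dragging = False
```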
### Voice Assistant:
The voice assistant supports the following commands (a minimal sketch follows this list):
- Opening the VIRTUAL MOUSE APP
- Getting the current date and time
- Searching on the web
- Searching a location in Google Maps
- Opening and navigating the file structure of the local PC
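The sketch below shows one way such commands could be recognized and dispatched, using the third-party SpeechRecognition package with Google's free recognizer, plus the standard-library webbrowser, datetime, and os modules. The keyword matching, URLs, and folder handling are illustrative assumptions, and launching the app itself is omitted; the project's own assistant may work differently.

```python
import datetime
import os
import webbrowser

import speech_recognition as sr  # pip install SpeechRecognition (plus PyAudio for the mic)

def listen():
    """Capture one voice command from the default microphone and return it as lowercase text."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source, duration=0.5)
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio).lower()
    except (sr.UnknownValueError, sr.RequestError):
        return ""

def handle(command):
    """Dispatch a recognized command to one of the features listed above."""
    if "date" in command or "time" in command:
        print(datetime.datetime.now().strftime("%d %B %Y, %H:%M"))
    elif "maps" in command:
        place = command.split("maps")[-1].strip()
        webbrowser.open("https://www.google.com/maps/search/" + place)
    elif "search" in command:
        query = command.replace("search", "", 1).strip()
        webbrowser.open("https://www.google.com/search?q=" + query)
    elif "open" in command:
        # Very rough file navigation: open a folder under the user's home directory.
        folder = command.replace("open", "", 1).strip()
        path = os.path.join(os.path.expanduser("~"), folder)
        if os.path.isdir(path):
            os.startfile(path)  # Windows-only, matching the project's target platform

if __name__ == "__main__":
    handle(listen())
```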