In this project I have created a Virtual AI Keyboard that tracks your hand gestures and positions to execute typing commands.
- cvzone (comes with numpy and opencv)
- mediapipe (the backend for the hand-tracking module).
- opencv-python
- numpy
- cvzone
- pynput
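
All of the packages above are available on PyPI, so something like the following should install them (a minimal sketch; pin versions as needed for your setup):

```bash
pip install opencv-python numpy cvzone pynput mediapipe
```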
- Uses OpenCV and MediaPipe to detect and track the hand via the "Hand Landmark Model" (see the detection sketch after this list).
- OpenCV is a library for computer vision applications. With OpenCV we can build a large number of applications that work well in real time; it is mainly used for image and video processing.
- MediaPipe is a framework mainly used for building pipelines that process audio, video, or other time-series data. With the MediaPipe framework, we can build very effective pipelines for different media-processing tasks.
- The Hand Landmark Model locates 21 hand-knuckle landmarks on each detected hand, which this project uses to follow the fingertips.
- To simulate a click (key press), bring the tips of your index and middle fingers together (see the click sketch below).
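
A minimal detection loop built on cvzone's HandDetector (a sketch only; it assumes a default webcam at index 0 and cvzone >= 1.5, where `findHands()` returns both the detected hands and the annotated image):

```python
import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)                          # open the default webcam
detector = HandDetector(detectionCon=0.8, maxHands=1)

while True:
    success, img = cap.read()
    if not success:
        break
    hands, img = detector.findHands(img)           # detect the hand and draw its 21 landmarks
    if hands:
        lmList = hands[0]["lmList"]                 # 21 landmark points, each an [x, y, z] pixel coordinate
        # lmList[8] is the index fingertip, lmList[12] the middle fingertip
    cv2.imshow("Virtual Keyboard", img)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```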
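And a minimal sketch of the click gesture itself, wired to pynput so the "pressed" key is actually typed. Here `lmList` is the landmark list from the loop above; the helper name `press_key_on_click` and the 30-pixel threshold are illustrative choices, not taken from the project:

```python
import math
from pynput.keyboard import Controller

keyboard = Controller()

def press_key_on_click(lmList, key_char, threshold=30):
    """Press `key_char` when the index and middle fingertips touch."""
    x1, y1 = lmList[8][0:2]      # (x, y) of the index fingertip
    x2, y2 = lmList[12][0:2]     # (x, y) of the middle fingertip
    distance = math.hypot(x2 - x1, y2 - y1)
    if distance < threshold:     # fingers pinched together -> register a key press
        keyboard.press(key_char)
        keyboard.release(key_char)
        return True
    return False
```

In the full application, the fingertip position is first matched against the on-screen key layout to decide which character `key_char` should be, and the click gesture then triggers the press.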