
Virtual-AI-Keyboard

In this project I have created a Virtual AI Keyboard that tracks your hand gestures and positions to execute typing commands.

Useful Libraries:-

  • cvzone (installs numpy and opencv as dependencies)
  • mediapipe (backend for the hand-tracking module)

Requirements:-

  • opencv-python
  • numpy
  • cvzone
  • pynput
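The requirements above can be installed with pip, for example (assuming a standard Python environment; the repo does not pin exact versions, and mediapipe is listed separately above):

```shell
# Install the project's dependencies (cvzone pulls in numpy and opencv)
pip install opencv-python numpy cvzone pynput mediapipe
```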

HandTrackingModule:-

  • Uses OpenCV and MediaPipe to detect and track the hand using the "Hand Landmark Model".
  • OpenCV is a library for computer vision applications. With the help of OpenCV, we can build a large number of applications that work well in real time; it is mainly used for image and video processing.
  • MediaPipe is a framework for building pipelines that process audio, video, or other time-series data. With the help of the MediaPipe framework, we can build very impressive pipelines for different media-processing functions.
  • (Hand Landmark Model diagram)
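The Hand Landmark Model reports 21 landmarks per detected hand. As a minimal sketch of how a hand-tracking module might pick out the fingertip positions it needs (the indices follow MediaPipe's standard 21-point numbering; the function name and dummy data are illustrative, not from this repo):

```python
# Fingertip indices in MediaPipe's 21-point hand landmark model
FINGERTIPS = {
    "thumb": 4,
    "index": 8,
    "middle": 12,
    "ring": 16,
    "pinky": 20,
}

def fingertip_positions(landmarks):
    """Map fingertip names to (x, y) pixel positions.

    `landmarks` is a list of 21 (x, y) tuples, as a hand-tracking
    module built on MediaPipe would provide per detected hand.
    """
    return {name: landmarks[i] for name, i in FINGERTIPS.items()}

# Example with dummy landmark data (21 points)
lms = [(i * 10, i * 5) for i in range(21)]
tips = fingertip_positions(lms)
print(tips["index"])  # → (80, 40)
```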

Click:-

  • To simulate a click, bring the index and middle fingers of your hand together.
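That gesture can be detected by thresholding the pixel distance between the index and middle fingertips. A minimal sketch (the 30-pixel threshold and the function name are illustrative assumptions, not values taken from this repo):

```python
import math

def is_click(index_tip, middle_tip, threshold=30):
    """Return True when the index and middle fingertips are close
    enough together to count as a click gesture."""
    dx = index_tip[0] - middle_tip[0]
    dy = index_tip[1] - middle_tip[1]
    return math.hypot(dx, dy) < threshold

print(is_click((100, 100), (110, 105)))  # fingers together → True
print(is_click((100, 100), (200, 100)))  # fingers apart    → False
```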

Keyboard Layout:-

(keyboard layout image)
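A virtual keyboard layout like this is typically stored as rows of key labels, with each key's on-screen rectangle computed from its row and column so a fingertip position can be mapped back to a key. A minimal sketch (the QWERTY rows, key size, and spacing shown are assumptions, not this repo's exact layout):

```python
# Rows of the on-screen keyboard (an assumed QWERTY layout)
ROWS = [
    list("QWERTYUIOP"),
    list("ASDFGHJKL"),
    list("ZXCVBNM"),
]
KEY_SIZE = 80  # width/height of each key in pixels (assumed)
GAP = 10       # spacing between keys in pixels (assumed)

def key_rects():
    """Return a dict mapping each key label to its (x, y, w, h) rectangle."""
    rects = {}
    for r, row in enumerate(ROWS):
        for c, key in enumerate(row):
            x = GAP + c * (KEY_SIZE + GAP)
            y = GAP + r * (KEY_SIZE + GAP)
            rects[key] = (x, y, KEY_SIZE, KEY_SIZE)
    return rects

def key_at(point, rects):
    """Return the key label under a fingertip position, or None."""
    px, py = point
    for key, (x, y, w, h) in rects.items():
        if x <= px <= x + w and y <= py <= y + h:
            return key
    return None

rects = key_rects()
print(key_at((15, 15), rects))  # → Q (top-left key)
```

In the full app, a click gesture detected over one of these rectangles would then be forwarded to the OS via pynput's `keyboard.Controller().press(...)`.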

Demonstration Video:-

Virtual.Keyboard.Demo.Video.mp4