The AI Hand Gesture-Controlled Virtual Mouse replaces a computer's physical mouse with hand gestures, improving accessibility and the user experience. Using a camera and machine-learning models, the system interprets hand gestures and translates them into mouse commands.

The program continually captures real-time frames and runs them through a series of cleaning and conversion steps. It then applies image-processing techniques to extract the coordinates of the targeted hand gesture from each converted frame. The detected gesture is compared against a predefined list in which different gesture combinations correspond to distinct mouse functions. When a match is found, the program executes the associated mouse function as an actual command on the user's machine. Users can move the pointer, click, and scroll without physical contact, making for a more immersive style of human-computer interaction.
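The "compare against a predefined list" step described above amounts to a lookup from recognized gestures to mouse actions. A minimal sketch is shown below; the gesture names and action labels are illustrative assumptions, not the project's actual identifiers.

```python
from typing import Optional

# Hypothetical mapping from recognized gestures to mouse functions.
# In the real program, each action would be carried out via PyAutoGUI.
GESTURE_ACTIONS = {
    "fist_release": "left_click",
    "peace_sign": "right_click",
    "double_fist": "double_click",
    "circular_motion": "scroll",
}

def dispatch(gesture: str) -> Optional[str]:
    """Return the mouse action for a detected gesture, or None if unmatched."""
    return GESTURE_ACTIONS.get(gesture)
```

Keeping the mapping in a plain dictionary makes it easy to support the customization feature: user-defined gestures simply add or override entries.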
Gesture Recognition: Recognizes a variety of predefined hand gestures for controlling the virtual mouse.
Real-time Feedback: Provides immediate visual and/or auditory feedback to users for successful gesture recognition.
Customization: Allows users to customize gestures, sensitivity, and other settings to suit individual preferences.
Compatibility: Works seamlessly across major operating systems, including Windows, macOS, and Linux.
Move Mouse: Move your hand in the camera's view to move the mouse pointer.
Left Click: Hold your hand in a fist, then release to perform a left-click.
Right Click: Hold your hand with two fingers extended (like a peace sign) to perform a right-click.
Double Click: Perform two quick fist gestures to emulate a double-click.
Scroll: Rotate your hand in a circular motion to scroll up and down.
-> Python 3.x
-> OpenCV
-> Mediapipe
-> Numpy
-> PyAutoGUI
-> A webcam (internal or external)
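With this stack, moving the pointer reduces to mapping MediaPipe's normalized fingertip coordinates (0..1) onto screen pixels, usually with some smoothing to damp camera jitter. The sketch below uses simple exponential smoothing; the screen size and smoothing factor are assumptions, and in the real program the result would be fed to `pyautogui.moveTo()` with the resolution taken from `pyautogui.size()`.

```python
class PointerMapper:
    """Map normalized (0..1) hand coordinates to smoothed screen pixels."""

    def __init__(self, screen_w, screen_h, alpha=0.3):
        self.screen_w, self.screen_h = screen_w, screen_h
        self.alpha = alpha            # 0 < alpha <= 1; lower = smoother
        self._x = self._y = None      # last smoothed position

    def update(self, nx, ny):
        """Convert normalized (nx, ny) to a smoothed pixel position."""
        x, y = nx * self.screen_w, ny * self.screen_h
        if self._x is None:           # first frame: no history to smooth with
            self._x, self._y = x, y
        else:                         # exponential moving average
            self._x += self.alpha * (x - self._x)
            self._y += self.alpha * (y - self._y)
        return int(self._x), int(self._y)
```

The `alpha` parameter is the kind of sensitivity setting the Customization feature could expose to users.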