A module to detect hand gestures in a camera's live feed using ML techniques, with the aim of controlling and navigating a system interface

sahsudhir231/Hand-Motion-Identification-for-System-Navigation-and-Control

Hand Gesture Recognition, or Hand Motion Identification for System Navigation and Control

A real-time video feed is analyzed for hand recognition, and a model is then trained to recognize gestures made with the hands. The live video is fed through the camera of devices connected to the model, such as laptops and phones. The model uses machine learning techniques to train on and predict the data, i.e. hand gestures. Python will be the dominant programming language, along with Java and web languages for web deployment of the model. The model can also be deployed on Android phones. The output of the model can be used to control and direct various interfaces, such as the mouse on computers, joysticks in gaming, and even a virtual simulation of the hand.

Detailed Report

INTRODUCTION:

The project is titled “Hand Motion Identification for System Navigation and Control”. In this project, a real-time video feed is analyzed for hand recognition, and a model is then trained to recognize gestures and motions made with the hands. The live video is fed through the camera of devices connected to the model, such as laptops and phones. The model uses machine learning techniques to train on and predict the data, i.e. hand gestures. Python will be the dominant programming language, along with Java and web languages for web deployment of the model. The model can also be deployed on Android phones. The output of the model can be used to control and direct various interfaces, such as the mouse on computers and joysticks in gaming.
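To illustrate the final control step, the sketch below maps recognized gesture labels to interface actions. The gesture names and actions are assumptions chosen for illustration, not the project's actual mapping; a real deployment would forward the action to the operating system (e.g. to move the mouse).

```python
# Hypothetical mapping from recognized gesture labels to interface actions.
# Label names and actions are illustrative assumptions only.
GESTURE_ACTIONS = {
    "open_palm": "move_cursor",
    "fist": "click",
    "swipe_left": "previous_window",
}

def dispatch(gesture_label):
    """Return the interface action for a recognized gesture, or a no-op."""
    return GESTURE_ACTIONS.get(gesture_label, "no_op")

print(dispatch("fist"))       # -> click
print(dispatch("thumbs_up"))  # -> no_op (unknown gesture)
```

Keeping the gesture-to-action table separate from the recognition model makes it easy to retarget the same gestures at a different interface (mouse, joystick, or a web front end).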

OBJECTIVE:

We will experiment to see whether various hand gestures and motions can be used to control interfaces on a system, and analyze how effective this method of control is. Various methods of hand recognition will also be compared, with the result being the method best suited to the task at hand.

STEPS:

The initial data is the video feed from the camera, captured at a suitable frame rate. The images fetched from the video go through a series of pre-processing steps, such as grayscale conversion, image correction, and binary image conversion. This data is then used to recognize the hand, and subsequently to detect motion and gestures in the image, using libraries such as OpenCV and scikit-learn. The resulting identification data can be used to control interfaces on various systems by associating each gesture or hand motion with an action in the system interface.

LANGUAGES AND LIBRARIES USED:

The programming language to be used predominantly is “Python”, likely version 3. Libraries such as OpenCV, scikit-learn, and NumPy will be used with Python. The language used to control interfaces on various systems will be “Java”. If we decide to deploy the model using web technologies, various web languages may also be used.
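As a minimal illustration of the scikit-learn prediction step, the sketch below trains a tiny k-nearest-neighbors classifier on made-up feature vectors. The feature names, values, and gesture labels are assumptions for illustration; the project's actual features would come from the processed frames:

```python
# Toy gesture classifier with scikit-learn; data and labels are illustrative.
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical features: [extended_fingers, mean_region_intensity]
X_train = [[5, 0.9], [5, 0.8], [0, 0.2], [0, 0.3]]
y_train = ["open_palm", "open_palm", "fist", "fist"]

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)

predictions = clf.predict([[4, 0.7], [1, 0.25]])
print(predictions)  # -> ['open_palm' 'fist']
```

Any scikit-learn classifier with the same `fit`/`predict` interface (e.g. an SVM) could be swapped in when comparing recognition methods, as the objective above describes.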
