Welcome to this project tutorial on Hand Gesture Classification Using Python. The goal of this project is to train a machine learning algorithm capable of classifying images of different hand gestures, such as a fist, a palm, a thumbs-up, and others. This kind of classification can be useful for gesture-based navigation, for example.
DATASET
A hand gesture recognition database is used, composed of a set of near-infrared images acquired by the Leap Motion sensor. The database contains 10 different hand gestures performed by 10 different subjects (5 men and 5 women).
Firstly, we have to import a few Python packages that we will need to work with images and arrays.
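For reference, a minimal sketch of the imports used throughout this tutorial could look like the following (the exact set in the original notebook may differ):

```python
import os

import numpy as np                     # arrays and numeric operations
from PIL import Image                  # reading and resizing the gesture images
from sklearn.decomposition import PCA              # dimensionality reduction
from sklearn.preprocessing import StandardScaler   # feature normalization
from sklearn.model_selection import train_test_split
```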
LOAD DATA
With the above dataset at hand, we can now start preparing the images to train the models. We load all the images into an array called X and all the labels into another array called y. The array Z contains the images as they appear in the dataset, while X contains the binarised versions of the images in Z.
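The exact loading code depends on how the dataset is laid out on disk. A sketch, assuming the Kaggle "leapGestRecog" folder layout and a simple fixed binarisation threshold (both assumptions, not taken from the original post):

```python
# Assumed layout: leapGestRecog/<subject>/<gesture_folder>/<image>.png
# Z holds raw grayscale images, X their binarised versions, y the gesture labels.
DATA_DIR = "leapGestRecog"   # hypothetical path; adjust to your setup

Z, X, y = [], [], []
for subject in sorted(os.listdir(DATA_DIR)):
    subject_dir = os.path.join(DATA_DIR, subject)
    if not os.path.isdir(subject_dir):
        continue
    for gesture in sorted(os.listdir(subject_dir)):
        gesture_dir = os.path.join(subject_dir, gesture)
        label = int(gesture.split("_")[0])  # e.g. "01_palm" -> 1
        for fname in os.listdir(gesture_dir):
            img = Image.open(os.path.join(gesture_dir, fname)).convert("L")
            img = img.resize((64, 64))            # shrink to keep training fast
            arr = np.asarray(img, dtype=np.uint8)
            Z.append(arr)
            X.append((arr > 100).astype(np.uint8))  # assumed fixed threshold
            y.append(label)

Z = np.array(Z)
X = np.array(X)
y = np.array(y)
```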
Now that we have converted all the pixels into their corresponding numbers, each image is still a multidimensional array, so we have to flatten the arrays before proceeding. NumPy provides the flatten() method for this.
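A short sketch of this step: each image can be flattened individually with flatten(), or the whole stack can be reshaped at once, which is equivalent.

```python
# Flatten each 64x64 image into a single 4096-element feature vector.
# Per-image alternative: np.array([img.flatten() for img in X])
X_flat = X.reshape(len(X), -1)
print(X_flat.shape)  # (num_images, 4096)
```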
Principal Component Analysis and Pre-Processing
Principal Component Analysis (PCA) explains the variance-covariance structure of a set of variables through linear combinations and is often used as a dimensionality-reduction technique. We use it here to reduce the number of dimensions in our data.
We reduce the number of dimensions to 20, as shown below.
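A sketch of the PCA step with scikit-learn, using the 20 components stated above (variable names are illustrative):

```python
# Project each flattened image onto its first 20 principal components.
pca = PCA(n_components=20)
X_pca = pca.fit_transform(X_flat)
print(X_pca.shape)                          # (num_images, 20)
print(pca.explained_variance_ratio_.sum())  # fraction of variance retained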
Now we normalize the data so that the different features take on a similar range of values. For this purpose we use StandardScaler().
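A sketch of the split and scaling step; the 80/20 split and the random seed are assumptions, not taken from the original post:

```python
# Hold out 20% of the data for testing (assumed split), then standardise
# features to zero mean and unit variance; the scaler is fit on training data only.
X_train, X_test, y_train, y_test = train_test_split(
    X_pca, y, test_size=0.2, random_state=42)

scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
```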
Now that the training and testing data are normalized, we can start training different models to classify the hand gestures. The first is Stochastic Gradient Descent, where we use the "log" loss function as a parameter.
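A sketch of the SGD classifier with logistic loss; note that older scikit-learn versions spell the loss "log" while recent versions use "log_loss":

```python
from sklearn.linear_model import SGDClassifier

# Linear classifier trained with stochastic gradient descent and logistic loss.
sgd = SGDClassifier(loss="log_loss", max_iter=1000, random_state=42)
sgd.fit(X_train, y_train)
print("SGD accuracy:", sgd.score(X_test, y_test))
```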
RESULTS
- Stochastic Gradient Descent : 70.3%
- Decision Tree : 95%
- Random Forest : 99.925%
- Logistic Regression : 72.2%
- Gaussian Naive Bayes : 65.6%
- Gradient Descent : 23.6%
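For completeness, the remaining classifiers can be trained and scored on the same features in the same way; the hyperparameters below are illustrative defaults, not necessarily those used to produce the numbers above:

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

# Fit each model on the PCA-reduced, scaled training set and report test accuracy.
models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=42),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Gaussian Naive Bayes": GaussianNB(),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: {model.score(X_test, y_test):.3f}")
```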
CONCLUSION
Based on the results presented above, we can conclude that one of the classifiers is able to classify the gestures with an accuracy of 99.925%. It is based on the Random Forest algorithm.
The accuracy of the model depends on many aspects of our dataset, as well as on the features present in the training data. The dataset was created without any noise, i.e. the gestures are reasonably distinct and the images are clear and have no background. There were also enough samples to make our model robust.
The drawback is that for different problems we would probably need more data to push the model parameters in a better direction. Because of the chaos and noise present in real-world scenarios, we would need noisier data that more closely resembles the real world.
CITATION
T. Mantecón, C.R. del Blanco, F. Jaureguizar, N. García, "Hand Gesture Recognition using Infrared Imagery Provided by Leap Motion Controller", Int. Conf. on Advanced Concepts for Intelligent Vision Systems, ACIVS 2016, Lecce, Italy, pp. 47-57, 24-27 Oct. 2016. (doi: 10.1007/978-3-319-48680-2_5)
This page is contributed by Shanmukha. If you like AI HUB and would like to contribute, you can also write an article and mail it to itsaihub@gmail.com. See your articles appearing on the AI HUB platform and help other AI enthusiasts.
SOURCE CODE
The required source code for this project is available on GitHub.