Estimate alphabet hand gestures using MediaPipe and the KNN algorithm. This repository contains:
- Dataset from the link, already converted into landmarks.
- Dataset generator (generates a landmark dataset from images).
- Notebook for alphabet recognition.
Requirements:
- Anaconda (link)
- MediaPipe 0.8.1
- OpenCV 3.4.2 or later
- scikit-learn 0.23.2 or later
- matplotlib 3.3.2 or later
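For a quick sanity check that your environment matches the versions above, a small snippet like the following can be used (not part of the repository):

```python
# Print the installed versions of the required packages and compare with the list above.
import cv2
import matplotlib
import mediapipe
import sklearn

print("mediapipe   :", mediapipe.__version__)
print("opencv      :", cv2.__version__)
print("scikit-learn:", sklearn.__version__)
print("matplotlib  :", matplotlib.__version__)
```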
Directory structure:

```
│   Generate Dataset.ipynb
│   readme.md
│   Visual Programming Project.ipynb
│
├───.ipynb_checkpoints
│       Generate Dataset-checkpoint.ipynb
│       Generate Test-checkpoint.ipynb
│       Visual Programming Project-checkpoint.ipynb
│
├───archive
│   └───asl_alphabet_train
└───Dataset
        hand_dataset.csv
        hand_dataset_100.csv
        hand_dataset_1000_24.csv
        hand_dataset_1000_Z.csv
        hand_dataset_100_24_Z.csv
```
- `Generate Dataset.ipynb`: generates the landmark dataset (CSV) from images, if you want to create your own landmark dataset.
- `Visual Programming Project.ipynb`: trains the model and recognizes alphabet gestures.
- `archive`: unzipped folder of the image dataset. Make sure the child directory matches the structure above if you want to use the dataset generator right away.
- `hand_dataset_x_24` contains the 24-letter alphabet dataset (letters with moving gestures are excluded).
- `hand_dataset_x_24_Z` contains the 24-letter alphabet dataset with Z landmarks (check here for the Z landmark).
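If you only want to inspect one of these CSV files, a minimal sketch is below. It assumes the file has no header row, the letter label is in the first column, and the flattened landmark coordinates follow; adjust the slicing if the actual layout differs.

```python
# Load a landmark CSV (file name taken from the Dataset folder above) and inspect it.
import pandas as pd

# header=None assumes the file has no header row.
data = pd.read_csv("Dataset/hand_dataset_1000_24.csv", header=None)
labels = data.iloc[:, 0]      # assumed: first column holds the letter label
features = data.iloc[:, 1:]   # assumed: remaining columns hold landmark x/y/z values
print(features.shape)
print(labels.value_counts())
```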
Make sure you have the same folder structure as mentioned above and the same dataset; after that you can run `Generate Dataset.ipynb`. You might need to modify `Generate Dataset.ipynb` if you are using your own dataset.
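A minimal sketch of how such a generator can turn the image folders into a landmark CSV. It assumes one sub-folder per letter under `archive/asl_alphabet_train` and writes one row per image as the label followed by 63 landmark values (21 landmarks × x/y/z); the actual notebook may use different paths or a different row layout.

```python
# Convert an image dataset into a landmark CSV using MediaPipe Hands.
import csv
import os

import cv2
import mediapipe as mp

DATASET_DIR = "archive/asl_alphabet_train"   # assumed: one sub-folder per letter
OUTPUT_CSV = "Dataset/hand_dataset.csv"

mp_hands = mp.solutions.hands
with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands, \
     open(OUTPUT_CSV, "w", newline="") as f:
    writer = csv.writer(f)
    for label in sorted(os.listdir(DATASET_DIR)):
        letter_dir = os.path.join(DATASET_DIR, label)
        if not os.path.isdir(letter_dir):
            continue
        for name in os.listdir(letter_dir):
            image = cv2.imread(os.path.join(letter_dir, name))
            if image is None:
                continue
            # MediaPipe expects RGB input; OpenCV loads BGR.
            results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
            if not results.multi_hand_landmarks:
                continue  # skip images where no hand was detected
            lm = results.multi_hand_landmarks[0].landmark
            row = [label] + [v for p in lm for v in (p.x, p.y, p.z)]
            writer.writerow(row)
```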
`Visual Programming Project.ipynb` goes through the following steps (a condensed sketch follows this list):
- Importing libraries and defining the dataset.
- Creating train and test data.
- Creating the classifier model for alphabet recognition.
- Calculating model accuracy.
- Plotting a graph for tuning the number of neighbors (n_neighbors).
- Initializing MediaPipe Hands for alphabet recognition.
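As referenced above, here is a condensed sketch of those steps in one script. It is not the notebook itself; the CSV layout (label in the first column, 63 landmark values after it), the use of raw landmark coordinates, and the hyper-parameters are all assumptions.

```python
# Condensed sketch: load landmarks, train/evaluate a KNN, tune n_neighbors,
# and classify the landmarks of a single webcam frame with MediaPipe Hands.
import cv2
import matplotlib.pyplot as plt
import mediapipe as mp
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load the landmark dataset (file name taken from the Dataset folder above).
data = pd.read_csv("Dataset/hand_dataset_1000_24.csv", header=None)
X = data.iloc[:, 1:].values   # assumed: landmark coordinates
y = data.iloc[:, 0].values    # assumed: letter labels

# Create train and test data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Create the KNN classifier and calculate its accuracy on the test split.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print("Test accuracy:", knn.score(X_test, y_test))

# Plot accuracy against n_neighbors to pick a good value.
ks = list(range(1, 16))
scores = [KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train).score(X_test, y_test)
          for k in ks]
plt.plot(ks, scores, marker="o")
plt.xlabel("n_neighbors")
plt.ylabel("accuracy")
plt.show()

# Initialize MediaPipe Hands and classify one webcam frame.
with mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1) as hands:
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if ok:
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            row = np.array([[v for p in lm for v in (p.x, p.y, p.z)]])
            print("Predicted letter:", knn.predict(row)[0])
```

For live recognition, the last block would run inside a loop over captured frames; this sketch classifies a single frame only to keep the example short.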