The Gesture Recognition Project aimed to develop an advanced system for accurately interpreting and recognizing human gestures. Led by Anvar, the team consisted of ML engineers Dmitrii and Albert. The project sought to create a versatile solution applicable in various contexts, emphasizing high accuracy and real-time responsiveness.

KekStroke/gesture_recognizer

 
 


Team members

Anvar Iskhakov
email: [email protected]

Albert Khazipov
email: [email protected]

Dmitrii Naumov
email: [email protected]

Prerequisites

Dependencies

Install all required packages with:

pip install -r requirements.txt

Datasets

Original dataset

Download the dataset from https://www.kaggle.com/datasets/risangbaskoro/wlasl-processed and unzip it into data/raw/dataset

Custom dataset

If you want to create your own dataset, you can record videos for your own classes and put them in data/raw/custom_video_dataset. Videos of each class should be located in a folder named after that class, e.g. videos for the class plane should be stored in data/raw/custom_video_dataset/plane
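The expected layout can be sketched with a short script. The class names below are hypothetical examples (only plane appears in this README), not classes the repo defines:

```python
from pathlib import Path

# Root folder for the custom video dataset, as described above.
root = Path("data/raw/custom_video_dataset")

# Hypothetical gesture classes -- one sub-folder per class name.
for cls in ["plane", "hello", "thanks"]:
    (root / cls).mkdir(parents=True, exist_ok=True)

# Recorded videos then go inside their class folder, e.g.:
# data/raw/custom_video_dataset/plane/plane_001.mp4
print(sorted(p.name for p in root.iterdir()))
```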

Prepare Data

To prepare the WLASL dataset for further pre-processing, run the following command from the repository root:

python src/data/dataset_preprocessing.py 

To prepare the custom dataset for further pre-processing, run the following command from the repository root:

python src/data/custom_dataset_preprocessing.py 

To pre-process the merged dataset for further training, run the following command from the repository root:

python src/data/video_keypoints_extractor.py 
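The extractor turns each video into a sequence of per-frame keypoints; the exact output format is defined in src/data/video_keypoints_extractor.py. Purely as an illustration of the general idea (the shapes and the zero-padding strategy here are assumptions, not the repo's actual ones), variable-length sequences are typically padded or truncated to a fixed length before training:

```python
def pad_sequence(frames, target_len, num_keypoints=2):
    """Pad a list of per-frame keypoint vectors with zero-frames,
    or truncate it, so every sequence has length target_len.
    (Illustrative only -- the repo's real format may differ.)"""
    zero_frame = [0.0] * num_keypoints
    padded = frames[:target_len]
    padded += [zero_frame] * (target_len - len(padded))
    return padded

# Two hypothetical sequences of 2-D keypoints with different lengths.
short = [[0.1, 0.2]]
long = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
print(len(pad_sequence(short, 2)))  # 2 (padded)
print(len(pad_sequence(long, 2)))   # 2 (truncated)
```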

Train model

To train any model on the pre-processed dataset, run the following command from the repository root (example for the simple LSTM model):

python src/models/simple_lstm/train_model.py
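The actual architecture lives in src/models/simple_lstm/. Purely as a sketch of what a "simple LSTM" gesture classifier might look like (the layer sizes and class count below are placeholders, not the repo's real hyperparameters), a minimal PyTorch version:

```python
import torch
import torch.nn as nn

class SimpleLSTM(nn.Module):
    """Illustrative LSTM classifier: keypoint sequences -> class logits.
    Dimensions are placeholders, not the repo's real hyperparameters."""

    def __init__(self, input_size=132, hidden_size=64, num_classes=10):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):            # x: (batch, time, input_size)
        _, (h_n, _) = self.lstm(x)   # h_n: (num_layers, batch, hidden_size)
        return self.fc(h_n[-1])      # logits: (batch, num_classes)

model = SimpleLSTM()
logits = model(torch.randn(4, 30, 132))  # 4 clips, 30 frames each
print(logits.shape)  # torch.Size([4, 10])
```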

Inference

To use the final trained model on your own videos, run the following command from the repository root (example for the simple LSTM model):

python src/models/simple_lstm/predict_model.py --file_path 'path/to/video.mp4'
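Inference for a classifier like this typically ends with an argmax over the per-class scores; a generic sketch of that final step (the label names and scores below are invented, and predict_model.py holds the real logic):

```python
def predict_label(scores, labels):
    """Return the label with the highest score.
    Illustrative only -- see predict_model.py for the real logic."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return labels[best]

labels = ["plane", "hello", "thanks"]   # hypothetical classes
scores = [0.1, 0.7, 0.2]                # hypothetical model output
print(predict_label(scores, labels))    # hello
```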

Miscellaneous

One can try the models in action (live) by running the following command:

python demo.py

One can add some arguments as well. The command with its default arguments is (example for the simple LSTM model):

python demo.py --threshold 0.9 --checkpoint_name 'best.pt' --model_name 'simple_lstm' 
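The flags above suggest a parser along these lines. This is a sketch of equivalent argparse wiring with the defaults shown in the command, not demo.py's actual code:

```python
import argparse

parser = argparse.ArgumentParser(description="Live gesture demo (sketch)")
parser.add_argument("--threshold", type=float, default=0.9,
                    help="minimum confidence before a prediction is shown")
parser.add_argument("--checkpoint_name", default="best.pt",
                    help="checkpoint file to load")
parser.add_argument("--model_name", default="simple_lstm",
                    help="which model architecture to run")

args = parser.parse_args([])  # no CLI args -> the defaults above
print(args.threshold, args.checkpoint_name, args.model_name)
```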
