Control your computer using hand gestures captured by your webcam! This project uses machine learning to recognize hand gestures and translate them into mouse movements, clicks, scrolling, and more.
- Real-time hand tracking using MediaPipe
- Gesture recognition with a trained neural network
- Two control modes:
  - Absolute Mode (Desktop): Direct cursor positioning for general desktop use
  - Relative Mode (Gaming): Raw mouse input for in-game camera control (e.g., Minecraft)
- Cross-platform support: Works on Windows, macOS, and Linux
- Resizable UI overlay window with live feedback showing hand landmarks and the detected gesture
- Global hotkeys to switch control modes and close the program
- ✊ Fist: Stop all actions / neutral state
- ☝️ Point (index finger): Hold left mouse button
- ✌️ Peace (two fingers): Hold right mouse button
- 👍 Thumbs Up: Scroll up
- 👎 Thumbs Down: Scroll down
- 🖐️ Five (open hand): Double left-click
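Internally, each recognized gesture label is dispatched to a computer action. Below is a minimal sketch of such a dispatch table using PyAutoGUI; the labels and function names are illustrative, not the project's actual identifiers in `computer_controller.py`:

```python
import pyautogui

def release_all():
    """Return to a neutral state: release both mouse buttons."""
    pyautogui.mouseUp(button="left")
    pyautogui.mouseUp(button="right")

# Hypothetical gesture-to-action table; the real mapping may differ.
GESTURE_ACTIONS = {
    "fist":        release_all,
    "point":       lambda: pyautogui.mouseDown(button="left"),
    "peace":       lambda: pyautogui.mouseDown(button="right"),
    "thumbs_up":   lambda: pyautogui.scroll(1),    # one notch up
    "thumbs_down": lambda: pyautogui.scroll(-1),   # one notch down
    "five":        pyautogui.doubleClick,
}

def handle_gesture(label: str) -> None:
    action = GESTURE_ACTIONS.get(label)
    if action is not None:
        action()
```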
- Python 3.12 (required for MediaPipe compatibility)
- Webcam
- Clone this repository:

  ```bash
  git clone https://github.com/nathancbao/Hand-Computer-Control-Interface.git
  cd Hand-Computer-Control-Interface
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

  If you encounter issues with the `keyboard` library requiring admin privileges, run your terminal/command prompt as administrator.

- Run the application:

  ```bash
  python main.py
  ```

- Alt+R: Toggle between Absolute (Desktop) and Relative (Gaming) mode
- Alt+Q: Quit the application
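Global hotkeys like these are the kind of binding the `keyboard` library's `add_hotkey` provides. A minimal standalone sketch (the handler and mode flag are illustrative, not the project's code):

```python
import keyboard  # global hotkeys; may need admin/root privileges (see Troubleshooting)

relative_mode = False

def toggle_mode():
    global relative_mode
    relative_mode = not relative_mode
    print("Relative mode:", relative_mode)

keyboard.add_hotkey("alt+r", toggle_mode)
keyboard.wait("alt+q")  # block here until Alt+Q is pressed, then fall through
print("Quitting")
```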
- Position your hand near the center of the camera frame, neither too close nor too far, for the most reliable tracking
- Adjust window size/position: the window is resizable and movable - make it smaller and tuck it into a corner so you can see it while working
- Gaming mode: Switch to Relative mode (Alt+R) for better camera control in games
Edit `computer_controller.py` to customize:
Relative Mode (Gaming):
- `SENSITIVITY` (default: 200): Higher = faster camera movement
- `DEADZONE` (default: 0.05): Higher = less jittery but less responsive
- `SMOOTHING` (default: 0.3): Lower = smoother, higher = more responsive
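To make these parameters concrete, here is one plausible way they could combine in relative mode. This is a sketch of the math only, not the exact code in `computer_controller.py`:

```python
# Assumed relative-mode pipeline: deadzone -> exponential smoothing -> scaling.
SENSITIVITY = 200
DEADZONE = 0.05
SMOOTHING = 0.3

smoothed_dx, smoothed_dy = 0.0, 0.0

def relative_delta(dx: float, dy: float) -> tuple[int, int]:
    """dx, dy: normalized hand-position deltas in [-1, 1] between frames."""
    global smoothed_dx, smoothed_dy
    # Ignore tiny jitters below the deadzone (higher = steadier, less responsive).
    if abs(dx) < DEADZONE:
        dx = 0.0
    if abs(dy) < DEADZONE:
        dy = 0.0
    # Exponential smoothing: higher SMOOTHING weights the newest sample more.
    smoothed_dx = SMOOTHING * dx + (1 - SMOOTHING) * smoothed_dx
    smoothed_dy = SMOOTHING * dy + (1 - SMOOTHING) * smoothed_dy
    # Scale to raw mouse counts.
    return int(smoothed_dx * SENSITIVITY), int(smoothed_dy * SENSITIVITY)
```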
Absolute Mode (Desktop):
- `EXPANSION_FACTOR` (default: 1.5): Higher = less hand movement needed to reach screen edges
- `SMOOTHING` (default: 0.25): Lower = smoother cursor movement
- `DEADZONE` (default: 2): Minimum pixel movement to register
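Similarly, a sketch of how absolute mode could map normalized hand coordinates to screen pixels with these parameters (an assumed formula, not necessarily the project's exact one):

```python
import pyautogui

# Parameter names mirror the README; the formula is illustrative.
EXPANSION_FACTOR = 1.5
SMOOTHING = 0.25
DEADZONE = 2  # minimum pixel movement to register

screen_w, screen_h = pyautogui.size()
prev_x, prev_y = screen_w / 2, screen_h / 2

def move_cursor(hand_x: float, hand_y: float) -> None:
    """hand_x, hand_y: normalized landmark coordinates in [0, 1]."""
    global prev_x, prev_y
    # Expand around the frame center so the hand needn't reach the frame edges.
    target_x = (0.5 + (hand_x - 0.5) * EXPANSION_FACTOR) * screen_w
    target_y = (0.5 + (hand_y - 0.5) * EXPANSION_FACTOR) * screen_h
    # Clamp to the screen, then smooth toward the target (lower = smoother).
    target_x = min(max(target_x, 0), screen_w - 1)
    target_y = min(max(target_y, 0), screen_h - 1)
    new_x = prev_x + SMOOTHING * (target_x - prev_x)
    new_y = prev_y + SMOOTHING * (target_y - prev_y)
    # Ignore sub-deadzone movement to avoid cursor jitter.
    if abs(new_x - prev_x) >= DEADZONE or abs(new_y - prev_y) >= DEADZONE:
        pyautogui.moveTo(int(new_x), int(new_y))
        prev_x, prev_y = new_x, new_y
```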
The project includes a pre-trained gesture classifier. To train your own:
- Navigate to the model directory and open `train.ipynb`
- Collect gesture data using the data collection mode
- Train the model with your custom gestures
- Replace `keypoint_model.pth` with your trained model
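Swapping in your own weights is standard PyTorch model loading. A sketch, assuming the classifier class in `model/keypoint_classifier.py` is named `KeypointClassifier` (the actual class name may differ):

```python
import torch

from model.keypoint_classifier import KeypointClassifier  # class name assumed

model = KeypointClassifier()
state = torch.load("model/keypoint_model.pth", map_location="cpu")
model.load_state_dict(state)
model.eval()  # inference mode: disables dropout/batch-norm updates
```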
```
Hand-Computer-Control-Interface/
├── main.py                   # Main application entry point
├── hand_tracking.py          # Hand detection and landmark extraction
├── gestures.py               # Gesture definitions
├── computer_controller.py    # Mouse and keyboard control logic
├── requirements.txt          # Python dependencies
├── model/
│   ├── keypoint_classifier.py  # Neural network classifier
│   ├── keypoint_model.pth      # Trained model weights
│   ├── train.ipynb             # Training notebook
│   └── dataset/                # Training data
└── README.md
```
Window spawns too small or in the wrong position:
- Adjust window size/position in `main.py` (lines with `cv.resizeWindow` and `cv.moveWindow`)
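For reference, an OpenCV window must be created with the `WINDOW_NORMAL` flag before `cv.resizeWindow` and `cv.moveWindow` take effect; the window title below is illustrative:

```python
import cv2 as cv

# A window created with WINDOW_NORMAL can be resized and moved.
cv.namedWindow("Hand Control", cv.WINDOW_NORMAL)  # window title is illustrative
cv.resizeWindow("Hand Control", 320, 240)  # width, height in pixels
cv.moveWindow("Hand Control", 20, 20)      # x, y of the top-left corner
```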
Mouse movement doesn't work in games:
- Make sure you're in Relative mode (press Alt+R)
- On Windows, install `pywin32` for better compatibility
- Some games with anti-cheat may block input automation
Global hotkeys don't work:
- The `keyboard` library may require administrator/root privileges
- Try running the script with elevated permissions
Camera not detected:
- Check if another application is using the webcam
- Try changing the camera index in `main.py`: `cv.VideoCapture(0)` → `cv.VideoCapture(1)`
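A quick way to find a working index is to probe the first few devices:

```python
import cv2 as cv

# Probe the first few camera indices and report which ones open.
for index in range(3):
    cap = cv.VideoCapture(index)
    print(f"Index {index}: {'available' if cap.isOpened() else 'not found'}")
    cap.release()
```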
This project is open source and available for educational purposes.
- Hand tracking powered by MediaPipe
- Built with OpenCV, PyTorch, and PyAutoGUI

