Boredooms/Emotion-Detection

🎭 Emotion Detection Module 🎭

Welcome to the Emotion Detection module of MockPitch 2! This tool uses your webcam to see your face and guess what you're feeling. Are you happy, sad, surprised, or something else? Let's find out!

🤔 How It Works

This module uses a powerful deep learning model that has been trained on thousands of images of faces. It analyzes your facial expressions in real-time to predict one of seven emotions:

  • 😠 Angry
  • 🤢 Disgust
  • 😨 Fear
  • 😄 Happy
  • 😐 Neutral
  • 😢 Sad
  • 😲 Surprise
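At its core, per-frame classification boils down to taking the model's seven-way probability output and picking the most likely label. Here is a minimal, hypothetical sketch of that last step — the label ordering below is an assumption (the real model's class order may differ, so check it before relying on this mapping):

```python
# Hypothetical sketch of mapping a 7-way classifier output to an emotion label.
# NOTE: the label order is an assumption, not taken from the actual model.
EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Neutral", "Sad", "Surprise"]

def predict_label(probabilities):
    """Return the emotion whose predicted probability is highest."""
    if len(probabilities) != len(EMOTIONS):
        raise ValueError("expected one probability per emotion class")
    best = max(range(len(EMOTIONS)), key=lambda i: probabilities[i])
    return EMOTIONS[best]

# Example: a distribution that peaks on the fourth class.
print(predict_label([0.05, 0.02, 0.03, 0.70, 0.10, 0.05, 0.05]))  # → Happy
```

The real script does this once per webcam frame, after cropping and preprocessing the detected face.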

🚀 Get Started

Ready to see your emotions come to life? Here's how to get this module running.

1. Open Your Terminal

Navigate to this module's directory in your terminal (adjust the path to wherever you cloned the repository):

cd "c:\College Assignment\MockPitch 2\Emotion"

2. Create the Virtual Environment

We need to create a special, isolated space for this project's Python packages.

python -m venv .venv

3. Activate the Environment

Now, let's step into that isolated space.

# On Windows
.venv\Scripts\activate

# On macOS / Linux
source .venv/bin/activate

4. Install the Goodies

Time to install the necessary libraries. This module keeps its dependencies in its own requirements2.txt file.

pip install -r requirements2.txt

5. Run the Magic! 🪄

Let's see those emotions! Run the test_model.py script.

python test_model.py

A window will pop up with the webcam feed, and the console will show the detected emotion. The script also logs the output to a CSV file.

📊 Logging

This module logs the detected emotions to a file named emotion_output.csv. Each time you run the script, a new log file is created. The CSV file contains the following columns:

  • timestamp: The Unix timestamp of the log entry.
  • emotion: The detected emotion at that timestamp.
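The log format described above can be sketched with Python's standard `csv` module. This is an illustrative reconstruction, not the actual code from `test_model.py` — only the column names and the fresh-file-per-run behaviour come from the README:

```python
import csv
import time

def start_log(path):
    """Create a fresh log file with the header row (one per run,
    matching the README's 'a new log file is created' behaviour)."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerow(["timestamp", "emotion"])

def log_emotion(path, emotion):
    """Append one (Unix timestamp, emotion) row to the log."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([int(time.time()), emotion])

start_log("emotion_output.csv")
log_emotion("emotion_output.csv", "Happy")
```

Each run truncates the file and rewrites the header, so only the latest session's emotions are kept.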
