The Interview Analysis project uses computer vision to analyze recorded video interviews, focusing on emotional expressions and eye movement patterns. It aims to support the recruitment process with data-driven insights into candidates' non-verbal cues during interviews.
- Emotional Detection: Identifies and analyzes candidates' emotional expressions.
- Eye Movement Tracking: Monitors eye movement patterns to assess candidate engagement.
- Data Visualization: Provides visual representations of emotional responses and eye movement data.
- Python 3.x
- OpenCV
- TensorFlow (or another deep-learning library suited to your emotion model)
- NumPy
To set up the environment, run the following command:
pip install opencv-python tensorflow numpy
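After installation, a quick sanity check can confirm the packages import correctly. This sketch uses only the standard library; the module names checked match the pip command above (`cv2` is the import name for `opencv-python`).

```python
# Verify that the required packages are importable in the current environment.
import importlib

REQUIRED = ["cv2", "tensorflow", "numpy"]

def check_packages(names):
    """Return a dict mapping module name -> True if it imports cleanly."""
    status = {}
    for name in names:
        try:
            importlib.import_module(name)
            status[name] = True
        except ImportError:
            status[name] = False
    return status

if __name__ == "__main__":
    for name, ok in check_packages(REQUIRED).items():
        print(f"{name}: {'OK' if ok else 'MISSING'}")
```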
- Clone the repository:
  git clone <repository_url>
  cd <repository_directory>
- Place your interview video files in the project directory, or provide the path to a video in the code.
- Update the video file path variable in the code to point to your interview video.
- Run the script:
  python interview_analysis.py
- The script will process the video and output the analysis results, including detected emotions and eye movement patterns.
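The output step can be sketched as an aggregation of per-frame results. This is an assumption about the script's structure, not its actual code: it supposes the analysis yields one emotion label per processed frame, and the helper name `summarize_emotions` is illustrative.

```python
# Sketch: turn per-frame emotion labels into a percentage summary,
# assuming the analysis produces one label per processed frame.
from collections import Counter

def summarize_emotions(labels):
    """Return each emotion's share of frames as a percentage."""
    counts = Counter(labels)
    total = sum(counts.values())
    if total == 0:
        return {}
    return {emotion: round(100 * n / total, 1)
            for emotion, n in counts.items()}
```

A summary like `{"happy": 75.0, "neutral": 25.0}` is also a convenient input for the data-visualization feature.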
To test the project, use a sample video of an interview. The analysis will provide insights based on the candidate's emotional states and engagement levels.
- Implement additional features, such as sentiment analysis on verbal responses.
- Explore more robust models and techniques to improve the accuracy of emotion detection and eye movement tracking.
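The proposed sentiment-analysis extension could start as simply as a lexicon score over a transcribed answer. This is a toy sketch only: a real implementation would use a proper NLP library, and the word lists below are purely illustrative.

```python
# Toy sketch of the proposed sentiment-analysis feature: score a transcribed
# answer by counting illustrative positive/negative keywords.
POSITIVE = {"confident", "enjoy", "great", "excited", "strong"}
NEGATIVE = {"nervous", "difficult", "worried", "weak", "stressful"}

def sentiment_score(text):
    """Return a score in [-1, 1]: +1 all positive, -1 all negative, 0 neutral."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total
```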