AIJobMate is a prototype application for generating UK-style CVs and cover letters using structured personal data and multiple AI agents orchestrated via CrewAI.
AIJobMate works in two main steps:

1. **Build Profile**
   Enter your professional background as freeform text. The system extracts structured data (experience, skills, education, etc.) and stores it in data/profile.json.
2. **Generate CV & Cover Letter**
   Provide a job description and select which model to use for each agent:
   - 🧑‍💼 CV Writer
   - 📝 Cover Letter Specialist
   - ✅ Quality Assurance Reviewer
Each agent uses a selected model (e.g. llama3.2, llama3.1) and contributes to creating well-structured documents.
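To illustrate how such a workflow fits together, here is a minimal sketch of a CrewAI crew with a different Ollama model per agent. It assumes a recent CrewAI version that accepts LiteLLM-style model strings such as ollama/llama3.2; the roles, goals, and task wording below are illustrative and are not taken from the repository's career_crew.py.

```python
# Minimal sketch of a CrewAI workflow with per-agent Ollama models.
# Assumes CrewAI resolves "ollama/<model>" strings against a local Ollama
# server; agent roles, goals and task descriptions are illustrative only.
from crewai import Agent, Task, Crew

cv_writer = Agent(
    role="CV Writer",
    goal="Draft a UK-style CV tailored to the job description",
    backstory="An experienced recruiter who writes concise, achievement-led CVs.",
    llm="ollama/llama3.2",
)
cover_letter_writer = Agent(
    role="Cover Letter Specialist",
    goal="Write a tailored cover letter based on the profile and job description",
    backstory="A careers adviser who matches candidate strengths to role requirements.",
    llm="ollama/llama3.1",
)
qa_reviewer = Agent(
    role="Quality Assurance Reviewer",
    goal="Review both documents for accuracy, tone and structure",
    backstory="A meticulous editor focused on UK conventions and consistency.",
    llm="ollama/llama3.2",
)

cv_task = Task(
    description="Write a CV using the structured profile and the job description.",
    expected_output="A complete UK-style CV in Markdown.",
    agent=cv_writer,
)
letter_task = Task(
    description="Write a cover letter aligned with the CV and the job description.",
    expected_output="A one-page cover letter.",
    agent=cover_letter_writer,
)
review_task = Task(
    description="Review the CV and cover letter and produce a QA summary.",
    expected_output="A short review with suggested corrections.",
    agent=qa_reviewer,
)

crew = Crew(
    agents=[cv_writer, cover_letter_writer, qa_reviewer],
    tasks=[cv_task, letter_task, review_task],
)
result = crew.kickoff()
print(result)
```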
Key features:

- Structured profile generation from natural language input
- CrewAI-powered workflow with distinct agent roles
- Independent model selection per agent
- Quality review step before final output
- Ready to run locally with Ollama
- Modular codebase for future expansion
 
To run AIJobMate locally, clone the repository and set up a virtual environment:

```bash
git clone https://github.com/loglux/AIJobMate.git
cd AIJobMate
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
pip install -r requirements.txt
```

Start the Ollama server and pull the models you want to use, for example:

```bash
ollama serve
ollama pull llama3.2
```

Then launch the app:

```bash
python gui.py
```

The Gradio interface has two tabs:
**Tab 1: Build Profile**

- Input: your background in plain English — this can be copied from your CV, or written freely with comments and additional details.
- Output: a JSON profile stored in data/profile.json

Example input:

I’ve worked in IT operations for several years, including network monitoring, scripting with Python, and automation using Docker. My most recent role involved collaboration with cross-functional teams. I’d like to highlight my ability to solve complex technical problems and communicate with both technical and non-technical stakeholders.

🛠️ The Profile Builder currently uses the llama3.2 model by default to extract structured JSON data from free-form text input.
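Conceptually, the Profile Builder step boils down to prompting a local model for JSON and writing the result to disk. The sketch below uses the ollama Python package to show the idea; the prompt, the field names, and the file handling are assumptions rather than the project's actual profile_manager.py.

```python
# Illustrative sketch of the Build Profile step: prompt a local Ollama model
# for structured JSON and store it in data/profile.json. Field names and the
# prompt wording are assumptions, not the project's actual schema.
import json
from pathlib import Path

import ollama  # pip install ollama

free_text = "I've worked in IT operations for several years, including ..."

response = ollama.chat(
    model="llama3.2",
    messages=[
        {
            "role": "user",
            "content": (
                "Extract a JSON object with the keys 'experience', 'skills' and "
                "'education' from the following background text:\n\n" + free_text
            ),
        }
    ],
    format="json",  # ask Ollama to constrain the output to valid JSON
)

profile = json.loads(response["message"]["content"])

Path("data").mkdir(exist_ok=True)
Path("data/profile.json").write_text(json.dumps(profile, indent=2), encoding="utf-8")
```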
**Tab 2: Generate CV & Cover Letter**

Input:
- A job description
- Selected models for each agent

Output:
- CV
- Tailored cover letter
- QA-reviewed summary
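A two-tab layout like this maps naturally onto Gradio Blocks. The sketch below shows one possible wiring with per-agent model dropdowns; the component names and the build_profile / generate_documents callbacks are hypothetical placeholders, not the contents of gui.py.

```python
# Minimal sketch of a two-tab Gradio layout with per-agent model selection.
# Callback functions and component names are hypothetical placeholders.
import gradio as gr

MODELS = ["llama3.2", "llama3.1"]  # any models available in your Ollama setup

def build_profile(background_text):
    # Placeholder: would call the profile extraction logic and return JSON.
    return "{}"

def generate_documents(job_description, cv_model, letter_model, qa_model):
    # Placeholder: would run the CrewAI workflow with the selected models.
    return "CV ...", "Cover letter ...", "QA summary ..."

with gr.Blocks(title="AIJobMate") as demo:
    with gr.Tab("Build Profile"):
        background = gr.Textbox(label="Your background", lines=10)
        profile_json = gr.Textbox(label="Extracted profile (JSON)")
        gr.Button("Build profile").click(build_profile, background, profile_json)

    with gr.Tab("Generate CV & Cover Letter"):
        job_desc = gr.Textbox(label="Job description", lines=10)
        cv_model = gr.Dropdown(MODELS, label="CV Writer model", value=MODELS[0])
        letter_model = gr.Dropdown(MODELS, label="Cover Letter Specialist model", value=MODELS[0])
        qa_model = gr.Dropdown(MODELS, label="QA Reviewer model", value=MODELS[0])
        cv_out = gr.Textbox(label="CV")
        letter_out = gr.Textbox(label="Cover letter")
        qa_out = gr.Textbox(label="QA summary")
        gr.Button("Generate").click(
            generate_documents,
            [job_desc, cv_model, letter_model, qa_model],
            [cv_out, letter_out, qa_out],
        )

demo.launch()
```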
 
The project structure is as follows:

```
.
├── gui.py                  # Gradio interface
├── career_crew.py          # CrewAI agent and task logic
├── profile_manager.py      # Handles reading/writing profile JSON
├── llm_engines/
│   └── ollama_client.py    # Low-level client for Ollama models
├── data/
│   └── profile.json        # Generated user profile
├── requirements.txt
└── README.md
```
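For reference, llm_engines/ollama_client.py is described as a low-level client for Ollama models. A minimal client along those lines could talk to Ollama's documented REST endpoint as sketched below; the class and method names are illustrative, not the repository's actual implementation.

```python
# Sketch of a minimal Ollama client using the documented REST API
# (POST /api/generate on http://localhost:11434). Class and method names
# are illustrative, not the project's actual ollama_client.py.
import requests

class OllamaClient:
    def __init__(self, base_url: str = "http://localhost:11434"):
        self.base_url = base_url.rstrip("/")

    def generate(self, model: str, prompt: str) -> str:
        """Send a single prompt to a local model and return the full response text."""
        resp = requests.post(
            f"{self.base_url}/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["response"]

if __name__ == "__main__":
    client = OllamaClient()
    print(client.generate("llama3.2", "Summarise: AIJobMate builds CVs with local LLMs."))
```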
This is a functioning prototype and will continue to evolve. Planned enhancements include:

- Editing specific sections of the profile
- Adding custom categories (e.g. Projects, Publications)
- Exporting to PDF or Word
- Support for additional LLM providers (OpenAI, DeepSeek, Mistral, etc.)
 
You can use any local model available in your Ollama setup, for example:

- llama3.2
- llama3.1
- Any other model pulled via ollama pull

No specific model recommendation is made.
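If you want to check which models are installed locally (for example, to populate the model dropdowns), Ollama exposes a /api/tags endpoint. The snippet below is a small sketch of that query; it is not necessarily how AIJobMate discovers models.

```python
# Sketch: query a local Ollama server for installed models via GET /api/tags.
# Useful for populating model dropdowns; not necessarily what AIJobMate does.
import requests

def list_local_models(base_url: str = "http://localhost:11434") -> list[str]:
    resp = requests.get(f"{base_url}/api/tags", timeout=10)
    resp.raise_for_status()
    return [m["name"] for m in resp.json().get("models", [])]

if __name__ == "__main__":
    print(list_local_models())  # e.g. ['llama3.2:latest', 'llama3.1:latest']
```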
License: MIT

