mitralone/cvoptimizer_backend

CV Optimizer Backend

Backend service for an AI-powered CV optimization platform that helps users tailor their resumes and generate cover letters for specific job descriptions using large language models.

Users upload their CV and a job description, and the system analyzes the content, generates an optimized version of the CV, and produces a targeted cover letter while tracking AI usage through a credit system.

This repository demonstrates backend engineering depth across architecture, domain modeling, reliability, security, and operational concerns, not just endpoint wiring.

Highlights

  • Modular NestJS architecture with clear bounded contexts (auth, users, credits, files, optimizations, llm, email)
  • Database-first approach with explicit SQL migrations and TypeORM entities (synchronize: false)
  • Credit-ledger accounting model with idempotency support for safe, auditable balance changes
  • JWT + refresh-token auth flow with OTP-based login
  • File ingestion pipeline for PDF, DOCX, TXT, and MD
  • LLM orchestration layer with provider abstraction (OpenAI integration designed to remain vendor-agnostic)
  • Structured validation, error handling, and configuration layering for production readiness
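The credit-ledger idea above can be sketched without any framework code. This is an illustrative, dependency-free model (class and field names are assumptions, not the repo's actual API): each balance change carries a client-supplied idempotency key, replaying a key returns the original entry instead of double-charging, and the balance is derived from an append-only history.

```typescript
// Minimal sketch of an idempotent, append-only credit ledger.
type LedgerEntry = { id: number; userId: string; delta: number; reason: string };

class CreditLedger {
  private entries: LedgerEntry[] = [];
  private byIdempotencyKey = new Map<string, LedgerEntry>();
  private nextId = 1;

  record(userId: string, delta: number, reason: string, idempotencyKey: string): LedgerEntry {
    const existing = this.byIdempotencyKey.get(idempotencyKey);
    if (existing) return existing; // safe replay: no duplicate charge
    const entry: LedgerEntry = { id: this.nextId++, userId, delta, reason };
    this.entries.push(entry); // append-only: history is never mutated
    this.byIdempotencyKey.set(idempotencyKey, entry);
    return entry;
  }

  balance(userId: string): number {
    // the balance is derived from immutable history, not stored as mutable state
    return this.entries
      .filter((e) => e.userId === userId)
      .reduce((sum, e) => sum + e.delta, 0);
  }
}

const ledger = new CreditLedger();
ledger.record("u1", 10, "signup bonus", "key-1");
ledger.record("u1", -3, "optimization", "key-2");
ledger.record("u1", -3, "optimization", "key-2"); // replayed: ignored
console.log(ledger.balance("u1")); // 7
```

In the real service the same pattern would sit behind a database transaction with a unique constraint on the idempotency key, but the accounting invariant is the same.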

Tech Stack

  • Runtime: Node.js, TypeScript
  • Framework: NestJS 11
  • Database: MySQL + TypeORM
  • Auth: JWT + Passport
  • AI: OpenAI Node SDK
  • Email: Nodemailer
  • Testing: Jest + Supertest

Architecture

The backend follows a layered architecture typical of production NestJS services.

Request flow:

  1. Controller receives request and validates DTO input.
  2. Global JwtAuthGuard enforces authentication by default (with explicit @Public() exceptions).
  3. Service layer executes domain logic.
  4. Repositories/entities persist state changes in MySQL.
  5. Cross-cutting concerns (global exception filter, validation pipe, config service) apply consistently.
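Step 2 above boils down to one rule, sketched here without framework code (the real service implements it via Nest's global guard and @Public() metadata; names here are illustrative): routes are protected by default, and only explicitly public routes skip the JWT check.

```typescript
// Dependency-free sketch of "authenticated by default, @Public() opt-out".
type Route = { path: string; isPublic: boolean };

function isAllowed(route: Route, hasValidJwt: boolean): boolean {
  if (route.isPublic) return true; // @Public() route: no token required
  return hasValidJwt;              // protected route: valid JWT required
}

console.log(isAllowed({ path: "/auth/send-otp", isPublic: true }, false)); // true
console.log(isAllowed({ path: "/users/me", isPublic: false }, false));     // false
```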

Key structural decisions:

  • Module isolation: each domain owns its controller/service/models.
  • Manual migrations: avoid schema drift and accidental mutations.
  • Ledger-based credits: immutable transaction history for traceability.
  • Provider abstraction for LLM: business logic is decoupled from vendor SDK details.
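The provider abstraction can be illustrated with a small interface (interface and class names are assumptions, not the repo's actual API): business logic depends only on the interface, so the OpenAI SDK can be swapped for another vendor, or for a stub in tests.

```typescript
// Sketch of decoupling domain logic from the vendor SDK.
interface LlmProvider {
  complete(prompt: string): Promise<string>;
}

// One implementation would wrap the OpenAI SDK; tests can use a stub instead.
class StubProvider implements LlmProvider {
  async complete(prompt: string): Promise<string> {
    return `stub response for: ${prompt}`;
  }
}

class OptimizationService {
  constructor(private readonly llm: LlmProvider) {}

  async optimize(cvText: string): Promise<string> {
    // domain logic only sees the interface, never the concrete vendor
    return this.llm.complete(`Optimize this CV:\n${cvText}`);
  }
}

new OptimizationService(new StubProvider())
  .optimize("...")
  .then((r) => console.log(r.startsWith("stub response"))); // true
```

In Nest this is typically wired with a custom provider token, so the concrete implementation is chosen in the module, not in the service.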

Domain Model Snapshot

The system is designed around a credit-based AI usage model where LLM operations are metered and auditable.

Core entities include:

  • users
  • otp_codes
  • refresh_tokens
  • optimizations
  • credit_ledger
  • llm_usage
  • emails_sent

This model supports secure auth, metered AI usage, and auditability of user credits.
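As a sketch of why this model is auditable (row shapes below are illustrative assumptions, not the actual schema): per-user AI spend is derived from immutable llm_usage rows rather than a mutable counter, so every total can be traced back to individual operations.

```typescript
// Hypothetical shape of an llm_usage row and a derived per-user total.
interface LlmUsageRow { id: number; userId: string; optimizationId: number; tokens: number }

function totalTokens(rows: LlmUsageRow[], userId: string): number {
  // fold over history; nothing is updated in place
  return rows.filter((r) => r.userId === userId).reduce((s, r) => s + r.tokens, 0);
}

const usage: LlmUsageRow[] = [
  { id: 1, userId: "u1", optimizationId: 7, tokens: 120 },
  { id: 2, userId: "u1", optimizationId: 8, tokens: 80 },
  { id: 3, userId: "u2", optimizationId: 9, tokens: 50 },
];
console.log(totalTokens(usage, "u1")); // 200
```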

API Surface (v1)

Base path: /api/v1

  • POST /auth/send-otp
  • POST /auth/verify-otp
  • POST /auth/refresh
  • POST /auth/logout
  • GET /users/me
  • GET /users/stats
  • GET /credits/balance
  • GET /credits/history
  • POST /credits/add
  • POST /files/upload
  • GET /files/:filename
  • POST /optimizations
  • GET /optimizations
  • GET /optimizations/:id
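A typical authenticated call against one of the endpoints above might look like this (host, port, and response shape are assumptions; this requires a running server and Node 18+ for the global fetch):

```typescript
// Hypothetical client call to the credits endpoint listed above.
async function getBalance(token: string): Promise<unknown> {
  const res = await fetch("http://localhost:3000/api/v1/credits/balance", {
    headers: { Authorization: `Bearer ${token}` }, // JWT from /auth/verify-otp
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```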

Local Setup

1. Install dependencies

npm install

2. Configure environment

cp .env.example .env

Set required values in .env:

  • DB_HOST, DB_PORT, DB_USERNAME, DB_PASSWORD, DB_DATABASE
  • JWT_SECRET
  • EMAIL_HOST, EMAIL_PORT, EMAIL_USER, EMAIL_PASSWORD, EMAIL_FROM
  • OPENAI_API_KEY

3. Run database migrations

mysql -u root -p cvoptimizer < src/database/migrations/001_initial_schema.sql
mysql -u root -p cvoptimizer < src/database/migrations/002_add_input_file_format.sql

4. Start development server

npm run start:dev

Server: http://localhost:3000/api/v1

Quality and Security Notes

  • Global input validation (whitelist, forbidNonWhitelisted, transform)
  • Global exception filtering for predictable API errors
  • Environment-based configuration via Nest ConfigModule
  • .env and uploaded user files are git-ignored
  • Production deployment should use managed secrets and rotated credentials
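A dependency-free sketch of what the whitelist/forbidNonWhitelisted options mean in practice (the real service uses Nest's global ValidationPipe; this helper is illustrative): fields not declared on the DTO cause the request to be rejected outright rather than silently passed through.

```typescript
// Sketch of forbidNonWhitelisted-style DTO checking.
function validateDto(
  input: Record<string, unknown>,
  allowedFields: string[],
): Record<string, unknown> {
  const extra = Object.keys(input).filter((k) => !allowedFields.includes(k));
  if (extra.length > 0) {
    // reject rather than strip: unexpected fields are an error
    throw new Error(`Unexpected fields: ${extra.join(", ")}`);
  }
  return input; // only declared fields ever reach the service layer
}

console.log(Object.keys(validateDto({ email: "a@b.c" }, ["email"]))); // [ 'email' ]
```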

Repository Docs

Project and implementation documents are in Docs/.

What This Repository Demonstrates

  • Designing maintainable backend services with clear modular boundaries
  • Modeling transactional business logic with auditability requirements
  • Integrating LLM-powered features while controlling usage and cost
  • Building production-ready APIs with migrations, validation, and structured error handling

Future Improvements

Possible extensions to this system include:

  • background job processing for long-running optimizations
  • streaming LLM responses
  • vector search for resume/job similarity scoring
  • usage analytics and cost monitoring dashboards
