The AI coding agent that gets better the more you use it.
CozyTerm is an open-source terminal coding assistant with a multi-model architecture, LSP integration, and a self-improving training pipeline called Forge. Every session trains your local models. Your code makes them smarter.
Built from scratch — no third-party agent frameworks.
```sh
curl -fsSL https://cozyterm.com/install.sh | bash
```

Or with Homebrew:

```sh
brew install engindearing/tap/cozyterm
```

The installer handles Bun, clones the repo, and puts `cozy` in your PATH. Then pull some models:

```sh
ollama pull qwen2.5:7b-instruct   # orchestrator (tool calling)
ollama pull llama3.2              # chat
```

```sh
cozy                       # interactive TUI
cozy "fix the login bug"   # one-shot mode
cozy --plan                # read-only analysis (no file changes)
cozy /review               # run a custom command
cozy models                # check model availability
```

The engine runs in the background: it watches repos, runs tasks autonomously, creates PRs for you to review, and sends notifications. Every action feeds training data back to Forge.
```sh
cozy engine install   # install as launchd service
cozy engine start     # start the daemon
cozy engine status    # check if it's running
cozy engine logs      # tail engine output
cozy engine stop      # stop the daemon
```

Each task is routed to the right specialist:
| Role | Default Model | Purpose |
|---|---|---|
| Orchestrator | qwen2.5:7b-instruct | Tool calling and agent loop |
| Coder | engie-coder:latest | Code generation (Forge-trained) |
| Reasoner | glm-4.7-flash | Analysis, planning, debugging |
| Chat | llama3.2 | Conversation, quick answers |
All models run locally via Ollama. Swap any model in `.cozyterm.json`. CozyTerm falls back to the Anthropic API when Ollama is unavailable.
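As a sketch, a per-role model override in `.cozyterm.json` might look like the following. The key names are illustrative assumptions, not the documented schema; check your generated config for the real field names.

```json
{
  "//": "hypothetical shape — field names are illustrative, not the actual schema",
  "models": {
    "orchestrator": "qwen2.5:7b-instruct",
    "coder": "engie-coder:latest",
    "reasoner": "glm-4.7-flash",
    "chat": "llama3.2"
  }
}
```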
The agent has 8 built-in tools:
| Tool | What it does |
|---|---|
| `read` | Read files with line numbers |
| `write` | Create or overwrite files |
| `edit` | Find-and-replace with uniqueness check |
| `bash` | Run shell commands (with safety blocks) |
| `glob` | Find files by pattern |
| `grep` | Search file contents |
| `undo` | Revert file changes |
| `diagnostics` | Check LSP errors after edits |
Plus any tools discovered via MCP servers defined in your config.
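For illustration, an MCP server entry in your config might look like this. Both the key names and the example server are assumptions following common MCP client conventions, not CozyTerm's documented schema.

```json
{
  "//": "hypothetical shape — check CozyTerm's config docs for the real MCP keys",
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```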
Every coding session is captured as training data. Forge fine-tunes your models and deploys better versions automatically.
```sh
cozy forge status     # training stats and collector data
cozy forge train      # run full pipeline: prepare → train → deploy → eval
cozy forge eval       # benchmark against previous versions
cozy forge versions   # model version history
cozy forge rollback   # revert to previous version
```

The pipeline: Collect → Prepare → Train (LoRA) → Deploy (quantize + register in Ollama) → Eval (benchmark).
- LSP integration — auto-detects TypeScript, Python, Go, Rust. Self-corrects after edits.
- Permission system — interactive prompts for risky tools. Allow once, always, or deny forever.
- Custom commands — markdown files in `.cozyterm/commands/`. Run with `/review`, `/test`, etc.
- Session persistence — SQLite-backed conversation history with file edit snapshots.
- MCP support — connect external tools via Model Context Protocol.
- Themes — cozyterm (warm amber), dracula, tokyonight. Switch with `--theme`.
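A custom command is just a markdown prompt file. As a hypothetical example (not shipped with CozyTerm), a `.cozyterm/commands/review.md` could contain:

```markdown
<!-- illustrative example of a custom command file -->
Review the staged changes for bugs, missing error handling, and style issues.
Summarize findings as a bullet list, most severe first.
```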
```
packages/
  cli/       Interactive coding agent TUI
  engine/    Always-on brain (daemon, gateway, memory, channels)
  trainer/   Forge pipeline (collect, prepare, train, deploy, eval)
  shared/    Constants, types, theme
  site/      cozyterm.com
install.sh   One-line installer
```
- Project config: `.cozyterm.json` in your project root.
- Global config: `~/.cozyterm/config.json`.
- Custom commands: `.cozyterm/commands/*.md` (project or global).
MIT