Run Claude Code CLI for FREE - No paid Anthropic subscription required. Complete setup guide using NVIDIA NIM's free-tier API + LiteLLM proxy.
| Feature | Description |
|---|---|
| 🆓 100% Free | No credit card, no paid subscription |
| ⚡ 15-min Setup | Quick installation guide |
| 🤖 3 Model Slots | Sonnet, Opus, Haiku alternatives |
| 🔒 Secure | Your API keys stay local |
| 📚 Complete Docs | Setup, troubleshooting, tips |
Visit the live documentation site (after deployment):
- 🏠 Homepage - Overview & architecture
- ⚙️ Setup Guide - Step-by-step installation
- 🤖 Model Comparison - Choose your AI model
- 🔧 Troubleshooting - Fix common issues
- 📚 Daily Usage - Workflow commands
- 💡 Power Tips - Pro productivity tricks
- ❓ FAQ - Common questions
```bash
# 1. Install Claude Code CLI (macOS/Linux/WSL)
curl -fsSL https://claude.ai/install.sh | bash
# Windows PowerShell: irm https://claude.ai/install.ps1 | iex
# Windows CMD: curl -fsSL https://claude.ai/install.cmd -o install.cmd && install.cmd && del install.cmd

# 2. Get a free NVIDIA API key
# Visit https://build.nvidia.com and create an account

# 3. Run LiteLLM in Docker (from your config.yaml directory)
docker run -d --name litellm-nim --restart unless-stopped \
  -p 4001:4000 \
  -e NVIDIA_NIM_API_KEY="nvapi-YOUR_KEY" \
  -v "$(pwd)/config.yaml:/app/config.yaml" \
  docker.litellm.ai/berriai/litellm:main-stable \
  --config /app/config.yaml
# If the container already exists: docker rm -f litellm-nim

# 4. Launch Claude Code through LiteLLM
ANTHROPIC_BASE_URL="http://localhost:4001" \
ANTHROPIC_API_KEY="sk-litellm-local" \
ANTHROPIC_MODEL="claude-sonnet-4-6" \
claude
```

📝 Full guide: See Setup Documentation
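The Docker command above mounts a `config.yaml` that tells LiteLLM which NIM model to serve under each Claude model name. A minimal sketch of what that file can look like — the NIM model IDs below are illustrative, not prescribed by this guide; substitute any models listed on build.nvidia.com:

```yaml
# config.yaml — illustrative LiteLLM proxy config; model IDs are examples
model_list:
  - model_name: claude-sonnet-4-6            # the name Claude Code requests
    litellm_params:
      model: nvidia_nim/meta/llama-3.1-70b-instruct
      api_key: os.environ/NVIDIA_NIM_API_KEY # read from the container's env
  - model_name: claude-haiku-4-5
    litellm_params:
      model: nvidia_nim/mistralai/mistral-7b-instruct-v0.3
      api_key: os.environ/NVIDIA_NIM_API_KEY
```

The `nvidia_nim/` prefix routes the request through LiteLLM's NVIDIA NIM provider, and `os.environ/...` pulls the key from the environment rather than hard-coding it in the file.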
```
┌──────────────┐    ┌──────────────┐    ┌──────────────┐
│ Claude Code  │───▶│  LiteLLM     │───▶│  NVIDIA NIM  │
│   CLI        │    │  Proxy       │    │  API         │
└──────────────┘    └──────────────┘    └──────────────┘
```
- Claude Code sends requests in Anthropic format
- LiteLLM translates to OpenAI format
- NVIDIA NIM processes with free AI models (Gemma, Nemotron, Mistral)
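The translation step in the middle can be sketched as a simple payload rewrite. This is a deliberately simplified illustration of the idea, not LiteLLM's actual implementation (which also handles streaming, tool use, and error mapping):

```python
# Simplified sketch of the Anthropic -> OpenAI translation a proxy performs.
def anthropic_to_openai(body: dict) -> dict:
    """Convert an Anthropic /v1/messages payload to OpenAI chat-completions shape."""
    messages = []
    if "system" in body:
        # Anthropic keeps the system prompt top-level; OpenAI puts it in messages
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(body.get("messages", []))
    return {
        "model": body["model"],
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }

req = {
    "model": "claude-sonnet-4-6",
    "system": "You are helpful.",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
}
print(anthropic_to_openai(req))
```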
```
claude-code-free/
├── src/
│   ├── app/
│   │   ├── page.tsx           # Homepage
│   │   ├── setup/             # Step-by-step guide
│   │   ├── models/            # Model comparison
│   │   ├── troubleshooting/   # Error fixes
│   │   ├── daily-usage/       # Workflow commands
│   │   ├── tips/              # Power user tips
│   │   ├── faq/               # FAQs
│   │   ├── sitemap.ts         # SEO sitemap
│   │   └── robots.ts          # SEO robots.txt
│   ├── components/            # Reusable UI components
│   └── content/               # Site content data
├── public/                    # Static assets
├── vercel.json                # Vercel deployment config
└── README.md                  # This file
```
```bash
# Clone the repository
git clone https://github.com/AAYUSH412/claude-code-free
cd claude-code-free

# Install dependencies
npm install

# Run development server
npm run dev

# Build for production
npm run build

# Deploy to Vercel
vercel --prod
```

- Typography: Cormorant Garamond + Instrument Sans
- Grain Overlay: Subtle texture effect
- Minimal Aesthetic: Clean, distraction-free reading
- Dark Mode Ready: Developer-friendly theme
- Smooth Animations: Framer Motion transitions
| Component | Role |
|---|---|
| NVIDIA NIM | Free AI inference API (40 req/min, no credit card) |
| LiteLLM | Translates Anthropic → OpenAI API format |
| Docker | Runs proxy locally, keys stay on your machine |
| Claude Code | Works normally with env variable routing |
- ✅ Your API key stays local (never sent to third parties)
- ✅ Use environment variables (`.zshrc`, Docker env)
- ✅ Never commit keys to GitHub
- ✅ Regenerate immediately if exposed

❌ Never:
- Paste `nvapi-...` keys in public chats
- Commit `.env` or config files with keys
- Share screenshots with API keys visible
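For example, the key can live in your shell profile instead of any tracked file — a sketch, using the same variable name as the Docker command above:

```bash
# ~/.zshrc (or ~/.bashrc) — keeps the key out of version control
export NVIDIA_NIM_API_KEY="nvapi-YOUR_KEY"
```

After reloading the shell, the Docker command can pass it through with `-e NVIDIA_NIM_API_KEY` instead of pasting the key inline.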
Found a bug or want to improve the docs? Contributions welcome!
- Fork the repository
- Create your feature branch (`git checkout -b feature/awesome-feature`)
- Commit your changes (`git commit -m 'Add awesome feature'`)
- Push to the branch (`git push origin feature/awesome-feature`)
- Open a Pull Request
- Author: Aayush Vaghela
- GitHub: @AAYUSH412
- Issues: GitHub Issues
MIT License - feel free to use, modify, and distribute.
If this project helped you, consider giving it a star on GitHub! It helps others discover this free resource.
Built with ❤️ by Aayush Vaghela
Made for the developer community - free and open source 🚀