Cortex CLI

Cortex CLI Screenshot

Note: Cortex CLI is a fork of Google's Gemini CLI with additional Ollama support and enhanced features.

Cortex CLI is an open-source AI agent that brings the power of multiple AI providers directly into your terminal. It provides lightweight access to Gemini, Ollama, and other AI providers, giving you the most direct path from your prompt to various AI models.

🚀 Why Cortex CLI?

  • 🎯 Multiple AI Providers: Support for Gemini, Ollama, and more
  • 🧠 Powerful Models: Access to Gemini 2.5 Pro, Llama, CodeLlama, and local models via Ollama
  • 🔧 Built-in tools: Google Search grounding, file operations, shell commands, web fetching
  • 🔌 Extensible: MCP (Model Context Protocol) support for custom integrations
  • 💻 Terminal-first: Designed for developers who live in the command line
  • 🛡️ Open source: Apache 2.0 licensed (forked from Google's Gemini CLI)

📦 Installation

Quick Install

Run instantly with npx

# Using npx (no installation required)
npx https://github.com/MarkCodering/cortex-cli

Install globally with npm

npm install -g @markcodering/cortex-cli

Install globally with Homebrew (macOS/Linux)

# Note: Homebrew formula coming soon
# brew install cortex-cli

System Requirements

  • Node.js version 20 or higher
  • macOS, Linux, or Windows

Release Cadence and Tags

See Releases for more details.

Preview

New preview releases are published each Tuesday at 23:59 UTC. These releases have not been fully vetted and may contain regressions or other outstanding issues. Please help us test by installing with the preview tag.

npm install -g @google/gemini-cli@preview

Stable

New stable releases are published each Tuesday at 20:00 UTC. A stable release is the promotion of the previous week's preview release, plus any bug fixes and validations. Install with the latest tag.

npm install -g @google/gemini-cli@latest

Nightly

New nightly releases are published each day at 00:00 UTC. They contain all changes from the main branch as of the time of release, so assume there are pending validations and outstanding issues. Install with the nightly tag.

npm install -g @google/gemini-cli@nightly

📋 Key Features

Code Understanding & Generation

  • Query and edit large codebases
  • Generate new apps from PDFs, images, or sketches using multimodal capabilities
  • Debug issues and troubleshoot with natural language

Automation & Integration

  • Automate operational tasks like querying pull requests or handling complex rebases
  • Use MCP servers to connect new capabilities, including media generation with Imagen, Veo or Lyria
  • Run non-interactively in scripts for workflow automation

Advanced Capabilities

  • Ground your queries with built-in Google Search for real-time information
  • Conversation checkpointing to save and resume complex sessions
  • Custom context files (GEMINI.md) to tailor behavior for your projects (see the sketch below)
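
A project-level context file is plain Markdown that the CLI loads as extra instructions. The contents below are purely illustrative (a hedged sketch, not a required format):

# Create a GEMINI.md at the project root; the instructions inside are free-form
cat > GEMINI.md <<'EOF'
# Project context
- This is a TypeScript monorepo; prefer `npm run build` over ad-hoc tsc invocations.
- Keep answers concise and include file paths in any suggested changes.
EOF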

GitHub Integration

Integrate Gemini CLI directly into your GitHub workflows with the Gemini CLI GitHub Action:

  • Pull Request Reviews: Automated code review with contextual feedback and suggestions
  • Issue Triage: Automated labeling and prioritization of GitHub issues based on content analysis
  • On-demand Assistance: Mention @gemini-cli in issues and pull requests for help with debugging, explanations, or task delegation
  • Custom Workflows: Build automated, scheduled and on-demand workflows tailored to your team's needs

🔐 Authentication Options

Choose the authentication method that best fits your needs:

Option 1: OAuth login (Using your Google Account)

✨ Best for: Individual developers as well as anyone who has a Gemini Code Assist License.

Benefits:

  • Free tier: 60 requests/min and 1,000 requests/day
  • Gemini 2.5 Pro with 1M token context window
  • No API key management - just sign in with your Google account
  • Automatic updates to latest models

Start Cortex CLI, then choose OAuth and follow the browser authentication flow when prompted:

cortex

If you are using a paid Code Assist License from your organization, remember to set the Google Cloud Project:

# Set your Google Cloud Project
export GOOGLE_CLOUD_PROJECT="YOUR_PROJECT_NAME"
cortex

Option 2: Gemini API Key

✨ Best for: Developers who need specific model control or paid tier access

Benefits:

  • Free tier: 100 requests/day with Gemini 2.5 Pro
  • Model selection: Choose specific Gemini models
  • Usage-based billing: Upgrade for higher limits when needed

# Get your key from https://aistudio.google.com/apikey
export GEMINI_API_KEY="YOUR_API_KEY"
cortex

Option 3: Vertex AI

✨ Best for: Enterprise teams and production workloads

Benefits:

  • Enterprise features: Advanced security and compliance
  • Scalable: Higher rate limits with billing account
  • Integration: Works with existing Google Cloud infrastructure

# Get your key from Google Cloud Console
export GOOGLE_API_KEY="YOUR_API_KEY"
export GOOGLE_GENAI_USE_VERTEXAI=true
cortex

Option 4: Ollama (Local AI)

✨ Best for: Privacy-focused developers, local development, and custom models

Benefits:

  • Complete privacy: All processing happens locally
  • No API costs: Free to use with your own hardware
  • Custom models: Use any model supported by Ollama
  • Offline capable: Works without internet connection

# Install Ollama first: https://ollama.ai/
# Then pull a model (e.g., llama2 or codellama)
ollama pull llama2

# Start Ollama server
ollama serve

# Use with Cortex CLI
export CORTEX_AUTH_TYPE=ollama
export OLLAMA_BASE_URL=http://localhost:11434  # optional
cortex -m llama2
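
If the CLI cannot reach your local models, first confirm that the Ollama server is responding. The tags endpoint below is part of Ollama's standard HTTP API and lists the models you have pulled:

# Quick health check: list locally available Ollama models
curl http://localhost:11434/api/tags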

For Google Workspace accounts and other authentication methods, see the authentication guide.

🚀 Getting Started

Basic Usage

Start in current directory

cortex

Include multiple directories

cortex --include-directories ../lib,../docs

Use specific model

cortex -m gemini-2.5-flash
# Or use Ollama model
cortex -m llama2

Non-interactive mode for scripts

cortex -p "Explain the architecture of this codebase"
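
Because -p prints a single response and then exits, the CLI composes with ordinary shell pipelines. A hedged sketch (the file names and prompt are illustrative, and piping stdin as extra context is assumed to behave as in upstream Gemini CLI):

# Turn the latest changes into a release-notes draft
git diff HEAD~1 | cortex -p "Summarize these changes as release notes" > RELEASE_NOTES.md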

Quick Examples

Start a new project

cd new-project/
cortex
> Write me a Discord bot that answers questions using a FAQ.md file I will provide

Analyze existing code

git clone https://github.com/MarkCodering/cortex-cli
cd cortex-cli
cortex
> Give me a summary of all of the changes that went in yesterday

Using Ollama for local AI

# Make sure Ollama is running locally
ollama serve

# In another terminal, use Cortex CLI with Ollama
export CORTEX_AUTH_TYPE=ollama
export OLLAMA_BASE_URL=http://localhost:11434  # optional, this is the default
cortex -m llama2
> Help me refactor this function to be more efficient

📚 Documentation

Getting Started

Core Features

Tools & Extensions

Advanced Topics

Configuration & Customization

Troubleshooting & Support

  • Troubleshooting Guide - Common issues and solutions
  • FAQ - Quick answers
  • Use the /bug command to report issues directly from the CLI

Using MCP Servers

Configure MCP servers in ~/.gemini/settings.json to extend Cortex CLI with custom tools:

> @github List my open pull requests
> @slack Send a summary of today's commits to #dev channel
> @database Run a query to find inactive users
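
Each of those servers is declared in ~/.gemini/settings.json. A hedged sketch of a single entry, assuming the mcpServers schema used by upstream Gemini CLI (the server package and token below are placeholders; this command overwrites an existing settings file, so merge by hand if you already have one):

# Register a GitHub MCP server; replace the token with your own
cat > ~/.gemini/settings.json <<'EOF'
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "YOUR_TOKEN" }
    }
  }
}
EOF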

See the MCP Server Integration guide for setup instructions.

🤝 Contributing

We welcome contributions! Cortex CLI is fully open source (Apache 2.0), forked from Google's Gemini CLI, and we encourage the community to:

  • Report bugs and suggest features
  • Improve documentation
  • Submit code improvements
  • Share your MCP servers and extensions
  • Add support for additional AI providers

See our Contributing Guide for development setup, coding standards, and how to submit pull requests.

📖 Resources

Uninstall

See the Uninstall Guide for removal instructions.

📄 Legal


Built with ❤️ by the community, forked from Google's Gemini CLI
