[FEATURE] #4

@BradKML

Description

💡 Feature Description

A clear and concise description of the feature you'd like to see.
Allow Gemini Code Flow (and Gemini CLI) to use OpenRouter or other OpenAI-API-compatible LLM routing tools (LiteLLM is BYOK; Ollama requires a local GPU), so that users are not fully bound to Google.

🤔 Problem or Use Case

Is your feature request related to a problem? Please describe.
Gemini is not universally accessible, whether due to geographic or financial constraints. Some models (e.g. DeepSeek and Qwen) are cheaper than Gemini and sometimes perform better (see https://livebench.ai/#/), so offering alternative providers would let more people use Gemini Code Flow.

💭 Proposed Solution

Possible adoption of the two routing tools mentioned above (OpenRouter and LiteLLM), both of which expose an OpenAI-compatible API; a minimal configuration sketch follows.
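
Since OpenRouter and a LiteLLM proxy both speak the OpenAI chat-completions API, the provider shim could be as small as the sketch below. This is only an illustration under stated assumptions, not Gemini Code Flow's actual code: the `LLM_BASE_URL`, `LLM_API_KEY`, and `LLM_MODEL` environment variables and the `deepseek/deepseek-chat` model id are placeholders, and it assumes a Node/TypeScript codebase with the `openai` package available.

```typescript
// Hypothetical sketch: routing completions through an OpenAI-compatible endpoint
// (OpenRouter shown; a LiteLLM proxy URL would work the same way).
import OpenAI from "openai";

const client = new OpenAI({
  // Assumption: base URL and key come from user config / environment,
  // falling back to the default Google path only when neither is set.
  baseURL: process.env.LLM_BASE_URL ?? "https://openrouter.ai/api/v1",
  apiKey: process.env.LLM_API_KEY,
});

export async function complete(prompt: string): Promise<string> {
  const response = await client.chat.completions.create({
    // Placeholder model id; any model exposed by the router could be used.
    model: process.env.LLM_MODEL ?? "deepseek/deepseek-chat",
    messages: [{ role: "user", content: prompt }],
  });
  return response.choices[0]?.message?.content ?? "";
}
```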

🔄 Alternatives Considered

✨ Additional Context

A similar question was raised in clduab11/gemini-flow#7.

🎯 Impact

Who would benefit from this feature?

  • New users
  • Experienced users
  • Developers
  • Documentation
  • Performance
  • Other: geoblocked users and token-poor users

Thanks for helping make Gemini Code Flow better! 🚀
