
Support for OpenRouter AI #6

Open
valcho opened this issue Feb 7, 2025 · 11 comments

@valcho

valcho commented Feb 7, 2025

Feature Request: OpenRouter AI Integration - One Key for Multiple Models

First of all, thanks for the great project!

Currently, using multiple AI models from different providers (Perplexity, Google) requires registering and managing API keys for each service individually. This can be cumbersome and inefficient.

I propose adding support for OpenRouter AI. OpenRouter acts as a unified API for accessing various AI models through a single API key. This would allow users to:

  • Access multiple models with a single API key: Streamline the process of using different AI models without managing multiple accounts and keys.
  • Simplify configuration: Reduce the complexity of setting up and switching between different AI providers.
  • Potentially reduce costs: OpenRouter offers features like dynamic pricing and routing, which could lead to cost optimization.
  • Future-proof integration: As new models and providers are added to OpenRouter, they would become accessible through this integration.

Is this a good idea? Please provide feedback and discuss the feasibility of implementing this feature.
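
For illustration, here is a minimal sketch of what single-key access could look like. This is hypothetical, not existing project code: it assumes the official `openai` npm client pointed at OpenRouter's OpenAI-compatible base URL, and the helper name and prompts are made up (the model IDs are the ones mentioned later in this thread).

```typescript
// Hypothetical sketch: one OpenRouter key, models from two different providers.
import OpenAI from "openai";

const openrouter = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY, // single key for every model
});

// Hypothetical helper, not part of this project.
async function ask(model: string, prompt: string): Promise<string> {
  const completion = await openrouter.chat.completions.create({
    model,
    messages: [{ role: "user", content: prompt }],
  });
  return completion.choices[0]?.message?.content ?? "";
}

async function main() {
  // Same key, different upstream providers.
  console.log(await ask("google/gemini-2.0-pro-exp-02-05:free", "Summarise this repository."));
  console.log(await ask("anthropic/claude-3-5-sonnet-20241022", "Suggest one refactor."));
}

main().catch(console.error);
```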

@eastlondoner
Owner

eastlondoner commented Feb 7, 2025

Vote for it here: https://x.com/eastlondondev/status/1887777650225135775?s=46&t=WQgoGAHu8A1Z7ldKlndguw

@eastlondoner eastlondoner marked this as a duplicate of #16 Feb 9, 2025
@eastlondoner
Owner

Yes!

I am evaluating OpenRouter and ModelBox for this.

I’ve found that these providers do not always give the same behaviour as the original APIs, so I will thoroughly test them first.

@kjanoudi
Contributor

kjanoudi commented Feb 9, 2025

I'd also propose integrating with OpenAI's API as well, so that we can call o3 models directly. For example, after a Gemini analysis of the files to include, we could have a second repomix call that includes only the relevant files and asks o3-mini-high for a step-by-step implementation plan for Sonnet to actually perform.
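
As a rough sketch of what that second step could look like; this is hedged, not an actual implementation: the helper is hypothetical, and it assumes the official `openai` Node SDK and the `reasoning_effort` parameter OpenAI exposes for its o-series reasoning models.

```typescript
// Hypothetical sketch of the "ask o3-mini-high for a plan" step.
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// packedRepo would be the repomix output restricted to the relevant files.
async function planWithO3(packedRepo: string, task: string): Promise<string> {
  const completion = await openai.chat.completions.create({
    model: "o3-mini",
    reasoning_effort: "high", // "o3-mini-high" = o3-mini with high reasoning effort
    messages: [
      {
        role: "user",
        content: `${task}\n\nProduce a step-by-step implementation plan.\n\nRelevant files:\n${packedRepo}`,
      },
    ],
  });
  return completion.choices[0]?.message?.content ?? "";
}
```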

@eastlondoner
Owner

Yeah, this is a separate feature though, for sure.

@gitcnd

gitcnd commented Feb 10, 2025

I’ve found that these providers do not always give the same behaviour as the original APIs, so I will thoroughly test them first.

These are the original APIs; that's why it's got "router" in the name. Maybe you picked the wrong provider of the model in your test? AI itself never "gives the same behaviour" (unless you fix a seed and temperature), of course.

@eastlondoner
Owner

@gitcnd OpenRouter transforms everything into an "OpenAI-compatible" Completion API.

These providers are not originally OpenAI compatible.

Can you share an example of calling Google Gemini on OpenRouter using Google Gemini's API SDK? Or calling Anthropic using Anthropic's SDK?

@dperetti

dperetti commented Feb 10, 2025

Can you share an example of calling Google Gemini on OpenRouter using Google Gemini's API SDK? Or calling Anthropic using Anthropic's SDK?

You don't; you just set the model to "google/gemini-2.0-pro-exp-02-05:free" or "anthropic/claude-3-5-sonnet-20241022", call their OpenAI-compatible endpoint, and there you go.
I've been using OpenRouter for two weeks and it's fantastic.
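
A minimal illustration of that point, assuming nothing but `fetch` against OpenRouter's OpenAI-compatible chat completions endpoint (the helper name is made up; no provider-specific SDK involved):

```typescript
// Sketch: the request shape is the same for every model; only the "model" string changes.
async function callOpenRouter(model: string, prompt: string) {
  const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model, // e.g. "google/gemini-2.0-pro-exp-02-05:free" or "anthropic/claude-3-5-sonnet-20241022"
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await response.json();
  return data.choices?.[0]?.message?.content;
}
```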

@eastlondoner
Owner

You don't; you just set the model to "google/gemini-2.0-pro-exp-02-05:free" or "anthropic/claude-3-5-sonnet-20241022", call their OpenAI-compatible endpoint, and there you go.

Yes, I understand that. They cannot be "the original APIs" and ALSO be "OpenAI compatible" at the same time; that was my point to the previous poster.

I am sure OpenRouter is brilliant, but I'd still like to thoroughly test it.

This issue is rapidly becoming forum warfare, and I will close it unless contributions are actively helpful to implementing the feature, which I absolutely do want to implement ASAP.

@gitcnd

gitcnd commented Feb 10, 2025

I don't understand why this conversation is even taking place. Are you not using "Cursor" yourself?
Just click the "yolo" option, tell it to incorporate an OpenRouter option, and it will fix all that for you.

@eastlondoner
Owner

OK, @kjanoudi made an actual PR, thank you!

Let's focus the discussion on that.

@gitcnd is banned
#25

@eastlondoner
Owner

Just so that everyone knows, I was right: Google Gemini's search grounding and code execution functionality is not working via OpenRouter when following their docs. I've now spent a bunch of time testing, documenting, and feeding that back to OpenRouter, and it is not able to deliver parity with using Google's APIs directly.
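
For reference, this is the kind of provider-specific request being compared against: a hedged sketch (using Google's `@google/generative-ai` package; the model choice, helper name, and prompt are illustrative) of search grounding requested through Google's API directly.

```typescript
// Sketch: search grounding requested through Google's own SDK.
import { GoogleGenerativeAI } from "@google/generative-ai";

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);

const model = genAI.getGenerativeModel({
  model: "gemini-1.5-pro",
  tools: [{ googleSearchRetrieval: {} }], // Google-specific grounding tool
});

async function groundedAnswer(question: string): Promise<string> {
  const result = await model.generateContent(question);
  return result.response.text();
}
```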
