Support for OpenRouter AI #6
Comments
Yes! I am evaluating OpenRouter and ModelBox for this. I've found that these providers do not always give the same behaviour as the original APIs, so I will thoroughly test them first.
I'd also propose integrating with OpenAI's API as well, so that we can call o3 models directly. For example, after a Gemini analysis of the files to include, we could have a second repomix call that only includes the relevant files and asks o3-mini-high for a step-by-step implementation plan for Sonnet to actually perform.
Yeah, this is a separate feature though, for sure.
These are the original APIs; that's why it's got "router" in the name. Maybe you picked the wrong provider of the model in your test? AI itself never "gives the same behaviour" (unless you fix a starting seed and temperature), of course.
@gitcnd OpenRouter transforms everything into an "OpenAI-compatible" Completion API. These providers are not originally OpenAI compatible. Can you share an example of calling Google Gemini on OpenRouter using Google Gemini's API SDK? Or calling Anthropic using Anthropic's SDK?
You don't; you just set the model to "google/gemini-2.0-pro-exp-02-05:free" or "anthropic/claude-3-5-sonnet-20241022", call their OpenAI-compatible endpoint, and there you go.
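For context, a minimal sketch of what "OpenAI compatible" means in practice, using only the standard library (assumptions: the usual `/chat/completions` request shape, an `OPENROUTER_API_KEY` environment variable, and the model names quoted above; only the model string changes per provider):

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat endpoint (same request shape as OpenAI's).
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build a chat-completion request; the provider is chosen purely by `model`."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        OPENROUTER_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    # The same code path works for Google and Anthropic models alike.
    req = build_chat_request(
        "google/gemini-2.0-pro-exp-02-05:free",
        "Say hello.",
        os.environ.get("OPENROUTER_API_KEY", "sk-placeholder"),
    )
    with urllib.request.urlopen(req) as resp:  # requires a real key and network access
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Swapping in `"anthropic/claude-3-5-sonnet-20241022"` needs no other change, which is the whole pitch of the unified endpoint.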
Yes, I understand that. They cannot be "the original APIs" and ALSO be "OpenAI compatible" at the same time; this was my point to the previous poster. I am sure OpenRouter is brilliant, but I'd still like to thoroughly test it. This issue is rapidly becoming forum warfare, and I will close it unless contributions are actively helpful to implementing the feature, which I absolutely do want to implement ASAP.
I don't understand why this conversation is even taking place? Are you not using "Cursor" yourself? |
Just so that everyone knows I was right: Google Gemini's search grounding and code execution functionality is not working via OpenRouter when following their docs. I've now spent a bunch of time testing, documenting, and feeding that back to OpenRouter, and it's not able to deliver parity with using Google's APIs directly.
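For anyone reproducing this kind of parity test, here is a hedged sketch of the native Google request that exercises search grounding (assumptions: the public `generateContent` REST shape and the Gemini 2.0 `google_search` tool field; exact field names vary across API versions, so treat this as illustrative rather than canonical):

```python
import json
import urllib.request

# Native Gemini REST endpoint; the model is part of the URL path.
GEMINI_URL = "https://generativelanguage.googleapis.com/v1beta/models/{model}:generateContent"

def build_grounded_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build a generateContent request that enables Google Search grounding.

    The `tools` block below is exactly the kind of provider-specific feature
    that an OpenAI-compatible proxy has to translate (or silently drop).
    """
    payload = json.dumps({
        "contents": [{"parts": [{"text": prompt}]}],
        "tools": [{"google_search": {}}],  # assumed Gemini 2.0 grounding field
    }).encode("utf-8")
    return urllib.request.Request(
        GEMINI_URL.format(model=model),
        data=payload,
        headers={"x-goog-api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )
```

A parity check then amounts to sending the same prompt through this endpoint and through OpenRouter and comparing whether grounding metadata comes back in both responses.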
Feature Request: OpenRouter AI Integration - One Key for Multiple Models
First of all, thanks for the great project!
Currently, using multiple AI models from different providers (Perplexity, Google) requires registering and managing API keys for each service individually. This can be cumbersome and inefficient.
I propose adding support for OpenRouter AI. OpenRouter acts as a unified API for accessing various AI models through a single API key. This would allow users to:
- Register and manage one API key instead of one per provider
- Access models from multiple providers (e.g., Perplexity, Google) through a single unified endpoint
Is this a good idea? Please provide feedback and discuss the feasibility of implementing this feature.