[Feature] llm Options #307
Comments
+1 — this would also be nice to use with OpenRouter, for many reasons (e.g. rate limits, consolidated billing, and experimenting with different models to weigh output quality against cost).
LiteLLM is another option. I second the request for Gemini and other model options.
This would be a great addition. We're price-sensitive, so it would help a lot.
+1 for this. I wish to use the DeepSeek API, which is cheaper than the current option with roughly similar capabilities.
I would also like to be able to use DeepSeek, as it's open source and can be downloaded. It should be one of the defaults together with the other two.
Background
I wish to experiment with other LLMs like Tongyi or Gemini, but it's currently not possible.
stagehand/lib/llm/LLMProvider.ts, line 62 at fecd42e
Aspiration
Provide a solution akin to browser-use, allowing the integration of custom models.
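As a rough illustration of what such an extension point could look like, here is a minimal sketch of a provider-agnostic client interface. All names here (`LLMClient`, `ChatMessage`, `OpenAICompatibleClient`) are hypothetical and not Stagehand's actual API; the assumption is that OpenRouter, DeepSeek, and similar services expose OpenAI-compatible `/chat/completions` endpoints, so one client configured with a base URL and key could cover them all.

```typescript
// Hypothetical sketch of a pluggable LLM client. These names are
// illustrative only and do not reflect Stagehand's real interfaces.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface LLMClient {
  modelName: string;
  createChatCompletion(messages: ChatMessage[]): Promise<string>;
}

// One client pointed at any OpenAI-compatible endpoint: OpenRouter,
// DeepSeek, or a locally hosted server could all be configured here.
class OpenAICompatibleClient implements LLMClient {
  constructor(
    public modelName: string,
    private baseURL: string,
    private apiKey: string,
  ) {}

  async createChatCompletion(messages: ChatMessage[]): Promise<string> {
    const res = await fetch(`${this.baseURL}/chat/completions`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ model: this.modelName, messages }),
    });
    const data = await res.json();
    return data.choices[0].message.content;
  }
}
```

A library-level `llmClient` option accepting any `LLMClient` implementation, similar to how browser-use accepts custom model objects, would let users swap providers without forking.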
Benefits