Refactor LLM settings #217
Conversation
Force-pushed from 90c4e1c to c767da2
Hello, thanks for your PR. Refactor PRs need to be reviewed later, because it takes time to test everything all over again.
Got it, thanks for the notice.
Force-pushed from 5b366f2 to e90c987
Force-pushed from 1213538 to be27a1b
Force-pushed from 0819d82 to 4a64e7b
Force-pushed from cac4bc3 to 8b98efb
Force-pushed from 6060152 to 3af76d0
Hi, @warmshao. Could you review this PR when possible? Thanks!
Force-pushed from 7910c5f to a88be4b
Force-pushed from a88be4b to ffe3de5
Thank you @marginal23326 for this well-organized refactoring of the LLM settings. Centralizing the provider configurations into a single dictionary is a great step towards improving code maintainability and simplifying the addition of new providers. The modifications in the test suite and environment file also seem well-aligned with the refactoring goals. We appreciate your contribution, and the maintainers will review the code shortly.
Force-pushed from ffe3de5 to 2e2db61
I moved all LLM provider configs (base URL, default model names, etc.) into a single dict, PROVIDER_CONFIGS. This should clean up utils.py (especially the get_llm_model function) and make it easier to add new providers.
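For illustration, the shape of this is roughly as sketched below. The provider names, env-var keys, and default models here are placeholders, not the PR's actual values:

```python
import os

# Hypothetical sketch of the centralized provider table; the actual keys,
# providers, and defaults in the PR may differ.
PROVIDER_CONFIGS = {
    "openai": {
        "base_url_env": "OPENAI_ENDPOINT",
        "api_key_env": "OPENAI_API_KEY",
        "default_model": "gpt-4o",
    },
    "anthropic": {
        "base_url_env": "ANTHROPIC_ENDPOINT",
        "api_key_env": "ANTHROPIC_API_KEY",
        "default_model": "claude-3-5-sonnet-20241022",
    },
    "ollama": {
        "base_url_env": "OLLAMA_ENDPOINT",
        "api_key_env": "OLLAMA_API_KEY",
        "default_model": "llama3",
    },
}


def get_llm_model(provider: str, model_name: str | None = None,
                  base_url: str | None = None, api_key: str | None = None) -> dict:
    """Resolve a provider's settings from the single PROVIDER_CONFIGS table.

    Explicit arguments win over environment variables, which win over the
    table's defaults. This sketch returns a plain dict of resolved settings;
    the real function would construct the provider-specific client from them.
    """
    config = PROVIDER_CONFIGS.get(provider)
    if config is None:
        raise ValueError(f"Unsupported provider: {provider!r}")
    return {
        "model": model_name or config["default_model"],
        "base_url": base_url or os.getenv(config["base_url_env"], ""),
        "api_key": api_key or os.getenv(config["api_key_env"], ""),
    }
```

With this layout, supporting a new provider is just another entry in PROVIDER_CONFIGS rather than another branch inside get_llm_model.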