Add Local Models #531
base: main
Conversation
2.Fix Message think errors
liyan seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. You have signed the CLA already but the status is still pending? Let us recheck it.
mrge found 4 issues across 3 files. View them in mrge.io
```diff
@@ -223,6 +224,7 @@ async def get_next_action(self, input_messages: list[BaseMessage]) -> AgentOutpu
         ai_content = ai_message.content

         try:
             ai_content = re.sub(r"<think>.*?</think>", "", ai_content, flags=re.DOTALL).strip()
```
Missing comment explaining the purpose of removing tags from the AI response
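The tag stripping flagged above can be sketched as a small helper that carries the comment the reviewer asks for. The function name is hypothetical; the PR inlines this logic in `get_next_action`:

```python
import re

def strip_think_tags(ai_content: str) -> str:
    """Remove <think>...</think> blocks from a model response."""
    # Reasoning models such as QwQ emit their chain of thought inside
    # <think> tags; strip them so only the final answer is parsed.
    try:
        return re.sub(r"<think>.*?</think>", "", ai_content, flags=re.DOTALL).strip()
    except TypeError:
        # Content may be a list of message parts rather than a plain string;
        # in that case, pass it through unchanged.
        return ai_content
```

`re.DOTALL` is what lets the pattern span multi-line reasoning blocks; without it, `.` stops at newlines and the tags survive.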
```diff
@@ -180,7 +181,22 @@ def get_llm_model(provider: str, **kwargs):
         return ChatOpenAI(
             api_key=api_key,
             base_url=base_url,
-            model_name=kwargs.get("model_name", "Qwen/QwQ-32B"),
+            model=kwargs.get("model_name", "Qwen/QwQ-32B"),
```
Parameter name changed from model_name to model for the siliconflow provider, indicating a bug fix. This suggests there may be similar inconsistencies in other providers.
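One way to avoid this class of inconsistency is a small normalization helper shared by every provider branch. This is a hypothetical sketch, not code from the PR; it assumes each branch ultimately needs a single model string:

```python
def normalize_model_kwarg(kwargs: dict, default_model: str) -> str:
    """Resolve the model name from caller kwargs, accepting either keyword."""
    # Accept both "model" and "model_name" so every provider branch can
    # construct ChatOpenAI with the same, current parameter name ("model").
    return kwargs.get("model") or kwargs.get("model_name") or default_model
```

Each branch would then call `normalize_model_kwarg(kwargs, "Qwen/QwQ-32B")` (or its own default) instead of repeating the `kwargs.get(...)` lookup with a per-branch keyword.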
```python
        return ChatOpenAI(
            api_key=api_key,
            base_url=base_url,
            model=kwargs.get("model_name", ""),
```
Empty string as default model name for local provider could cause API errors if the model name is required
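A guard along these lines would address the empty-default concern by failing fast with a clear message instead of forwarding `""` to the endpoint. This is an illustrative sketch, not part of the PR:

```python
def require_model_name(kwargs: dict) -> str:
    """Return the model name for the local provider, refusing an empty value."""
    # An empty model name would only surface later as an opaque API error
    # from the locally hosted server; raise a descriptive error up front.
    model = (kwargs.get("model_name") or "").strip()
    if not model:
        raise ValueError("The 'local' provider requires an explicit model_name.")
    return model
```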
```python
            model=kwargs.get("model_name", "Qwen/QwQ-32B"),
            temperature=kwargs.get("temperature", 0.0),
        )
    elif provider == "local":
```
Local provider implementation duplicates API key and base_url extraction logic
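The duplication noted here could be factored into one shared lookup used by all OpenAI-compatible branches. A minimal sketch, assuming kwargs take precedence over environment variables; the env var naming scheme is illustrative, not the repo's actual one:

```python
import os

def resolve_endpoint(provider: str, kwargs: dict) -> tuple[str, str]:
    """Resolve (api_key, base_url) once, instead of per provider branch."""
    # kwargs win over environment variables; env var names are derived
    # from the provider name purely for illustration.
    api_key = kwargs.get("api_key") or os.getenv(f"{provider.upper()}_API_KEY", "")
    base_url = kwargs.get("base_url") or os.getenv(f"{provider.upper()}_ENDPOINT", "")
    return api_key, base_url
```

With this in place, the `local` branch and the existing providers would all start with `api_key, base_url = resolve_endpoint(provider, kwargs)`.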
Add Local Models
Summary by mrge
Added support for local model integration and fixed message parsing issues. These changes allow users to connect to locally hosted models and improve message handling by removing "think" tags.
New Features
- Added a "local" provider option so users can connect to locally hosted models.
Bug Fixes
- Stripped "think" tags from model responses before message parsing.