Hi, I wanted to use Ollama as my local LLM, but I'm hosting it in a different Docker container than my app.
When I try to connect to Ollama from my app, I get the following error, which is expected since they're running in different containers:

```
LLM error: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f622c1ead90>: Failed to establish a new connection: [Errno 111] Connection refused'))
```
Here is my code:
Ideally, I would be able to instantiate `OllamaLLM` and set the base URL, something like the following (assuming my container is called "ollama"). However, `OllamaLLM` hardcodes the URL (https://github.com/aurelio-labs/semantic-router/blob/main/semantic_router/llms/ollama.py#L52), which makes sense given that it's meant to run locally.
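For concreteness, here is a minimal, self-contained sketch of the usage I'm after. This is not semantic-router's actual class — `MiniOllamaLLM` and its parameters are illustrative — but it shows a `base_url` that defaults to the current localhost behavior while allowing a Docker service name like "ollama" to be passed in:

```python
import json
import urllib.request


class MiniOllamaLLM:
    """Illustrative stand-in for OllamaLLM with a configurable base URL."""

    def __init__(self, name: str = "llama3", base_url: str = "http://localhost:11434"):
        self.name = name
        # Strip any trailing slash so URL joining stays predictable.
        self.base_url = base_url.rstrip("/")

    @property
    def chat_url(self) -> str:
        return f"{self.base_url}/api/chat"

    def __call__(self, messages: list) -> str:
        payload = {"model": self.name, "messages": messages, "stream": False}
        req = urllib.request.Request(
            self.chat_url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=60) as resp:
            return json.load(resp)["message"]["content"]


# From another container on the same Docker network, target the
# "ollama" service by name instead of localhost:
llm = MiniOllamaLLM(base_url="http://ollama:11434")
```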
I think a simple fix would be to add a `base_url` arg. Here's a draft PR on my fork with the change: https://github.com/prbarcelon/semantic-router/pull/1/files
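Sketched against a simplified stand-in (the real diff is in the PR; everything here other than `base_url` is paraphrased rather than the actual signature), the change is just threading a constructor argument through to where the URL is currently hardcoded:

```python
class OllamaLLMSketch:
    """Simplified stand-in showing the proposed base_url argument."""

    def __init__(self, llm_name: str = "llama3", base_url: str = "http://localhost:11434"):
        self.llm_name = llm_name
        # Defaulting to localhost preserves the current behavior for
        # existing callers who don't pass base_url.
        self.base_url = base_url

    def chat_endpoint(self) -> str:
        # Previously the equivalent of a hardcoded
        # "http://localhost:11434/api/chat"; now derived from base_url.
        return f"{self.base_url}/api/chat"
```

Because `base_url` defaults to the current hardcoded value, the change would be backwards compatible: local usage is unaffected, and users running Ollama in a separate container can override it.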
What are the team's thoughts? Thank you!