[Question]: Model Ollama cannot connect #333
Comments
Do you mean on the demo website or locally deployed?
Locally deployed. The error is as follows: hint: 102. As you know, Ollama is really popular now for local machines. OK, I got your message. Using the IP on the same LAN is key. Thanks.
Yes, I got the same problem as you. @mjiulee
I tried it again and, magically, it worked. Just change the URL to the IP address of the server where Ollama is installed, for example: http://192.168.0.100:11434, and then it's OK~
Really, thanks brother!
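A quick sanity check for the LAN-IP approach above (using the example address from that comment; substitute your own Ollama host's address) is to hit the Ollama root endpoint from another machine on the network:

```bash
# Example address from the comment above; replace with your Ollama host's LAN IP.
curl http://192.168.0.100:11434
# Expected response: "Ollama is running"
```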
@shaoxinghua0623 |
@matheospower |
Thank you for the answer! Unfortunately, this did not resolve my problem. Not sure if I need to open a new issue, but I will post it here. My problem is that I get stuck in the pop-up to add an Ollama model. I tested the Ollama service (it is running) from outside and inside the ragflow-server with curl, and it seems fine and can be reached. After setting the URL in the pop-up and clicking OK, it loads for some time and then gives me a connection time-out. Also, I cannot see anything useful in the logs. If anyone has had a similar issue or can give a hint on how to troubleshoot, please let me know!
Hi, same issue here... I tested using http://host.docker.internal:11434/ as the base URL (that is probably the way to go, especially in a Docker deployment model), but I got an error: "Hint 102: Fail to access model(/mistral). ERROR: [Errno -2] Name or service not known"...
Found a way to solve that issue: I had to change my Ollama settings to Environment="OLLAMA_HOST=PRIVATEIP" to get it exposed. It looks like if Ollama listens only on the loopback (127.0.0.1), RAGFlow is not able to reach it.
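For a systemd-based install, a minimal sketch of that change (assuming the stock ollama.service; binding to 0.0.0.0 exposes all interfaces, including loopback, rather than a single private IP):

```bash
# Add an override that sets the bind address, then restart the service.
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```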
Cool, will give it a try. FINDINGS: If I edit my ollama.service file, set Environment="OLLAMA_HOST=PRIVATEIP", and run systemctl start ollama.service, then in the browser PRIVATEIP:11434 shows "Ollama is running". Fine. But in the terminal, ollama list shows all the models are missing!!!
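One thing worth checking for the missing models (an assumption, not confirmed in this thread): the ollama CLI talks to 127.0.0.1:11434 by default, so once the server binds only to PRIVATEIP the client may no longer be talking to the same instance. The CLI honors the same OLLAMA_HOST variable:

```bash
# Point the CLI at the address the service now binds to (PRIVATEIP is a placeholder).
export OLLAMA_HOST=PRIVATEIP:11434
ollama list
```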
This didn't work for me either.
If you are in Docker and cannot connect to a service running on your host machine on a local interface or loopback, then inside Docker you need to replace the localhost part with host.docker.internal. For example, if Ollama is running on the host machine, bound to http://127.0.0.1:11434/, you should put http://host.docker.internal:11434 into the connection URL in the Ollama setup.
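Since the ragflow-server container was already being probed with curl earlier in this thread, the same check can confirm the host mapping works before filling in the UI:

```bash
# Run curl inside the RAGFlow container against the host-mapped address.
docker exec -it ragflow-server curl http://host.docker.internal:11434
# Expected response: "Ollama is running"
```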
I can solve this problem!
This suggestion is so crucial that it needs to be inserted in the /docs/ollama.md imho. |
How can I make it work on Linux lol. |
Once …
Can't you use 'http://localhost:11434' to connect to Ollama on the demo? You can only use 'http://localhost:11434' to connect to Ollama after local deployment, is that right? If I want to add ollama3 to the demo, what is the best way?
Hi, I run RAGFlow via Docker and Ollama locally on Windows. I set the URL to 'http://host.docker.internal:11434' and still get an error. Do you know what's going on? If you can, can you help me out?
@tslyellow On Windows, when running a Linux container in WSL, if you want to reach a port on the Windows host, you need to add … If it still does not work, it may be that …
Did you solve it?
Did you solve it?
Did you solve it?
Exactly the same steps, even a new system and a new machine. It feels like the problem is in their framework.
A few months later it magically stopped working again :(
Same problem here... |
To summarize the earlier suggestions: make sure the server is running the Ollama service, the firewall has port 11434 open, and external connections to that port are allowed. Good luck.
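On Ubuntu with ufw, opening that port might look like the following (a sketch assuming ufw is the firewall in use; adapt for firewalld or iptables):

```bash
# Allow external connections to Ollama's default port and verify the rule.
sudo ufw allow 11434/tcp
sudo ufw status
```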
You need to check how your Docker network, and therefore the container's network, is set up.
I made it work like this using the comments above: add an ollama service to docker-compose, then use docker inspect to find its IP, which can be entered in the RAGFlow UI; see the sketch below.
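The exact snippet was not preserved in the comment, so this is only a guess at what such a compose service could look like (image, mount, and port are assumptions), followed by the docker inspect call that reads the container's IP on the compose network:

```yaml
# Sketch: an ollama service added to RAGFlow's docker-compose.yml, on the same network.
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ./ollama:/root/.ollama
    restart: unless-stopped
```

```bash
# Read the container's IP on the compose network; enter it as <IP>:11434 in the RAGFlow UI.
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ollama
```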
Yes, on Ubuntu I configured it in the systemd service and restarted it to make it work.
So everyone is stuck here... Can't Ollama just be put directly inside RAGFlow's Docker container?
I got it working, guys. First, learn from this fellow:
This is the simplest way to run Ollama inside this Docker container.
Check the running Docker containers and note the name and IP of the Ollama one, then enter 172.xx.xx.xx:11434 on the web page. Once you see that, you're halfway there. Enter the Docker container,
then use it normally and pull the LLM you want.
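For example, pulling and listing models from inside the container could look like this (the container name ollama is an assumption; substitute the name shown by docker ps):

```bash
# Identify the container, then pull and list models inside it.
docker ps
docker exec -it ollama ollama pull mistral
docker exec -it ollama ollama list
```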
Thanks, but what should I do if I want to use the host machine's Ollama?
Thanks. Will this require pulling the models inside the Docker container?
Thanks. It returns: …
Could you kindly share the steps for 'put http://host.docker.internal:11434/ into the connection URL in OLLAMA setup'?
What does PRIVATEIP mean here?
Thank you, this method works great!
Describe your problem
But the LLM options are limited. I have an Ollama Mistral instance running at 127.0.0.1:11434 but cannot add Ollama as a model in RAGFlow. Please assist. This software is very good and flexible for document split/chunk/semantic processing for embedding. Many thanks.