
[BUG]: After update I get this error "Failed to obtain server version. Unable to check client-server compatibility. Set checkCompatibility=false to skip version check." #3477

Closed
giorgiocerruti opened this issue Mar 16, 2025 · 1 comment
Labels
possible bug Bug was reported but is not confirmed or is unable to be replicated.

Comments

@giorgiocerruti

How are you running AnythingLLM?

AnythingLLM desktop app

What happened?

Hello.

After updating AnythingLLM, which I use with Ollama on my local PC, I can no longer chat with any model.
Running it from the terminal, I get this error:

```
[backend] info: [Ollama] OllamaAILLM initialized with
model: assibotv1-llama32:latest
perf: base
n_ctx: 4096
Failed to obtain server version. Unable to check client-server compatibility. Set checkCompatibility=false to skip version check.
[backend] error: TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11457:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async fetchJson (/Applications/AnythingLLM.app/Contents/Resources/backend/node_modules/@qdrant/openapi-typescript-fetch/dist/cjs/fetcher.js:135:22)
    at async /Applications/AnythingLLM.app/Contents/Resources/backend/node_modules/@qdrant/js-client-rest/dist/cjs/api-client.js:46:26
    at async handler (/Applications/AnythingLLM.app/Contents/Resources/backend/node_modules/@qdrant/openapi-typescript-fetch/dist/cjs/fetcher.js:156:16)
    at async /Applications/AnythingLLM.app/Contents/Resources/backend/node_modules/@qdrant/js-client-rest/dist/cjs/api-client.js:32:24
    at async handler (/Applications/AnythingLLM.app/Contents/Resources/backend/node_modules/@qdrant/openapi-typescript-fetch/dist/cjs/fetcher.js:156:16)
    at async fetchUrl (/Applications/AnythingLLM.app/Contents/Resources/backend/node_modules/@qdrant/openapi-typescript-fetch/dist/cjs/fetcher.js:162:22)
    at async Object.fun [as clusterStatus] (/Applications/AnythingLLM.app/Contents/Resources/backend/node_modules/@qdrant/openapi-typescript-fetch/dist/cjs/fetcher.js:168:20)
    at async Object.connect (/Applications/AnythingLLM.app/Contents/Resources/backend/server.js:21:37523)
```
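The stack trace shows the Qdrant REST client (`@qdrant/js-client-rest`) failing its version check, and the message suggests `checkCompatibility=false` as a workaround. A minimal sketch (my own, not AnythingLLM's code, assuming Qdrant's default REST port 6333 and Node 18+ with global `fetch`) to check whether the server is reachable at all, independently of AnythingLLM:

```javascript
// Hedged sketch: manually probe the REST endpoint the trace is hitting,
// to distinguish "server unreachable" (fetch throws) from a genuine
// client-server version mismatch. QDRANT_URL and the default port 6333
// are assumptions; adjust to your setup.
const QDRANT_URL = process.env.QDRANT_URL || "http://localhost:6333";

async function probe(url) {
  try {
    const res = await fetch(url); // global fetch exists in Node 18+
    const body = await res.json();
    return `reachable: ${JSON.stringify(body)}`;
  } catch (err) {
    // An unreachable server surfaces here as "TypeError: fetch failed",
    // matching the stack trace above.
    return `fetch failed: ${err.message}`;
  }
}

probe(QDRANT_URL).then(console.log);
```

If this prints `fetch failed`, the problem is connectivity to the vector database endpoint rather than a version mismatch, and skipping the compatibility check would not help.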

Are there known steps to reproduce?

_No response_
@giorgiocerruti added the possible bug label on Mar 16, 2025
@timothycarambat
Member

This seems to be coming from Ollama directly, not AnythingLLM, so Ollama crashes on boot. Do you know if this model only works on the most recent version of Ollama? We are on an older version than the current release at this time, and if this model relies on some recent update or change, that could be the reason.
