API Configuration - Bug Report #36
Hello,

I am using the Ollama llama3.2 model.

The API uses the /api/generate endpoint to produce responses, but baibot only recognizes /chat/completions. Shouldn't it infer the type and paths of the given API?
Running The Model
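The exact commands were not preserved in this report. A minimal sketch of how the model is typically started, assuming a default local Ollama install listening on port 11434:

```sh
# Start the Ollama server (listens on http://localhost:11434 by default).
ollama serve

# In another terminal: pull and run the llama3.2 model.
ollama pull llama3.2
ollama run llama3.2
```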
Testing The Model
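The original test snippet is also missing. Testing against Ollama's native /api/generate endpoint would look roughly like this (request shape per Ollama's API documentation; host and port are assumptions):

```sh
# Ollama's native generate endpoint; note that this is NOT OpenAI-compatible.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```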
Baibot Configuration
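The actual configuration was not captured. Based on the URL in the Problem section below, it presumably looked something like this sketch; the field names follow baibot's OpenAI-compatible provider config, but treat the exact schema as an assumption:

```yaml
# Hypothetical agent definition pointing baibot at Ollama's NATIVE API.
# This base_url is what triggers the bug described below.
base_url: http://localhost:11434/api/generate
api_key: unused
text_generation:
  model_id: llama3.2
  prompt: You are a helpful assistant.
  temperature: 1.0
```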
Problem
The problem is that baibot sends requests to /api/generate/chat/completions rather than /api/generate: it automatically appends /chat/completions to the configured endpoint.
Comments

- It cannot use your custom endpoint.
- ollama models use this endpoint. I've provided the procedure to run model llama3.3.
- Take a look at Ollama API endpoints: …
- baibot does not support Ollama's API, but Ollama provides an OpenAI-compatible API, as documented here. Our sample baibot config file prepares …
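In other words, the resolution suggested in the comments is to point baibot at Ollama's OpenAI-compatible base URL rather than the native endpoint, so that the appended /chat/completions path resolves correctly. A hedged sketch, with the same schema caveats as above:

```yaml
# Ollama exposes an OpenAI-compatible API under /v1. baibot appends
# /chat/completions to this base_url, yielding the valid URL
# http://localhost:11434/v1/chat/completions.
base_url: http://localhost:11434/v1
api_key: unused
text_generation:
  model_id: llama3.2
```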