API Configuration - Bug Report #36

Open
arashatt opened this issue Mar 19, 2025 · 4 comments

Comments

@arashatt

Hello,
I am using ollama llama3.2 model.
The API uses the /api/generate endpoint to produce responses, but baibot only recognizes /chat/completions. Shouldn't it infer the type and paths of the given API?

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

Running The Model

docker exec ollama ollama run llama3.2

Testing The Model

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'

Baibot Configuration

     - id: ollama
       provider: ollama
       config:
         base_url: "http://188.245.96.102:11434/api/generate"
         api_key: null

         text_generation:
           model_id: "llama3.2"
           prompt: "You are a brief, but helpful bot called {{ baibot_name }} powered by the {{ baibot_model_id }} model. The date/time of this conversation's start is: {{ baibot_conversation_start_time_utc }}."
           temperature: 1.0
           max_response_tokens: 4096
           max_context_tokens: 128000

Problem

The problem is that baibot sends requests to /api/generate/chat/completions instead of /api/generate.
It automatically appends /chat/completions to the configured endpoint.

[screenshot attached]

@spantaleev
Contributor

base_url is meant to point to the base URL of an OpenAI-compatible API. The agent will then call the API endpoints (paths) that it needs, one of which is /chat/completions.

It cannot use your custom endpoint.
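
For example, pointing base_url at the API root rather than at a specific endpoint would look something like this; for Ollama's OpenAI-compatible API that root is typically the /v1 prefix (a sketch based on your config above, not tested):

     - id: ollama
       provider: ollama
       config:
         base_url: "http://188.245.96.102:11434/v1"
         api_key: null

         text_generation:
           model_id: "llama3.2"
           # remaining text_generation settings unchanged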

@arashatt
Author

Ollama models use this endpoint. I've provided the procedure to run the llama3.2 model above.
By the way, thank you for your clarification.

@arashatt
Author

Take a look at the Ollama API endpoints:
https://github.com/ollama/ollama/blob/main/docs/api.md

@spantaleev
Contributor

baibot does not support Ollama's native API, but Ollama also provides an OpenAI-compatible API, as documented here.
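
You can verify that this OpenAI-compatible endpoint is reachable before pointing baibot at it, with something along these lines (assuming Ollama's default /v1 prefix and your host from above):

curl http://188.245.96.102:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}]
  }'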

Our sample baibot config file prepares an ollama configuration for an Ollama service running on Docker Compose, so you can take inspiration either from that or from the docs/sample-provider-configs/ollama.yml file used for creating Ollama agents dynamically (via chat).
