What needs to be done
Create a `docker-compose.yml` that runs the full pipeline with Ollama in one command.
Proposed approach
```yaml
services:
  ollama:
    image: ollama/ollama
    # pull model on first start
  pipeline:
    build: .
    depends_on: [ollama]
    environment:
      LLM_BASE_URL: http://ollama:11434/v1/chat/completions
    ports:
      - "8780:8780"
```
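The `# pull model on first start` comment leaves startup ordering open. One possible sketch (the volume name, healthcheck command, and timings here are assumptions, not part of the proposal) is to cache models in a named volume and add a healthcheck so `depends_on` can gate on actual readiness rather than container start:

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-data:/root/.ollama   # cache pulled models across restarts
    healthcheck:
      test: ["CMD", "ollama", "list"]  # succeeds once the server answers
      interval: 5s
      retries: 12
  pipeline:
    depends_on:
      ollama:
        condition: service_healthy
volumes:
  ollama-data:
```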
Also needs:
- A `Dockerfile` for the pipeline
- An entrypoint script that waits for Ollama to be ready
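The entrypoint script could look something like this minimal sketch (the `OLLAMA_URL` default, the one-second poll interval, and the 60-attempt budget are assumptions; adjust to the real service name and startup time):

```shell
#!/bin/sh
# entrypoint.sh -- hypothetical wait-for-Ollama entrypoint sketch.

# Poll a URL until it responds, giving up after $2 attempts, one second apart.
wait_for() {
  url=$1; tries=${2:-60}; i=0
  until curl -sf "$url" >/dev/null 2>&1; do
    i=$((i + 1))
    [ "$i" -ge "$tries" ] && return 1
    sleep 1
  done
}

# When invoked with a command, block until Ollama answers, then hand off.
if [ $# -gt 0 ]; then
  wait_for "${OLLAMA_URL:-http://ollama:11434}" 60 || {
    echo "Ollama did not become ready" >&2
    exit 1
  }
  exec "$@"
fi
```

In the pipeline's Dockerfile this would be wired up as `ENTRYPOINT ["/entrypoint.sh"]` with the pipeline's start command as `CMD`.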
Example usage
```sh
docker compose up
# → Ollama + pipeline ready at http://localhost:8780
```
Acceptance criteria
- `docker compose up` works from a fresh clone