This repository provides a Docker Compose setup for running Ollama with Traefik as a reverse proxy, making it accessible for Cursor AI integration.
Make sure you have the following prerequisites installed on your machine:
- Docker with Docker Compose
- Traefik (configured as your reverse proxy)
- Cursor IDE
Traefik is essential in this setup for several reasons:
- Secure Access: Traefik provides HTTPS termination, ensuring that all communications between Cursor and your Ollama instance are encrypted.
- Domain Routing: It lets you expose Ollama through a custom domain, which Cursor integration requires, since the LLM must be reachable from Cursor's servers.
- Authentication: Traefik can handle authentication layers, adding security to your LLM endpoint.
- Load Balancing: If you scale your Ollama instances, Traefik can distribute the load effectively.
Without Traefik, you would need to manually configure SSL certificates and routing, which can be complex and error-prone.
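As a sketch, Traefik is typically wired in through container labels in the Compose file. The service name, image, router name, port, and network below are illustrative assumptions, not this repository's actual configuration; check its `docker-compose.yml` for the real values:

```yaml
# Illustrative sketch only -- service, router, and network names are
# assumptions; consult the repository's docker-compose.yml for the real ones.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    labels:
      - "traefik.enable=true"
      # Route chat.<your domain> to this container over HTTPS.
      - "traefik.http.routers.webui.rule=Host(`chat.${DOMAIN}`)"
      - "traefik.http.routers.webui.entrypoints=websecure"
      - "traefik.http.routers.webui.tls.certresolver=letsencrypt"
      - "traefik.http.services.webui.loadbalancer.server.port=8080"
    networks:
      - traefik
```

With labels like these, Traefik discovers the container automatically and handles certificate issuance and routing, which is exactly the manual work it spares you.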
- Clone the repository:

  ```shell
  git clone https://github.com/pezzos/ollama-docker.git
  ```

- Change to the project directory:

  ```shell
  cd ollama-docker
  ```

- Configure your domain in the `.env` file:

  ```shell
  DOMAIN=your-domain.com
  ```

- Start Ollama and its dependencies using Docker Compose:

  ```shell
  docker-compose up -d
  ```
Visit https://chat.your-domain.com (the `chat` subdomain of the domain you configured in `.env`) in your browser to access OpenWebUI.
Navigate to Settings -> Models and install a model (e.g., llava-phi3). This may take a few minutes; afterward, you can use it much like ChatGPT.
- Create an API key:
  - Go to OpenWebUI > Settings > Account > API key
  - Create a new key if you don't have one
  - Save this key securely
- Add a model:
  - Install the model you want to use (e.g., deepseek-coder:1.3b)
  - Wait for the installation to complete
  - Verify the model is working in the web UI
- Configure Cursor:
  - Open Cursor settings
  - In the AI models section:
    - Add the exact model name (case-sensitive, e.g., deepseek-coder:1.3b)
    - Disable other models to ensure you're using the right one
  - In the OpenAI configuration:
    - Paste your OpenWebUI API key
    - Set the API URL: https://chat.your-domain.com/api
  - Test the connection by trying a simple prompt
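To sanity-check the endpoint outside Cursor, you can call OpenWebUI's OpenAI-compatible API directly. The sketch below assumes the `chat` subdomain, an OpenAI-style `/chat/completions` route under `/api`, and placeholder credentials; substitute your real domain, key, and model name:

```python
import json
import urllib.request

# Placeholders -- substitute your own domain, API key, and model.
BASE_URL = "https://chat.your-domain.com/api"
API_KEY = "sk-your-openwebui-key"

def build_chat_request(prompt: str,
                       model: str = "deepseek-coder:1.3b") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the OpenWebUI API."""
    body = json.dumps({
        "model": model,  # must match the installed model name exactly
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send the request against a live instance:
#   with urllib.request.urlopen(build_chat_request("Say hello.")) as resp:
#       print(resp.read().decode())
```

If the request returns a JSON completion, the same URL and key should work from Cursor's OpenAI configuration.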
To stop the containers and remove the network:

```shell
docker-compose down
```

To completely remove all data (including models):

```shell
docker-compose down -v
```
We welcome contributions! If you'd like to contribute to the Ollama Docker Traefik Setup:
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
Please ensure your code follows our coding standards and includes appropriate documentation.
This project is licensed under the MIT License. Feel free to use, modify, and distribute it according to the terms of the license. We appreciate attribution and mentions in derivative works.
If you encounter any issues or have questions:
- Check the existing issues on GitHub
- Create a new issue with detailed information about your problem
- Join our community discussions