Commit b7d3887

Running with local ollama example
1 parent 0f0b5b0 commit b7d3887

File tree

1 file changed (+18, -1 lines)

README.md

Lines changed: 18 additions & 1 deletion
@@ -141,7 +141,24 @@ export OPENAI_BASE_URL="https://your-api-provider.com/v1"
 export OPENAI_API_KEY="your-api-key-here"
 ```
 
-This way, any 3rd party API can be used, including local models.
+This way, any 3rd party API can be used, such as OpenRouter, Groq, local models, etc.
+
+### Ollama
+
+For example, to run the bot using Ollama:
+
+1. Set up the environment:
+
+```
+export OPENAI_BASE_URL=http://localhost:11434/v1
+export OPENAI_API_KEY=ollama
+```
+
+2. Run the bot command:
+
+```
+python bot.py --model llama3 --reference-models llama3 --reference-models mistral
+```
 
 ## Evaluation
 
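The two Ollama steps in the diff point the bot's OpenAI-compatible client at the local server. A minimal stdlib-only sketch of what that configuration amounts to (no request is sent here; the chat-completions payload shape is an assumption based on the standard OpenAI API, which Ollama's `/v1` endpoint mirrors):

```python
import os

# Step 1 from the diff: environment setup. Ollama's OpenAI-compatible
# endpoint ignores the key's value, but clients require one to be set.
os.environ.setdefault("OPENAI_BASE_URL", "http://localhost:11434/v1")
os.environ.setdefault("OPENAI_API_KEY", "ollama")

base_url = os.environ["OPENAI_BASE_URL"]

# Step 2 (`python bot.py --model llama3 ...`) maps onto chat-completions
# requests against this endpoint with a payload like the one below.
# Actually sending it requires a running Ollama server with llama3 pulled.
endpoint = f"{base_url}/chat/completions"
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello!"}],
}
print(endpoint)
```

This only illustrates where the two environment variables end up; the bot itself builds equivalent requests through its OpenAI client.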