diff --git a/daprdocs/content/en/reference/components-reference/supported-conversation/ollama.md b/daprdocs/content/en/reference/components-reference/supported-conversation/ollama.md
index f3ffbe0e78f..a0dbac727fc 100644
--- a/daprdocs/content/en/reference/components-reference/supported-conversation/ollama.md
+++ b/daprdocs/content/en/reference/components-reference/supported-conversation/ollama.md
@@ -34,6 +34,26 @@ The above example uses secrets as plain strings. It is recommended to use a secr
 | `model` | N | The Ollama LLM to use. Defaults to `llama3.2:latest`. | `phi4:latest` |
 | `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
 
+### OpenAI Compatibility
+
+Ollama is compatible with [OpenAI's API](https://ollama.com/blog/openai-compatibility). You can therefore use the OpenAI conversation component with Ollama models by making the following changes:
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+  name: ollama-openai
+spec:
+  type: conversation.openai # use the openai component type
+  metadata:
+  - name: key
+    value: 'ollama' # Ollama does not validate the key; any non-empty string works
+  - name: model
+    value: gpt-oss:20b # an Ollama model (https://ollama.com/search); here, OpenAI's open-weight model (https://ollama.com/library/gpt-oss)
+  - name: endpoint
+    value: 'http://localhost:11434/v1' # ollama endpoint
+```
+
 ## Related links
 
 - [Conversation API overview]({{< ref conversation-overview.md >}})
\ No newline at end of file
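
A quick way to verify the component works end to end is to call the conversation endpoint through the Dapr sidecar. This is a sketch, not part of the docs change: it assumes a sidecar exposing the HTTP API on port 3500, an Ollama server already running on `localhost:11434`, and the component name `ollama-openai` from the diff above.

```shell
# Sketch: invoke Dapr's alpha conversation API against the ollama-openai
# component defined above. Assumes a Dapr sidecar on localhost:3500 and a
# running Ollama server; adjust the port to match your sidecar configuration.
curl -s -X POST http://localhost:3500/v1.0-alpha1/conversation/ollama-openai/converse \
  -H "Content-Type: application/json" \
  -d '{"inputs":[{"content":"What is Dapr?"}]}'
```

If the component loaded correctly, the response contains the model's reply in the `outputs` array; otherwise the sidecar returns an error naming the component.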