Feature Request: Integrate Ollama for Local LLM Support
Overview
Add support for [Ollama](https://ollama.ai) to enable users to run local language models like Llama 3, Mistral, and others directly from Zola, enhancing privacy and reducing API costs.
Motivation
Currently, Zola supports cloud-based AI providers such as OpenAI and Mistral, which require API keys and internet connectivity. Adding Ollama support would:
- Allow users to run models locally on their own hardware
- Improve privacy, since prompts and responses never leave the user's machine
- Reduce API costs by removing the dependency on hosted providers
- Allow Zola to be used without an internet connection
Add Ollama environment variable to .env.local and documentation:
# Ollama configuration
OLLAMA_BASE_URL=http://localhost:11434 # Default URL for local Ollama
5. Add Ollama Health Check Endpoint (Optional)
Create a new API endpoint at app/api/ollama-health/route.ts:
exportasyncfunctionGET(req: Request){try{constollamaUrl=process.env.OLLAMA_BASE_URL||"http://localhost:11434"constresponse=awaitfetch(`${ollamaUrl}/api/tags`)if(!response.ok){returnnewResponse(JSON.stringify({error: "Ollama service is not responding correctly"}),{status: response.status})}constdata=awaitresponse.json()returnnewResponse(JSON.stringify({status: "available",models: data.models}),{status: 200,})}catch(err: any){returnnewResponse(JSON.stringify({error: "Failed to connect to Ollama service",details: err.message}),{status: 500})}}
Implementation Details
1. Add Ollama SDK
Install the community Ollama provider for the Vercel AI SDK (https://sdk.vercel.ai/providers/community-providers/ollama).
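Assuming the package is the `ollama-ai-provider` community provider documented at the link above, the provider can be instantiated once and pointed at the local Ollama server (the file path and export name below are only suggestions):

```ts
// lib/ollama.ts - sketch only, assuming the `ollama-ai-provider` community package.
import { createOllama } from "ollama-ai-provider"

// Point the provider at the local Ollama server; the provider's default base URL
// ends in /api, so the suffix is appended to the configured host here.
export const ollama = createOllama({
  baseURL: `${process.env.OLLAMA_BASE_URL || "http://localhost:11434"}/api`,
})
```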
2. Update Configuration
Modify `app/lib/config.ts` to add the Ollama models and provider, as sketched below.
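Zola's exact config structure may differ, but roughly something like the following, reusing the provider instance from step 1; the field names are guesses and need to be aligned with the existing OpenAI and Mistral entries:

```ts
// app/lib/config.ts - hypothetical additions; align with the file's real structure.
import { ollama } from "@/lib/ollama" // provider instance from the step 1 sketch

// Locally pulled Ollama models exposed in the model picker.
export const OLLAMA_MODELS = [
  { id: "llama3", name: "Llama 3 (local)", provider: "ollama", model: ollama("llama3") },
  { id: "mistral", name: "Mistral (local)", provider: "ollama", model: ollama("mistral") },
]
```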
3. Create Ollama Icon Component
Create a new file at `components/icons/ollama.tsx`:
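Something like the following placeholder component, with the real Ollama logo's SVG path dropped in:

```tsx
// components/icons/ollama.tsx - placeholder sketch; replace the shape with the Ollama logo path.
import type { SVGProps } from "react"

export function OllamaIcon(props: SVGProps<SVGSVGElement>) {
  return (
    <svg viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg" {...props}>
      {/* Placeholder shape; swap in the actual logo's path data. */}
      <circle cx="12" cy="12" r="10" stroke="currentColor" strokeWidth="2" />
    </svg>
  )
}
```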
4. Environment Variable Configuration
Add the Ollama environment variable to `.env.local` and to the documentation:
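```
# Ollama configuration
OLLAMA_BASE_URL=http://localhost:11434 # Default URL for local Ollama
```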
5. Add Ollama Health Check Endpoint (Optional)
Create a new API endpoint at `app/api/ollama-health/route.ts`:
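```ts
export async function GET(req: Request) {
  try {
    const ollamaUrl = process.env.OLLAMA_BASE_URL || "http://localhost:11434"
    // Ollama's /api/tags endpoint lists installed models and doubles as a liveness check.
    const response = await fetch(`${ollamaUrl}/api/tags`)

    if (!response.ok) {
      return new Response(
        JSON.stringify({ error: "Ollama service is not responding correctly" }),
        { status: response.status }
      )
    }

    const data = await response.json()
    return new Response(
      JSON.stringify({ status: "available", models: data.models }),
      { status: 200 }
    )
  } catch (err: any) {
    return new Response(
      JSON.stringify({
        error: "Failed to connect to Ollama service",
        details: err.message,
      }),
      { status: 500 }
    )
  }
}
```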
6. Docker Compose Integration
Update `docker-compose.yml` to add an Ollama service, for example:
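Roughly something like the following (the image name and port come from the Ollama docs; the volume name is just a suggestion):

```yaml
# docker-compose.yml - sketch only; merge into the existing services/volumes sections.
services:
  ollama:
    image: ollama/ollama          # official Ollama image
    ports:
      - "11434:11434"             # default Ollama API port
    volumes:
      - ollama-data:/root/.ollama # persist pulled models across container restarts

volumes:
  ollama-data:
```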
Expected Behavior
Documentation Updates Needed
Considerations
Related Issues
References