# Ask rUVnet - Environment Variables
# Copy this to .env and fill in your actual values

# ============================================
# LLM API Keys (set at least one)
# ============================================
# The app auto-detects all available providers and creates a fallback chain.
# Default chain: groq-free → groq-paid → openai → anthropic → together → openrouter → deepseek
# If one provider is rate-limited, it automatically tries the next.
#
# Set LLM_PROVIDER to override the primary provider (default: groq-free first).
# Options: groq, openai, anthropic, together, openrouter, deepseek
# LLM_PROVIDER=openai

# Groq FREE tier (1M tokens/day) - burns free quota first, then falls through to paid
# Get key at: https://console.groq.com
GROQ_API_KEY=your-groq-free-key-here

# Groq PAID tier - only used after the free tier is exhausted
# Can be a different key from a paid org, or the same key if your account is paid
GROQ_PAID_API_KEY=your-groq-paid-key-here
# GROQ_MODEL=llama-3.3-70b-versatile  # Optional: override model for both tiers

# OpenAI - Get key at: https://platform.openai.com/api-keys
OPENAI_API_KEY=your-openai-key-here
# OPENAI_MODEL=gpt-4o  # Optional: override default model

# Anthropic Claude - Get key at: https://console.anthropic.com
# Accepts either CLAUDE_API_KEY or ANTHROPIC_API_KEY
CLAUDE_API_KEY=your-claude-key-here
# ANTHROPIC_MODEL=claude-sonnet-4-20250514  # Optional: override default model

# Together AI
TOGETHER_API_KEY=your-together-key-here

# OpenRouter (multi-model access)
OPENROUTER_API_KEY=your-openrouter-key-here

# DeepSeek
DEEPSEEK_API_KEY=your-deepseek-key-here

# Perplexity (not used for chat; reserved for future use)
PERPLEXITY_API_KEY=your-perplexity-key-here

# Google Gemini (video processing only; not used for chat)
GOOGLE_GEMINI_API_KEY=your-gemini-api-key-here

# ============================================
# SERVER CONFIGURATION
# ============================================
# Server port (default: 3000)
PORT=3000

# Node environment
NODE_ENV=development

# Public URL (for Railway deployment)
PUBLIC_URL=https://your-app.up.railway.app

# ============================================
# DATABASE & KNOWLEDGE BASE
# ============================================
# Ruflo database path (relative to project root).
# RUFLO_DB_PATH takes priority; CLAUDE_FLOW_DB_PATH is kept for backward compatibility.
RUFLO_DB_PATH=.swarm/memory.db
CLAUDE_FLOW_DB_PATH=.swarm/memory.db

# Force use of transformer models for embeddings (required)
FORCE_TRANSFORMERS=true

# ============================================
# RAILWAY DEPLOYMENT
# ============================================
# These are set automatically by Railway; no need to add them manually:
# - RAILWAY_ENVIRONMENT
# - RAILWAY_SERVICE_ID
# - RAILWAY_PROJECT_ID
# - RAILWAY_DEPLOYMENT_ID

# ============================================
# NOTES
# ============================================
# 1. NEVER commit .env to Git (it's in .gitignore)
# 2. For Railway/Render: add these variables in the deployment dashboard
# 3. Minimum required: at least ONE of OPENAI_API_KEY, CLAUDE_API_KEY, or GROQ_API_KEY
# 4. Recommended: OPENAI_API_KEY + LLM_PROVIDER=openai (or CLAUDE_API_KEY + LLM_PROVIDER=anthropic)
# 5. The app builds a fallback chain from ALL available keys automatically
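# ============================================
# QUICK START (sketch)
# ============================================
# A minimal local setup, shown as shell commands. The `npm start` command is
# an assumption about this project's start script, not confirmed by this file:
#
#   cp .env.example .env
#   # edit .env and set at least one key, e.g. OPENAI_API_KEY
#   npm start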