Conversation


@anthonywu commented Aug 27, 2025

Updates

  • Add support for self-hosted Firecrawl
  • Add support for the Ollama AI provider, and allow users to configure the model
  • Update .env.example with clear example values
  • Parse FIRECRAWL_API_URL and OLLAMA_HOST with some flexibility
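A minimal TypeScript sketch of what the "parse with some flexibility" bullet could look like: accept values with or without a scheme, strip trailing slashes, and fall back to the hosted defaults. The helper name, env-var defaults, and fallback URLs here are illustrative assumptions, not the PR's actual code:

```typescript
// Hypothetical helper: normalize a base-URL env var tolerantly.
function normalizeBaseUrl(raw: string | undefined, fallback: string): string {
  if (!raw || raw.trim() === "") return fallback;
  let value = raw.trim();
  // Tolerate values like "localhost:11434" that omit the scheme.
  if (!/^https?:\/\//i.test(value)) {
    value = `http://${value}`;
  }
  // new URL() throws on malformed input, surfacing config errors early.
  const url = new URL(value);
  // Drop trailing slashes so callers can safely append paths.
  return url.toString().replace(/\/+$/, "");
}

// Hypothetical usage with the env vars this PR introduces:
const firecrawlUrl = normalizeBaseUrl(
  process.env.FIRECRAWL_API_URL,
  "https://api.firecrawl.dev" // hosted default (assumption)
);
const ollamaHost = normalizeBaseUrl(
  process.env.OLLAMA_HOST,
  "http://localhost:11434" // Ollama's default port
);
```

With this shape, `OLLAMA_HOST=localhost:11434`, `OLLAMA_HOST=http://localhost:11434`, and `OLLAMA_HOST=http://localhost:11434/` would all resolve to the same base URL.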

Testing

  • The original hosted Firecrawl and hosted Groq workflow continues to work
  • The new self-hosted Firecrawl and local Ollama setup works at the initial search and LLM stages

Not in scope

  • Pre-existing ESLint warnings
  • Existing `any` type usage
  • Unnecessary distractions like whitespace changes (suppressed in my IDE to avoid noise 🙈)
  • Updating page elements or links to indicate local providers (I'd send a follow-up frontend PR if this one is accepted)

@anthonywu anthonywu changed the title Support Self Host Firecrawl and Ollama AI provider Feature for V2: Support Self Host Firecrawl and Ollama AI provider Aug 28, 2025
