feat(llm): add support for custom OpenAI-compatible API base URL #24

Open

chinmay-sh wants to merge 2 commits into calesthio:master from chinmay-sh:oai-compatible
Conversation

@chinmay-sh

This pull request adds support for customizing the base URL used by the OpenAI provider, allowing integration with OpenAI-compatible APIs hosted at non-default endpoints. The main changes involve updating configuration files and refactoring the provider initialization logic to use the new baseUrl parameter.

Configuration enhancements:

  • Added an optional LLM_BASE_URL environment variable to .env.example to allow overriding the default OpenAI API endpoint.
  • Updated crucix.config.mjs to read the baseUrl value from the environment, making it configurable for the LLM provider.
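
A minimal sketch of the configuration side, assuming a small helper for readability (the function name `resolveBaseUrl` is illustrative; only the `LLM_BASE_URL` variable and the `https://api.openai.com/v1` default come from this PR):

```javascript
// Sketch: resolve the LLM base URL from the environment, as crucix.config.mjs
// might do after this PR. Falls back to the official OpenAI endpoint when
// LLM_BASE_URL is unset.
function resolveBaseUrl(env = process.env) {
  return env.LLM_BASE_URL ?? 'https://api.openai.com/v1';
}
```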

Provider logic updates:

  • Modified createLLMProvider in lib/llm/index.mjs to pass the baseUrl parameter to the OpenAI provider when initializing.
  • Updated the OpenAIProvider class in lib/llm/openai.mjs to use the configurable baseUrl for API requests, defaulting to https://api.openai.com/v1 if not specified.
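
Roughly, the provider change could look like the following sketch (the class name `OpenAIProvider` and the default endpoint come from the PR description; the constructor shape and the `endpoint` helper are assumptions, not the actual diff):

```javascript
// Sketch of an OpenAI-compatible provider that accepts a configurable baseUrl.
class OpenAIProvider {
  constructor({ apiKey, baseUrl } = {}) {
    this.apiKey = apiKey;
    // Default to the official OpenAI endpoint when no custom base URL is given
    this.baseUrl = baseUrl ?? 'https://api.openai.com/v1';
  }

  // Build a request URL, joining baseUrl and path without duplicate slashes
  endpoint(path) {
    return `${this.baseUrl.replace(/\/$/, '')}/${path.replace(/^\//, '')}`;
  }
}
```

Normalizing trailing/leading slashes in one place keeps request-building code identical whether the user configures `http://localhost:11434/v1` or `http://localhost:11434/v1/`.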

Allow configuration of LLM_BASE_URL environment variable to support
self-hosted or alternative OpenAI-compatible API endpoints. This enables
flexibility for users who need to use different API providers or
local deployments while maintaining the same interface.
@chinmay-sh chinmay-sh requested a review from calesthio as a code owner March 17, 2026 08:36
@Guardian259
Contributor

This should realistically be extensible enough to support other OpenAI-compatible interfaces such as Ollama or other self-hostable options.

See: #30
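
For example, with this PR's `LLM_BASE_URL` variable, pointing Crucix at a local Ollama instance could look roughly like this in `.env` (Ollama exposes an OpenAI-compatible API under `/v1` on its default port; the `LLM_API_KEY` line is an assumption about the key variable's name, not confirmed by this PR):

```shell
# Point the OpenAI provider at a local Ollama server instead of api.openai.com.
# Ollama serves an OpenAI-compatible API at /v1 (default port 11434).
LLM_BASE_URL=http://localhost:11434/v1
# Ollama ignores the key's value, but clients often require a non-empty string
# (variable name assumed here for illustration).
LLM_API_KEY=ollama
```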

timoteuszelle added a commit to timoteuszelle/Crucix that referenced this pull request Mar 20, 2026
Merges PR calesthio#24 from chinmay-sh/oai-compatible. Adds LLM_BASE_URL env var
to allow pointing Crucix at any OpenAI-compatible endpoint such as Ollama,
enabling fully self-hosted LLM analysis without external API dependencies.

Original PR: calesthio#24
@calesthio
Owner

I haven't been able to get to this yet because work has been busy, but I definitely plan to review it over the weekend.
