feat(llm): add support for custom OpenAI-compatible API base URL #24
Open
chinmay-sh wants to merge 2 commits into calesthio:master from chinmay-sh/oai-compatible
Conversation
Allow configuration of LLM_BASE_URL environment variable to support self-hosted or alternative OpenAI-compatible API endpoints. This enables flexibility for users who need to use different API providers or local deployments while maintaining the same interface.
Contributor
This should realistically be extensible enough to support other OpenAI-compatible interfaces like Ollama or other self-hostable options. See:
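As a concrete illustration of that idea, pointing the provider at a local Ollama instance could look like this in `.env`. The host and port below are Ollama's defaults and the `/v1` suffix is its OpenAI-compatible API path; adjust both for your own deployment:

```shell
# Hypothetical .env entry: route LLM traffic to a local Ollama server
# that exposes its OpenAI-compatible API under /v1.
LLM_BASE_URL=http://localhost:11434/v1
```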
timoteuszelle added a commit to timoteuszelle/Crucix that referenced this pull request on Mar 20, 2026
Merges PR calesthio#24 from chinmay-sh/oai-compatible. Adds LLM_BASE_URL env var to allow pointing Crucix at any OpenAI-compatible endpoint such as Ollama, enabling fully self-hosted LLM analysis without external API dependencies. Original PR: calesthio#24
Owner
I haven't been able to get to this yet because work has been busy, but I definitely plan to review it over the weekend.
This pull request adds support for customizing the base URL used by the OpenAI provider, allowing integration with OpenAI-compatible APIs hosted at non-default endpoints. The main changes involve updating configuration files and refactoring the provider initialization logic to use the new `baseUrl` parameter.

Configuration enhancements:

- Added the `LLM_BASE_URL` environment variable to `.env.example` to allow overriding the default OpenAI API endpoint.
- Updated `crucix.config.mjs` to read the `baseUrl` value from the environment, making it configurable for the LLM provider.

Provider logic updates:

- Updated `createLLMProvider` in `lib/llm/index.mjs` to pass the `baseUrl` parameter to the OpenAI provider when initializing.
- Updated the `OpenAIProvider` class in `lib/llm/openai.mjs` to use the configurable `baseUrl` for API requests, defaulting to `https://api.openai.com/v1` if not specified.
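The provider wiring described above can be sketched as follows. This is a minimal illustration, not the actual Crucix code: the `createLLMProvider` and `OpenAIProvider` names come from the PR summary, but the constructor shape, the `completionUrl` helper, and the config fields are assumptions.

```javascript
// Sketch of the refactored provider wiring (hypothetical internals).
const DEFAULT_OPENAI_BASE_URL = "https://api.openai.com/v1";

class OpenAIProvider {
  constructor({ apiKey, baseUrl }) {
    this.apiKey = apiKey;
    // Fall back to the official OpenAI endpoint when no override is configured.
    this.baseUrl = baseUrl || DEFAULT_OPENAI_BASE_URL;
  }

  // Build the full URL for a chat-completion request, tolerating a
  // trailing slash in the configured base URL.
  completionUrl() {
    return `${this.baseUrl.replace(/\/$/, "")}/chat/completions`;
  }
}

function createLLMProvider(config) {
  // Pass the configurable baseUrl through to the provider.
  return new OpenAIProvider({ apiKey: config.apiKey, baseUrl: config.baseUrl });
}

// Default endpoint when LLM_BASE_URL is unset:
console.log(createLLMProvider({ apiKey: "sk-test" }).completionUrl());
// → https://api.openai.com/v1/chat/completions

// Self-hosted endpoint, e.g. Ollama's OpenAI-compatible API:
console.log(
  createLLMProvider({ apiKey: "ollama", baseUrl: "http://localhost:11434/v1" }).completionUrl()
);
// → http://localhost:11434/v1/chat/completions
```

Keeping the default inside the provider (rather than in the config layer) means callers that never set `LLM_BASE_URL` see no behavior change.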