feat: add openai-compatible provider for local LLMs (LM Studio, Ollama) #55
cspenn wants to merge 3 commits into calesthio:master from
Conversation
Adds an `LLM_BASE_URL` env var and an `openai-compatible` provider alias so users can point Crucix at any OpenAI-compatible local endpoint without an API key. The existing `openai` provider is unchanged.

- `crucix.config.mjs`: reads `LLM_BASE_URL` from env
- `lib/llm/index.mjs`: threads `baseUrl` through the factory; adds the `openai-compatible` alias
- `lib/llm/openai.mjs`: uses the configurable URL; omits the `Authorization` header when no key is set
- `.env.example` / `README.md`: document `LLM_BASE_URL` and the `openai-compatible` provider
- `test/llm-openai.test.mjs`: 16 unit tests covering all new behaviour
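The header behaviour described above can be sketched roughly as follows. This is an illustrative snippet, not the actual `lib/llm/openai.mjs` code; the function name `buildHeaders` is an assumption for the example:

```javascript
// Sketch: build request headers for an OpenAI-compatible endpoint.
// When no API key is configured (local LM Studio/Ollama endpoints),
// the Authorization header is omitted entirely rather than sent empty.
function buildHeaders(apiKey) {
  const headers = { 'Content-Type': 'application/json' };
  if (apiKey) {
    headers.Authorization = `Bearer ${apiKey}`;
  }
  return headers;
}
```

Omitting the header (rather than sending `Bearer` with an empty token) matters because some local servers reject malformed Authorization values.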
…iles
Adds unit and integration tests for all previously untested source files,
achieving 100% file coverage (54/54 source files) with 412 passing tests.
New test files:
- test/llm-provider.test.mjs
- test/llm-anthropic.test.mjs
- test/llm-gemini.test.mjs
- test/llm-codex.test.mjs
- test/llm-ideas.test.mjs
- test/delta-engine.test.mjs
- test/delta-memory.test.mjs
- test/alerts-telegram.test.mjs
- test/alerts-discord.test.mjs
- test/utils-fetch.test.mjs
- test/utils-env.test.mjs
- test/i18n.test.mjs
- test/config.test.mjs
- test/source-{acled,adsb,bls,bluesky,comtrade,eia,epa,firms,fred,gdelt,
gscpi,kiwisdr,noaa,ofac,opensanctions,opensky,patents,reddit,reliefweb,
safecast,ships,space,telegram,treasury,usaspending,who,yfinance}.test.mjs
- test/briefing.test.mjs
- test/save-briefing.test.mjs
- test/dashboard-inject.test.mjs
- test/server.test.mjs
- test/diag.test.mjs
- test/clean.test.mjs
Also:
- `package.json`: add a `"test"` script (`node --test test/*.test.mjs`)
- `.gitignore`: add `docs/`, `data/`, `input/`, `logs/`, `src/`, `temp/`, `tests/`
Review findings from local verification:
What I ran locally:
I did not find a new DOM/script injection issue in this PR. The main concerns are test portability, test entrypoint reliability, and the
I haven't been able to get to this yet because work has been busy, but I definitely plan to review it over the weekend.
Summary
- Adds an `openai-compatible` provider alias that accepts a custom `LLM_BASE_URL`, enabling local LLM runtimes like LM Studio and Ollama without an API key
- The existing `openai` provider is completely unchanged

Usage
Works with any OpenAI-compatible endpoint (LM Studio, Ollama, LiteLLM, vLLM, etc.).
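For example, a minimal `.env` for LM Studio might look like this (port 1234 is LM Studio's default; Ollama's OpenAI-compatible endpoint typically listens on 11434 — adjust to your setup):

```shell
# Point Crucix at a local OpenAI-compatible server; no API key required
LLM_PROVIDER=openai-compatible
LLM_BASE_URL=http://localhost:1234/v1/chat/completions
```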
Changes
- `crucix.config.mjs` — reads `LLM_BASE_URL` from env
- `lib/llm/index.mjs` — threads `baseUrl` through the factory; adds an `openai-compatible` case alias
- `lib/llm/openai.mjs` — uses the configurable URL; omits the `Authorization` header when no key is set
- `.env.example` — documents `LLM_BASE_URL`
- `README.md` — documents the `openai-compatible` provider with an example config
- `test/llm-openai.test.mjs` — 16 unit tests covering all new behaviour (all passing)

Test plan
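The factory change can be illustrated with a small sketch. Names and return shape here are assumptions for the example, not the real `lib/llm/index.mjs`; the point is that `openai-compatible` reuses the OpenAI request shape against a user-supplied URL, with the key optional:

```javascript
// Sketch: provider factory with an 'openai-compatible' alias that
// redirects the OpenAI-style client to a custom base URL.
function createProvider(name, { baseUrl, apiKey } = {}) {
  switch (name) {
    case 'openai':
      // Unchanged: hosted OpenAI endpoint, key required by the API.
      return { url: 'https://api.openai.com/v1/chat/completions', apiKey };
    case 'openai-compatible':
      // Same request shape, but pointed at LLM_BASE_URL; key is optional.
      return { url: baseUrl, apiKey: apiKey ?? null };
    default:
      throw new Error(`Unknown LLM provider: ${name}`);
  }
}
```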
- `node --test test/llm-openai.test.mjs` — all 16 tests pass
- `node --test test/llm-*.test.mjs` — all existing tests still pass
- Manually verified `LLM_PROVIDER=openai-compatible` + `LLM_BASE_URL=http://localhost:1234/v1/chat/completions` with LM Studio running