Skip to content

feat: add openai-compatible provider for local LLMs (LM Studio, Ollama)#55

Open
cspenn wants to merge 3 commits intocalesthio:masterfrom
cspenn:feature/openai-compatible-local-llm
Open

feat: add openai-compatible provider for local LLMs (LM Studio, Ollama)#55
cspenn wants to merge 3 commits intocalesthio:masterfrom
cspenn:feature/openai-compatible-local-llm

Conversation

@cspenn
Copy link

@cspenn cspenn commented Mar 19, 2026

Summary

  • Adds a new openai-compatible provider alias that accepts a custom LLM_BASE_URL, enabling local LLM runtimes like LM Studio and Ollama without an API key
  • Existing openai provider is completely unchanged
  • Authorization header is omitted automatically when no API key is set

Usage

LLM_PROVIDER=openai-compatible
LLM_MODEL=your-local-model-name
LLM_BASE_URL=http://localhost:1234/v1/chat/completions

Works with any OpenAI-compatible endpoint (LM Studio, Ollama, LiteLLM, vLLM, etc.).

Changes

  • crucix.config.mjs — reads LLM_BASE_URL from env
  • lib/llm/index.mjs — threads baseUrl through factory; adds openai-compatible case alias
  • lib/llm/openai.mjs — uses configurable URL; omits Authorization header when no key
  • .env.example — documents LLM_BASE_URL
  • README.md — documents openai-compatible provider with example config
  • test/llm-openai.test.mjs — 16 unit tests covering all new behaviour (all passing)

Test plan

  • node --test test/llm-openai.test.mjs — all 16 tests pass
  • node --test test/llm-*.test.mjs — all existing tests still pass
  • Manual: set LLM_PROVIDER=openai-compatible + LLM_BASE_URL=http://localhost:1234/v1/chat/completions with LM Studio running

Adds LLM_BASE_URL env var and an openai-compatible provider alias so users
can point Crucix at any OpenAI-compatible local endpoint without an API key.
The existing openai provider is unchanged.

- crucix.config.mjs: reads LLM_BASE_URL from env
- lib/llm/index.mjs: threads baseUrl through factory; adds openai-compatible alias
- lib/llm/openai.mjs: uses configurable URL, omits Authorization header when no key
- .env.example / README.md: documents LLM_BASE_URL and openai-compatible provider
- test/llm-openai.test.mjs: 16 unit tests covering all new behaviour
@cspenn cspenn requested a review from calesthio as a code owner March 19, 2026 21:48
cspenn added 2 commits March 19, 2026 18:13
…iles

Adds unit and integration tests for all previously untested source files,
achieving 100% file coverage (54/54 source files) with 412 passing tests.

New test files:
- test/llm-provider.test.mjs
- test/llm-anthropic.test.mjs
- test/llm-gemini.test.mjs
- test/llm-codex.test.mjs
- test/llm-ideas.test.mjs
- test/delta-engine.test.mjs
- test/delta-memory.test.mjs
- test/alerts-telegram.test.mjs
- test/alerts-discord.test.mjs
- test/utils-fetch.test.mjs
- test/utils-env.test.mjs
- test/i18n.test.mjs
- test/config.test.mjs
- test/source-{acled,adsb,bls,bluesky,comtrade,eia,epa,firms,fred,gdelt,
  gscpi,kiwisdr,noaa,ofac,opensanctions,opensky,patents,reddit,reliefweb,
  safecast,ships,space,telegram,treasury,usaspending,who,yfinance}.test.mjs
- test/briefing.test.mjs
- test/save-briefing.test.mjs
- test/dashboard-inject.test.mjs
- test/server.test.mjs
- test/diag.test.mjs
- test/clean.test.mjs

Also:
- package.json: add "test" script (node --test test/*.test.mjs)
- .gitignore: add docs/, data/, input/, logs/, src/, temp/, tests/
@calesthio
Copy link
Owner

Review findings from local verification:

  1. The new full test suite is not portable yet. Several tests hard-code the author machine path /Users/cspenn/Documents/github/Crucix, which breaks outside that environment. I hit this in test/clean.test.mjs, test/diag.test.mjs, test/save-briefing.test.mjs, and test/server.test.mjs.

  2. The new npm test entrypoint is unreliable on Windows. package.json uses node --test test/*.test.mjs, and in this environment npm test exited immediately with code 1 and no TAP output, even though invoking the test runner directly did execute.

  3. Runtime behavior for the existing openai provider changes in this PR, despite the summary saying it is unchanged. With LLM_PROVIDER=openai and LLM_BASE_URL set, requests can be routed to the custom endpoint instead of OpenAI. That is a material behavior change and should either be isolated to openai-compatible or explicitly documented as intentional.

  4. The .gitignore additions are too broad. Ignoring docs/ and tests/ will make future project docs and test additions easy to miss and hard to stage normally.

What I ran locally:

  • node --test test/llm-openai.test.mjs -> passed
  • npm test -> exited 1 immediately
  • node --test test/*.test.mjs -> 399 pass / 6 fail / 7 canceled

I did not find a new DOM/script injection issue in this PR. The main concerns are test portability, test entrypoint reliability, and the openai routing behavior.

@calesthio
Copy link
Owner

I haven't been able to get to this yet because work has been busy, but I definitely plan to review it over the weekend.

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

2 participants