
Added support for local LLM API #2

Open
Jhabkin wants to merge 2 commits into wierdbytes:master from Jhabkin:local-llm-support

Conversation


Jhabkin commented Apr 7, 2026

I'm using Claude Code with a local LLM running on llama-cpp, and I wanted to look inside the requests to see how much useless context is being passed to the model.
What has been added:

  • On startup, the server reads the endpoint to be used with Claude Code from settings
  • If ANTHROPIC_BASE_URL is not configured, the Anthropic API is used as the default
  • All requests are now intercepted and proxied to the configured endpoint
  • Added a UI element showing which backend is in use: the Anthropic API or a local LLM
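A minimal sketch of the fallback logic the bullets above describe. The function and constant names here are hypothetical, not taken from this PR's actual code; the only setting assumed from the PR is `ANTHROPIC_BASE_URL`.

```python
import os

# Hypothetical default; the official Anthropic API endpoint.
ANTHROPIC_DEFAULT = "https://api.anthropic.com"

def resolve_base_url(env=os.environ):
    """Pick the endpoint to proxy requests to.

    If ANTHROPIC_BASE_URL is unset or empty, fall back to the
    Anthropic API, as the PR description says.
    """
    url = env.get("ANTHROPIC_BASE_URL", "").strip()
    return url.rstrip("/") if url else ANTHROPIC_DEFAULT

def backend_label(base_url):
    # Drives the UI element: shows which backend is currently in use.
    return "Anthropic API" if base_url == ANTHROPIC_DEFAULT else "local LLM"
```

For example, with `ANTHROPIC_BASE_URL=http://localhost:8080/` (a llama-cpp server), `resolve_base_url` would return `http://localhost:8080` and the UI would show "local LLM"; with the variable unset, requests go to the Anthropic API.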

Jhabkin force-pushed the local-llm-support branch from 1f40360 to 7b4d27a on April 7, 2026 at 21:10
