[New Feature] Add direct support for Local Model Servers (Ollama, Lemonade, LM Studio) #59

@huhlig

Description

It would be great to add support for local/self-hosted LLM providers such as:

Ollama
LM Studio
Lemonade

This would allow NeverWrite to work with fully local models in addition to cloud APIs.
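All three of these servers expose an OpenAI-compatible HTTP API, so one integration could potentially cover them all. Below is a minimal sketch of what that might look like: it only builds the request URL and JSON body for a chat completion, without making a network call. The default ports (11434 for Ollama, 1234 for LM Studio, 8000 for Lemonade) are assumptions based on each server's documented defaults, and `build_chat_request` is a hypothetical helper, not part of NeverWrite.

```python
import json

# Assumed default base URLs for each local server's
# OpenAI-compatible /v1 API (configurable in practice).
LOCAL_PROVIDERS = {
    "ollama": "http://localhost:11434/v1",
    "lmstudio": "http://localhost:1234/v1",
    "lemonade": "http://localhost:8000/v1",
}

def build_chat_request(provider: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Return (url, body) for an OpenAI-style chat completion call."""
    base = LOCAL_PROVIDERS[provider]
    url = f"{base}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

url, body = build_chat_request("ollama", "llama3", "Summarize my note.")
```

Because the wire format is the same across providers, only the base URL (and the model name) would need to change per server.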

Why This Would Help

Many users are moving toward local inference for:

Privacy and data ownership
Offline usage
Lower API costs
Faster experimentation
Self-hosted/enterprise environments

NeverWrite looks to be an excellent upgrade to Obsidian, and local model support would make it even more flexible for users with privacy concerns.

Metadata

Assignees

No one assigned

Labels

help wanted (Extra attention is needed)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
