In the past I've used Ollama for local LLM inference. Would it be useful to add this support to the library? I'd be happy to work on adding the ability to point the LLM at an Ollama API endpoint. A rough sketch of what that could look like is below.
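As a minimal sketch (not tied to this library's actual interface, which I haven't looked at in detail yet), here's roughly what a non-streaming call against Ollama's default local endpoint looks like; the function name and defaults are just placeholders:

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def ollama_generate(prompt: str, model: str = "llama3", url: str = OLLAMA_URL) -> str:
    """Send a single prompt to a local Ollama server and return the completion text."""
    response = requests.post(
        f"{url}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    # With stream=False, Ollama returns one JSON object whose "response"
    # field holds the full generated text.
    return response.json()["response"]


if __name__ == "__main__":
    print(ollama_generate("Why is the sky blue?"))
```

The actual integration would presumably go through whatever LLM backend abstraction the library already has, with the endpoint URL and model name exposed as configuration. Happy to adapt to whatever structure the maintainers prefer.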