[FEAT]: Add backend support for using Fast Chat as the LLM Preference #940
Labels
- enhancement: New feature or request
- feature request
- Integration Request: Request for support of a new LLM, Embedder, or Vector database
What would you like to see?
Hey guys, I would love to be able to use FastChat (https://github.com/lm-sys/FastChat) as a backend for this. Since it is a locally hosted LLM server that exposes multiple APIs and is built to scale, it would be a great backend to support.
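For context, FastChat can serve models behind an OpenAI-compatible REST API (via `fastchat.serve.openai_api_server`), so integration could mostly reuse existing OpenAI-style request code pointed at a local base URL. A minimal sketch, assuming the server's default address of `http://localhost:8000/v1` and a hypothetical model name:

```python
import json
import urllib.request

# Default base URL when running fastchat.serve.openai_api_server locally (assumption).
FASTCHAT_BASE_URL = "http://localhost:8000/v1"


def build_chat_request(base_url: str, model: str, messages: list) -> tuple:
    """Build the endpoint URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return url, body


# "vicuna-7b-v1.5" is only an example model name; use whatever FastChat is serving.
url, body = build_chat_request(
    FASTCHAT_BASE_URL,
    "vicuna-7b-v1.5",
    [{"role": "user", "content": "Hello"}],
)

# Actually sending the request requires a running FastChat server:
# req = urllib.request.Request(url, data=body,
#                              headers={"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read().decode("utf-8"))
```

Because the wire format matches OpenAI's chat completions API, the existing OpenAI provider path could likely be parameterized with a configurable base URL rather than requiring a wholly new integration.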