A ready-to-deploy Docker image that serves the Functionary LLM through an OpenAI-compatible API.
Warning
This model supports function calling. However, it requires some hacking on the client side, because the Functionary vLLM script does not respect the OpenAI API function-calling format. I will provide an example project showing how to achieve that in the coming days.
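Until that example project is ready, the sketch below shows one way to handle the parsing side of that hack. It assumes the server returns the model's function call as plain text inside the assistant message; the JSON-extraction heuristic (and the `extract_function_call` helper name) are illustrative assumptions, not Functionary's documented output format.

```python
# Sketch: pull a function call out of a raw assistant reply, assuming the
# server returns it as text (e.g. an embedded JSON object) rather than as
# OpenAI's structured `function_call` field. Heuristic only -- adjust it to
# the actual format the model emits.
import json
from typing import Optional

def extract_function_call(reply_text: str) -> Optional[dict]:
    """Return the first JSON object embedded in the reply, or None."""
    start = reply_text.find("{")
    end = reply_text.rfind("}")
    if start == -1 or end <= start:
        return None
    try:
        return json.loads(reply_text[start : end + 1])
    except json.JSONDecodeError:
        return None

# Example with a hypothetical reply that embeds a function call as text:
print(extract_function_call(
    'Calling: {"name": "get_weather", "arguments": {"city": "Istanbul"}}'
))
```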
Important
There is currently no way to set an API key, but I will work on adding that option as an environment variable. In the meantime, don't share your endpoint with anybody.
- Default port: `8000`
- Path: `/v1`
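To check that the endpoint above is reachable, you can query the model list that the vLLM server exposes at `/v1/models`. This is a minimal sketch assuming the container's port 8000 is published on localhost; swap in your own URL (e.g. the RunPod proxy address below) otherwise.

```python
# Quick connectivity check: list the models served by the container.
# Assumes port 8000 is published on localhost -- adjust the URL if not.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8000/v1/models") as resp:
    print(json.dumps(json.load(resp), indent=2))
```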
Example config for your OpenAI-compatible client (here using a RunPod endpoint):
{
  "model": "musabgultekin/functionary-7b-v1",
  "api_base": "https://[YOUR_CONTAINER_ID]-8000.proxy.runpod.net/v1",
  "api_key": "functionary", // Dummy API key, since it can't be `null`
  "api_type": "open_ai"
}
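For reference, here is a minimal sketch of the same settings applied with the pre-1.0 `openai` Python package, whose module-level `api_base`/`api_key`/`api_type` attributes this config style mirrors:

```python
# Pointing the openai<1.0 Python client at the container (values taken from
# the config above; replace [YOUR_CONTAINER_ID] with your RunPod container ID).
import openai

openai.api_type = "open_ai"
openai.api_base = "https://[YOUR_CONTAINER_ID]-8000.proxy.runpod.net/v1"
openai.api_key = "functionary"  # dummy key -- the server does not check it

reply = openai.ChatCompletion.create(
    model="musabgultekin/functionary-7b-v1",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(reply["choices"][0]["message"]["content"])
```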
The image ships with:

- Python 3.9
- Functionary LLM
- vLLM Server