it's better to support ollama #379
YinYongHongYork started this conversation in Ideas
-
Do you have a plan to support Ollama? Thanks in advance.
Replies: 1 comment
-
hey, as mentioned here, I will not be putting time into supporting multiple model providers. These will have to come from the community, and they will be used at your own discretion :) We have had successful contributions for Azure OpenAI and an ongoing one for Deepseek. Ollama has an added complexity, as you would most likely need GPUs to run reasonable models, in which case using Mac runners with the MLX backend or an external Nvidia GPU (which I cannot test) is preferred.
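For contributors exploring this route: Ollama exposes an OpenAI-compatible API at `/v1`, so a provider built on an OpenAI-style client can often be pointed at a local Ollama server with little more than a base-URL change. A minimal sketch, assuming the `openai` Python package (v1+) and a locally pulled model; the model name `llama3` is illustrative, not something this project prescribes:

```python
# Minimal sketch: talk to a local Ollama server through its
# OpenAI-compatible endpoint using the `openai` Python client.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's default address
    api_key="ollama",  # placeholder; Ollama ignores the key but the client requires one
)

# "llama3" is illustrative: use any model you have pulled via `ollama pull`.
response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

This sidesteps the need for a dedicated Ollama provider in many cases, though as noted above, running reasonable models locally still assumes suitable hardware (Apple Silicon via MLX, or an Nvidia GPU).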