LLM and embeddings #1063
thebrahman started this conversation in General
Replies: 2 comments
-
Yes, I'm currently running both in one container. When I run it you can see I have two models, and when calling the embeddings API I set the model to the embedding one.
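For reference, LocalAI exposes an OpenAI-compatible API, so selecting the embedding model is just a matter of naming it in the request body. A minimal sketch of that request (the model name and port are assumptions, not taken from this thread — use whatever your own instance lists under `GET /v1/models`):

```python
import json

# Hypothetical values: substitute the embedding model name and port
# that your own LocalAI instance actually reports.
base_url = "http://localhost:8080/v1"
payload = {
    "model": "my-embedding-model",
    "input": "A sentence to embed",
}

# This JSON body is POSTed to {base_url}/embeddings; the "model"
# field selects which of the loaded models handles the request.
body = json.dumps(payload)
print(body)
```

The same container serves both endpoints, so switching between the LLM and the embedding model is purely a matter of the `model` field in each request.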
-
Wait, is that part of the UX of LocalAI or some other software?
-
Is it possible to run an LLM and an embedding model simultaneously from the same Docker container?
For example:

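(The original example didn't survive in this archive. As a hedged sketch of what such a setup might look like: LocalAI loads one YAML definition per model from its models directory, so a single container can serve both. All names below are hypothetical placeholders.)

```yaml
# Hypothetical file: models/llm.yaml
name: my-llm
backend: llama-cpp
parameters:
  model: some-llm-weights.gguf
---
# Hypothetical file: models/embedding.yaml
name: my-embedding-model
backend: sentencetransformers
embeddings: true
parameters:
  model: all-MiniLM-L6-v2
```

With both files present, the container should list both models under `/v1/models` and route each request by its `model` field.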