Display the provider/engine being used to run the model #783
Comments
That would be cool, interested in opening a PR? It would need to fail silently if the graphic is not available.
The interface comes from llama.cpp's llama-run, FWIW.
Yes, this is why I added the "or"
where one line is added before the llama-run prompt.
Couldn't we add a `llama-run --prompt "🐋 >"` ...?
Maybe; I'm not sure if the underlying library for this (linenoise.cpp) handles utf8/emoji. Of course it could be added 😊 But it might not be as simple as it first seems.
Would be a nice feature though.
It looks like it works if I replace "hello" in your forked repo https://github.com/ericcurtin/linenoise.cpp
Ha, cool. The reason I wasn't sure is that there are several versions of this patch that are not merged. Since linenoise.cpp is C++, converting everything to std::string might be enough; you've already proved it works for the prompt part, so enough may already be enabled for that library's prompt feature.
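As a quick illustration of why utf8/emoji support is non-trivial for a byte-oriented line editor, here is a small shell sketch using the prompt string proposed above (the byte counts are the point, not the tooling):

```shell
# A byte-oriented buffer (like std::string in linenoise.cpp) can carry the
# emoji prompt through unchanged, but any cursor math that assumes one byte
# per terminal column will be off: the whale emoji alone is 4 UTF-8 bytes.
prompt='🐋 > '
printf '%s' "$prompt" | wc -c   # byte count: 7 (4 for the emoji, 3 for " > ")
```

So simply storing and echoing the prompt works, which matches what the forked-repo test showed; it is the column/cursor bookkeeping that may need extra work.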
Today, if I launch the command on my Mac:
if my podman machine is started, it'll deploy it to podman, and if it's stopped, it'll use llama-run. So depending on when I run the exact same instruction, it's not using the same backend.
I should prefix all my commands with
but it says 'container engine', so I'm not sure it's a reliable flag, as I could imagine it would still default to the llama-run command. So IMHO it would be nice if I got a small prompt
so I know it's using podman to run my model
or
or anything that would let me easily know where my model is running
(using --debug produces way too many logs)