Fix exiting on llama-serve when user hits ^c #785
Conversation
Fixes: containers#753

Signed-off-by: Daniel J Walsh <[email protected]>
Reviewer's Guide by Sourcery

This PR fixes an issue where llama-serve would not shut down cleanly when the user hit Ctrl+C, by adding the '--init' flag to the container configuration. The change modifies the container argument list in the setup_container function, so that an init process inside the container forwards signals to the server and handles them correctly.

Sequence diagram for handling Ctrl+C with the '--init' flag:

```mermaid
sequenceDiagram
    actor User
    participant LlamaServe
    participant ContainerInit
    User->>LlamaServe: Send SIGINT (Ctrl+C)
    LlamaServe->>ContainerInit: Forward SIGINT (thanks to '--init')
    ContainerInit-->>LlamaServe: Proper signal handling invoked
    LlamaServe->>User: Graceful shutdown initiated
```
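The fix described above amounts to one extra argument in the container command line. A minimal Python sketch of the idea follows; the function name and the surrounding arguments are illustrative assumptions, not the project's actual setup_container code:

```python
def build_container_args(image, command):
    """Hypothetical sketch of building a container run command.

    The key change from this PR is the addition of "--init", which
    tells the container engine to run a small init binary as PID 1
    inside the container. That init forwards signals such as SIGINT
    (Ctrl+C) to the server process and reaps child processes.
    """
    args = [
        "podman", "run",
        "--rm",      # remove the container when it exits
        "-i",        # keep stdin open so the interactive session works
        "--init",    # forward signals and reap processes (the fix)
    ]
    args.append(image)
    args.extend(command)
    return args


# Example: inspect the generated command line.
print(build_container_args("example/llama-image", ["llama-server"]))
```

Without `--init`, the server process itself runs as PID 1 inside the container, and PID 1 ignores signals it has no explicit handler for, which is why Ctrl+C previously had no effect.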
Hey @rhatdan - I've reviewed your changes - here's some feedback:
Overall Comments:
- It might be good to add a comment explaining why the `--init` flag is necessary.
Here's what I looked at during the review
- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
@ericcurtin I tested this locally and it works great with the latest llama.cpp
"--init  Run an init binary inside the container that forwards signals and reaps processes" ... "that forwards signals". Makes sense, great find!
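For reference, the flag quoted above can be exercised directly with podman. This is a CLI sketch only; the image name and server command are placeholders, not the project's actual defaults:

```shell
# --init makes podman start a small init binary as PID 1 inside the
# container; it forwards SIGINT/SIGTERM to the workload and reaps
# zombies, so Ctrl+C now shuts the server down cleanly.
podman run --rm -it --init quay.io/example/llama-image llama-server
```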
llama.cpp folks found it for me.

Tested and works!
Fixes: #753
Summary by Sourcery
Bug Fixes: