Attempt to use build_llama_and_whisper.sh #795

Merged Feb 12, 2025 (1 commit)

Conversation

rhatdan (Member) commented Feb 12, 2025

Summary by Sourcery

Use a script to build llama.cpp and whisper.cpp in the CI workflow.

Enhancements:

  • Improve the build process for llama.cpp and whisper.cpp.

CI:

  • Build llama.cpp and whisper.cpp using the build_llama_and_whisper.sh script instead of building them directly in the workflow.
  • Run dnf_install and dnf -y clean all only if dnf is available.
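
The guard described in the second CI bullet can be sketched as follows. This is an illustrative reading of the change, not the script itself; the real logic lives in container-images/scripts/build_llama_and_whisper.sh, and `dnf_install` is that script's own helper:

```shell
# Illustrative sketch: dnf steps run only when the dnf binary exists on
# PATH, so the same script also works on images without dnf.
if command -v dnf > /dev/null 2>&1; then
  dnf_status="present"   # here the script would call dnf_install
else
  dnf_status="absent"    # skip package installation entirely
fi
echo "dnf is $dnf_status"
# ... clone_and_build_whisper_cpp / clone_and_build_llama_cpp ...
if [ "$dnf_status" = "present" ]; then
  : # here the script would run: dnf -y clean all
fi
```

The same `command -v` check gates both the install and the cleanup, so neither step can fail on a dnf-less base image.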

sourcery-ai bot (Contributor) commented Feb 12, 2025

Reviewer's Guide by Sourcery

This pull request streamlines the CI workflow by replacing manual build steps with a dedicated build script and includes safeguards in the build script to ensure that dnf commands are executed only when available.

Sequence diagram for streamlined CI build process

```mermaid
sequenceDiagram
    participant CI as CI Workflow
    participant BS as Build Script
    participant DNF as dnf (optional)
    participant W as Whisper Build
    participant L as Llama Build

    CI->>BS: Execute build_llama_and_whisper.sh
    BS->>BS: Configure common flags
    alt dnf is available
      BS->>DNF: command -v dnf (check availability)
      BS->>DNF: dnf_install
    else dnf is not available
      BS->>BS: Skip dnf_install
    end
    BS->>W: clone_and_build_whisper_cpp
    BS->>BS: Append '-DLLAMA_CURL=ON' flag
    BS->>L: clone_and_build_llama_cpp
    alt dnf is available
      BS->>DNF: command -v dnf (check availability)
      BS->>DNF: dnf -y clean all
    else dnf is not available
      BS->>BS: Skip dnf cleanup
    end
    BS->>BS: Remove caches and run ldconfig
```

File-Level Changes

Replace inline build commands with a dedicated build script in the CI configuration (.github/workflows/ci.yml):
  • Removed manual commands to clone, update submodules, and build llama.cpp from the CI workflow.
  • Replaced manual build steps with an invocation of the build_llama_and_whisper.sh script using sudo.

Introduce conditional checks in the build script for dnf operations (container-images/scripts/build_llama_and_whisper.sh):
  • Modified the script to check for the presence of dnf before running dnf_install.
  • Added a conditional check to ensure dnf exists before running the cleanup command (dnf clean all).
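
After this change, the CI build step collapses to a single script invocation, roughly of this shape (illustrative only; the exact step and invocation path are in .github/workflows/ci.yml):

```yaml
# Illustrative shape only; see .github/workflows/ci.yml for the real step.
- name: Build llama.cpp and whisper.cpp
  shell: bash
  run: |
    sudo ./container-images/scripts/build_llama_and_whisper.sh
```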


sourcery-ai bot (Contributor) left a comment:

Hey @rhatdan - I've reviewed your changes - here's some feedback:

Overall Comments:

  • Consider adding a comment to explain why the llama.cpp build steps were moved into a separate script.
  • It might be good to check for the existence of dnf before calling it, rather than relying on command -v dnf && dnf.

Here's what I looked at during the review:
  • 🟢 General issues: all looks good
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good


```diff
@@ -62,12 +62,7 @@ jobs:
         sudo apt-get update
         sudo apt-get install podman bats bash codespell python3-argcomplete pipx git cmake
```

Collaborator commented:

libcurl4-openssl-dev package will likely fix the build failure

```diff
@@ -132,10 +132,9 @@ main() {
     set_install_prefix
     local common_flags
     configure_common_flags
-    common_flags+=("-DGGML_CCACHE=OFF" "-DCMAKE_INSTALL_PREFIX=$install_prefix")
+    common_flags+=("-DGGML_CCACHE=OFF" "-DCMAKE_INSTALL_PREFIX=$install_prefix -DLLAMA_CURL=OFF")
```

Collaborator commented:

Ah you fixed it already, happy like this :)

A user might want to add this again (although RamaLama never uses this); we can just re-add if that happens.
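
An aside on the diff above: in bash, each quoted word inside `common_flags+=(...)` becomes one array element, so `"-DCMAKE_INSTALL_PREFIX=$install_prefix -DLLAMA_CURL=OFF"` reaches cmake as a single argument rather than two separate flags. A sketch of the difference (the `install_prefix` value is assumed for illustration):

```shell
install_prefix="/usr"   # assumed value for illustration
# One quoted word -> one array element (one argv word to cmake):
merged=("-DCMAKE_INSTALL_PREFIX=$install_prefix -DLLAMA_CURL=OFF")
# Two quoted words -> two array elements (two argv words to cmake):
split=("-DCMAKE_INSTALL_PREFIX=$install_prefix" "-DLLAMA_CURL=OFF")
echo "merged: ${#merged[@]} element(s), split: ${#split[@]} element(s)"
```

Whether the merged form matters in practice depends on how cmake parses the combined word; keeping each `-D` flag as its own element avoids the question entirely.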

```diff
@@ -60,14 +60,9 @@ jobs:
       shell: bash
       run: |
         sudo apt-get update
-        sudo apt-get install podman bats bash codespell python3-argcomplete pipx git cmake
+        sudo apt-get install podman bats bash codespell python3-argcomplete pipx git cmake curl
```

Collaborator commented:

What we actually want is this, curl is just the CLI tool: libcurl4-openssl-dev
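
The distinction the reviewer draws can be probed mechanically. A hypothetical check for the libcurl development headers on a Debian/Ubuntu runner (header paths assumed; the multiarch path is the common x86_64 layout):

```shell
# curl (the CLI) and libcurl4-openssl-dev (headers + library) are separate
# packages; building with -DLLAMA_CURL=ON needs the latter's curl/curl.h.
if [ -e /usr/include/curl/curl.h ] || [ -e /usr/include/x86_64-linux-gnu/curl/curl.h ]; then
  hdr="present"
else
  hdr="missing"   # would need: sudo apt-get install libcurl4-openssl-dev
fi
echo "libcurl headers: $hdr"
```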

ericcurtin (Collaborator) commented:

huggingface-cli failures

ericcurtin merged commit 8dd7ec4 into containers:main on Feb 12, 2025. 16 checks passed.