fix(nixos): fix prebuilt binaries via LD_LIBRARY_PATH + skip source build on read-only install #574
jerome-benoit wants to merge 1 commit into tobi:main
Conversation
Pull request overview
This PR addresses NixOS/immutable-root failures when node-llama-cpp attempts to write compiled llama.cpp artifacts into a read-only node_modules path (e.g. under /nix/store) by redirecting build output to a user-writable cache directory.
Changes:
- Add `resolveLlamaBuildDir()` (plus the `QMD_LLAMA_BUILD_DIR` override) to compute a user-writable llama build directory following XDG cache conventions.
- Patch `node-llama-cpp/dist/config.js` during the Nix build to honor `NODE_LLAMA_CPP_LOCAL_BUILDS_DIR`, and set that env var in the Nix wrapper before JS starts.
- Document the fix in `CHANGELOG.md` and ignore generated `node-llama-cpp-*.tgz` artifacts.
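The XDG-based directory resolution described above can be sketched as follows. This is a minimal illustration, not the PR's actual implementation: the helper name `resolveLlamaBuildDir` and the `QMD_LLAMA_BUILD_DIR` override come from the PR, but the precedence order and the `qmd/llama-builds` subpath are assumptions.

```typescript
import * as os from "node:os";
import * as path from "node:path";

// Resolve a user-writable llama build directory.
// Assumed precedence: explicit QMD_LLAMA_BUILD_DIR override first,
// then $XDG_CACHE_HOME, then ~/.cache as the XDG fallback.
function resolveLlamaBuildDir(
  env: Record<string, string | undefined> = process.env
): string {
  if (env.QMD_LLAMA_BUILD_DIR) {
    return env.QMD_LLAMA_BUILD_DIR;
  }
  const cacheRoot = env.XDG_CACHE_HOME ?? path.join(os.homedir(), ".cache");
  return path.join(cacheRoot, "qmd", "llama-builds");
}
```

Taking the env as a parameter (defaulting to `process.env`) keeps the helper testable without mutating global state.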
Reviewed changes
Copilot reviewed 3 out of 4 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| src/llm.ts | Adds a helper + default constant for resolving a writable llama build directory (XDG-based, env-overridable). |
| flake.nix | Applies a build-time patch to node-llama-cpp config and sets NODE_LLAMA_CPP_LOCAL_BUILDS_DIR in the wrapper. |
| CHANGELOG.md | Adds an Unreleased “Fixes” entry describing the NixOS redirect behavior and env vars. |
| .gitignore | Ignores node-llama-cpp-*.tgz artifacts. |
Summary
On NixOS (and other immutable-root systems), `qmd embed` crashes with `EACCES` because node-llama-cpp tries to compile llama.cpp into its read-only `node_modules/` directory inside `/nix/store`. Even when source builds are skipped, the prebuilt binaries fail because they expect FHS paths like `/lib64/libc.so.6`, which don't exist on NixOS.

Two fixes:

1. `LD_LIBRARY_PATH` in the wrapper: the flake wrapper sets `LD_LIBRARY_PATH` to include Nix's glibc and libstdc++, covering what the prebuilt binaries need without modifying them.
2. Skip source builds on read-only installs (`canWriteLlamaDir()`).

Changes

- Set `LD_LIBRARY_PATH` in the wrapper. No binary modification needed: `LD_LIBRARY_PATH` takes precedence over RUNPATH for `dlopen`.
- Add `canWriteLlamaDir()`, which probes write access on node-llama-cpp's `llama/` directory. When unwritable, `ensureLlama()` passes `build: "never"` to `getLlama()`.

Closes #87
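The write-access probe can be sketched as follows. Only the helper name and its role come from the PR; the temp-file probe technique shown here is an assumption about how such a check is typically done.

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Probe whether a directory is writable by creating and removing a
// throwaway file. On NixOS, node-llama-cpp's llama/ directory lives
// under the read-only /nix/store, so the write attempt throws and
// the function returns false.
function canWriteLlamaDir(llamaDir: string): boolean {
  const probe = path.join(llamaDir, `.write-probe-${process.pid}`);
  try {
    fs.writeFileSync(probe, "");
    fs.unlinkSync(probe);
    return true;
  } catch {
    return false;
  }
}
```

The result would then drive the build mode, e.g. `getLlama({ build: canWriteLlamaDir(dir) ? "auto" : "never" })`; the `build: "never"` value is what the PR describes, while the `"auto"` fallback here is an assumption.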
Testing
- `nix build` succeeds
- `./result/bin/qmd status` shows the CPU backend loaded, no `EACCES`, no `NoBinaryFoundError`
- `./result/bin/qmd embed` completes successfully