Describe the bug
When copying a large path from an S3-based cache, the nix binary appears to buffer the entire download into memory. When the path is large enough (above approximately 3.5 GB), this also reliably causes nix to segfault.
I can only reproduce this with an S3-based Nix cache; with an HTTP cache, memory usage stays low and constant.
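To make the claim concrete, one way to observe the growth is to sample the resident set size of the nix process while the copy runs. A rough sketch; the s3://example-bucket URL and the store path are placeholders:
# sample the RSS of the copy once per second (URL and path are placeholders)
nix copy --to local --from 's3://example-bucket' /nix/store/...-large-random &
pid=$!
while kill -0 "$pid" 2>/dev/null; do
  ps -o rss= -p "$pid"   # RSS in kilobytes
  sleep 1
done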
Steps To Reproduce
cd "$(mktemp -d)"
dd if=/dev/urandom of=./random_4g.bin bs=1M count=4096
path=$(nix store add-path . --name large-random)
# copy path to s3 store
nix copy --from local --to <the s3 cache> "$path"
# delete it from the local store
nix store delete "$path"
# attempt to copy from s3 store
nix copy --to local --from <the s3 cache> "$path"
# experience segfault
# 74861 segmentation fault  nix copy --to local /nix/store/rv559vmhs7751xizmfnxk5bwyjhfizpa-large-random
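For the HTTP-cache comparison mentioned above, a local file:// binary cache served over HTTP is enough to reproduce the contrast. A sketch, assuming /tmp/binary-cache and port 8080 (both arbitrary):
# copy to a local file:// binary cache and serve it over plain HTTP
nix copy --from local --to 'file:///tmp/binary-cache' "$path"
(cd /tmp/binary-cache && python3 -m http.server 8080) &
server=$!
nix store delete "$path"
# copying from the http:// cache keeps memory usage low and constant
nix copy --to local --from 'http://localhost:8080' "$path"
kill "$server"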
Expected behavior
Nix uses a fixed amount of memory and does not segfault.
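As a side note, capping the process's address space (on Linux) turns the unbounded buffering into a clean allocation failure instead of a segfault, which can help confirm the diagnosis. A sketch; the 2 GiB cap is arbitrary:
# cap virtual memory (ulimit -v takes KiB) so the buffering fails
# with an out-of-memory error rather than a segfault
( ulimit -v $((2 * 1024 * 1024)); nix copy --to local --from 's3://example-bucket' "$path" )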
Metadata
I have also experienced this with nix 2.25.0.
Additional context
Checklist
Add 👍 to issues you find important.