
Copying from an S3 cache buffers the entire download in memory, causes segfault #12403

@andrewhamon

Description


Describe the bug

When copying a large path from an S3-based cache, the nix binary appears to buffer the entire download in memory.

When the path is large enough (above approximately 3.5 GB), this also reliably causes nix to segfault.

I can only reproduce this with an S3-based Nix cache. With an HTTP cache, memory usage stays low and constant.

Steps To Reproduce

cd $(mktemp -d)
dd if=/dev/urandom of=./random_4g.bin bs=1M count=4096
path=$(nix store add-path . --name large-random)

# copy path to s3 store
nix copy --from local --to <the s3 cache> $path

# delete it from the local store
nix store delete $path

# attempt to copy from s3 store
nix copy --to local --from <the s3 cache> $path

# experience segfault
# 74861 segmentation fault  nix copy --to local /nix/store/rv559vmhs7751xizmfnxk5bwyjhfizpa-large-random
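To confirm that the whole download is being buffered rather than streamed, it can help to record the peak resident set size of the `nix copy` process while it runs. The helper below is a diagnostic sketch (not part of Nix); it assumes a POSIX shell with `ps -o rss=` available, as on Linux and macOS. If the download is buffered, the reported peak should approach the size of the store path.

```shell
# peak_rss CMD...: run CMD in the background and sample its resident set
# size once per second until it exits, then print the peak in kilobytes.
peak_rss() {
  "$@" &
  _pid=$!
  _peak=0
  while kill -0 "$_pid" 2>/dev/null; do
    # ps -o rss= prints the current RSS in KB with no header.
    _rss=$(ps -o rss= -p "$_pid" 2>/dev/null | tr -d ' ')
    if [ -n "$_rss" ] && [ "$_rss" -gt "$_peak" ]; then _peak=$_rss; fi
    sleep 1
  done
  wait "$_pid"
  echo "peak RSS: ${_peak} KB"
}

# Example usage against the failing copy:
# peak_rss nix copy --to local --from <the s3 cache> $path
```

For the 4 GB path above, a streaming implementation should report a peak well under the path size; a figure near 4,000,000 KB would corroborate the buffering.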

Expected behavior

Nix uses a fixed amount of memory and does not segfault.

Metadata

nix-env (Nix) 2.22.0

I have also experienced this with Nix 2.25.0.

Additional context

