When running jobs that have huge provenance files, the CLI function _nix_hashes receives too many arguments and breaks due to ARG_MAX limitations (the exact limit depends on the OS).

Here is where the bug happens: makes/src/cli/main/cli.py, lines 329 to 340 at commit 2244b2b. Specifically, the failure is in the *paths argument provided to the shell (line 334).

Let's make _nix_hashes receive bytes from stdin and then call nix-store through xargs so it does not hit the ARG_MAX limit.

I found this bug by running m gitlab:fluidattacks/universe@trunk /docs/generate/criteria, as it currently depends on several hundred derivations for building the entire criteria section of the website.
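A minimal sketch of that change, assuming _nix_hashes currently builds a single nix-store --query --hash command line from *paths (the function name comes from cli.py; the surrounding code here is illustrative, not the actual makes implementation):

```python
import subprocess


def _nix_hashes(stdin: bytes) -> list[str]:
    # xargs (from findutils) reads newline-separated store paths from
    # stdin and batches them into as many nix-store invocations as
    # needed, so no single command line exceeds ARG_MAX.
    result = subprocess.run(
        ["xargs", "nix-store", "--query", "--hash"],
        input=stdin,
        stdout=subprocess.PIPE,
        check=True,
    )
    # One hash per input path, e.g. "sha256:..." lines.
    return result.stdout.decode().splitlines()
```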
dsalaza4 added a commit to dsalaza4/makes that referenced this issue on Aug 10, 2023:
- Make _nix_hashes receive bytes so it is compatible with stdin
- Make _nix_hashes process provided paths using xargs to avoid hitting ARG_MAX limit
- Add findutils to runtime so xargs is available
Signed-off-by: Daniel Salazar <podany270895@gmail.com>
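Under the same assumptions as the sketch above, call sites would stop expanding paths into arguments and instead pipe them in as newline-separated bytes (the store paths below are made up for illustration):

```python
# Hypothetical call site: a huge provenance file can yield hundreds of
# store paths, which previously overflowed ARG_MAX when passed as argv.
store_paths = [
    "/nix/store/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa-foo",
    "/nix/store/bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb-bar",
]
hashes = _nix_hashes("\n".join(store_paths).encode())
```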