failed: I/O exception during sandboxed execution: No such file or directory #22151
Comments
@Ryang20718 Could you please provide sample code and complete steps to reproduce this issue?
I don't have a reliable repro; it just periodically happens when running large numbers of tests. We're in the process of upgrading to Bazel 7, but we still need to upgrade some dependencies to get there.
To provide another data point: I have hit a similar error message. It is reproducible when the same cache is used, but not consistent across executions. For example, the remote execution passed fine. That run used Bazel 7.1.1.
Does everyone affected by this NOT have dynamic execution enabled?
No dynamic execution was enabled.
No dynamic execution (this is local execution with a remote cache).
Aha, the remote cache bit is interesting too. @mattyclarkson, did you have a remote cache enabled too?
The "remote" build passed which was using remote execution and remote cache. The "local" build failed witch was running locally on the GitLab runner instance and was using a disk cache (which is stored/restored from the GitLab runner S3 bucket). |
Adding some details here: we've seen this same error in the following situations:
Originally I had thought it was a system error, but the second bullet point indicates otherwise (we also had plenty of inodes and disk storage).
@oquenchil While I don't understand exactly why this is happening, it looks like the failing read is of the spawn's execution statistics file, which can be missing.
Agree with Fabian's diagnosis here. The most consistent theme from the reports in Slack was the same missing-file error during stats collection. Currently, on the Java side, we are catching IOException here. Since stats collection should be a non-critical feature, it should be done on a best-effort basis. The fix should be to tolerate a missing statistics file rather than failing the spawn.
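To make the proposed direction concrete, here is a minimal, hypothetical Java sketch of best-effort statistics collection; the class and method names are invented for illustration and are not Bazel's actual code:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Optional;

/**
 * Hypothetical illustration (not Bazel's actual classes): read the spawn
 * wrapper's statistics file on a best-effort basis instead of failing the
 * whole spawn when the file cannot be read.
 */
final class BestEffortStats {

  /** Returns the raw statistics bytes, or empty if the file cannot be read. */
  static Optional<byte[]> readStatisticsFile(Path statisticsPath) {
    try {
      return Optional.of(Files.readAllBytes(statisticsPath));
    } catch (IOException e) {
      // The spawn wrapper may have exited abnormally and never written the
      // file; treat missing statistics as non-fatal and carry on.
      return Optional.empty();
    }
  }
}
```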
This was already the case for "local" spawns. Statistics may be missing if the spawn wrapper exits abnormally. Fixes bazelbuild#22151. Closes bazelbuild#22780. PiperOrigin-RevId: 644378541 Change-Id: Ia3d792f380b78945523f21875c593744b60f0c81
(#22790) This was already the case for "local" spawns. Statistics may be missing if the spawn wrapper exits abnormally. Fixes #22151. Closes #22780. PiperOrigin-RevId: 644378541 Change-Id: Ia3d792f380b78945523f21875c593744b60f0c81 Commit ec41dd1 Co-authored-by: Fabian Meumertzheim <fabian@meumertzhe.im>
(#22791) This was already the case for "local" spawns. Statistics may be missing if the spawn wrapper exits abnormally. Fixes #22151. Closes #22780. PiperOrigin-RevId: 644378541 Change-Id: Ia3d792f380b78945523f21875c593744b60f0c81 Commit ec41dd1 Co-authored-by: Fabian Meumertzheim <fabian@meumertzhe.im>
A fix for this issue has been included in Bazel 7.2.1 RC2. Please test out the release candidate and report any issues as soon as possible.
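For anyone wanting to try the candidate: if you manage Bazel through Bazelisk, pinning an RC usually looks like the snippet below (the exact 7.2.1rc2 version string is an assumption about the RC's naming):

```sh
# Pin the release candidate for a single invocation via Bazelisk
USE_BAZEL_VERSION=7.2.1rc2 bazel info release

# ...or pin it for the whole workspace
echo "7.2.1rc2" > .bazelversion
```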
I've pinned the release candidate.
Hit an issue on a CI run.
@mattyclarkson That's a different type of bug, as it's not about the missing statistics file.
Description of the bug:
Periodically, the following error occurs when running tests:
failed: I/O exception during sandboxed execution: No such file or directory
We're on Bazel 6.5.0 with spawn strategy linux-sandbox, jobs set 1:1 with vCPUs, and the sandbox mounted at /dev/shm.
Whenever this occurs, we see system memory usage at 82-83% with CPU maxed at 100%.
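For reference, a .bazelrc sketch approximating the setup described above; the --jobs value is a placeholder and should match the machine's vCPU count:

```
# Illustrative .bazelrc approximating the reported setup
build --spawn_strategy=linux-sandbox
build --jobs=16                # placeholder: set to the machine's vCPU count
build --sandbox_base=/dev/shm  # keep sandboxes on tmpfs
```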
Which category does this issue belong to?
No response
What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
I don't have a reliable repro; it happens sporadically.
Which operating system are you running Bazel on?
Ubuntu 20.04
What is the output of bazel info release?
release 6.5.0
If bazel info release returns development version or (@non-git), tell us how you built Bazel.
No response
What's the output of git remote get-url origin; git rev-parse HEAD?
No response
Is this a regression? If yes, please try to identify the Bazel commit where the bug was introduced.
This has been occurring more frequently since we switched to 6.5.0 from 6.3.2.
Have you found anything relevant by searching the web?
No response
Any other information, logs, or outputs that you want to share?
No response