feat: update rebuild script #7225

Merged — 3 commits, merged Jun 28, 2024

Changes from 2 commits
barretenberg/acir_tests/reset_acir_tests.sh — 5 changes: 3 additions & 2 deletions

```diff
@@ -1,7 +1,8 @@
-cd ~/aztec-packages/noir/noir-repo
+# Run from barretenberg/acir_tests
+cd ../../noir/noir-repo
 cargo clean
 noirup -p .
 cd test_programs && ./rebuild.sh

-cd ~/aztec-packages/barretenberg/acir_tests
+cd ../../../barretenberg/acir_tests
 rm -rf acir_tests
```
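The diff above swaps hard-coded absolute paths (`~/aztec-packages/...`) for paths relative to `barretenberg/acir_tests`, so the script now assumes it is launched from that directory. As a side note, a common alternative (not what the PR does, just a sketch) is to resolve paths relative to the script file itself, so the working directory no longer matters:

```shell
#!/usr/bin/env bash
# Hypothetical alternative (not in the PR): resolve sibling directories
# relative to the script's own location rather than the caller's $PWD.
script_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
noir_repo="$script_dir/../../noir/noir-repo"
echo "noir repo dir: $noir_repo"
```

This keeps the script runnable from anywhere, at the cost of a slightly noisier first line.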
noir/noir-repo/test_programs/rebuild.sh — 39 changes: 36 additions & 3 deletions

```diff
@@ -1,6 +1,8 @@
 #!/usr/bin/env bash
 set -e

+NO_PARALLEL=${1:-}
+
 process_dir() {
   local dir=$1
   local current_dir=$2
```
```diff
@@ -60,7 +62,38 @@ for dir in $current_dir/benchmarks/*; do
   dirs_to_process+=("$dir")
 done

+pids=()     # Array to hold PIDs of background processes
+dirs_map=() # Array to map PIDs to directories
+
+if [ -z $NO_PARALLEL ]; then
+  # Process directories in parallel
+  for dir in "${dirs_to_process[@]}"; do
+    process_dir "$dir" "$current_dir" & # Run process_dir in the background
+    pid=$!                              # Get PID of the last background command
+    pids+=($pid)                        # Add PID to the pids array
+    dirs_map[$pid]=$dir                 # Map PID to the directory being processed
+  done
+else
+  # Process directories sequentially
+  for dir in "${dirs_to_process[@]}"; do
+    process_dir "$dir" "$current_dir" # Run process_dir in the foreground
+    pid=$!                            # Get PID of the last command
+    pids+=($pid)                      # Add PID to the pids array
+    dirs_map[$pid]=$dir               # Map PID to the directory being processed
+  done
+fi
+
+# Check the exit status of each background job.
+for pid in "${pids[@]}"; do
+  if ! wait $pid; then # Wait for the process to complete, check if it failed
+    echo "Rebuild failed for directory: ${dirs_map[$pid]}" # Print failed directory
+    exit_status=$? # Capture the failed exit status
```
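One subtlety in the sequential branch above (an observation, not something raised in the review thread): `$!` expands to the PID of the most recent *background* job, so after a foreground `process_dir` call it is not updated and still holds the previous background PID, if any. A minimal sketch of that behavior:

```shell
#!/usr/bin/env bash
# $! tracks only background jobs: a foreground command does not update it,
# so a sequential loop that reads $! after each call records stale PIDs.
sleep 0.1 &          # background job sets $!
bg_pid=$!
true                 # foreground command: $! is NOT updated
after_fg=$!
if [ "$bg_pid" = "$after_fg" ]; then
  echo "stale: \$! still holds the old background PID"
fi
wait
```

In the sequential path the failure check is effectively handled by `set -e` anyway, since a failing foreground `process_dir` aborts the script immediately.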
vezenovm (Contributor), Jun 27, 2024 — on the `exit_status=$?` line:
Won't this just get the last exit status? Should we just exit here? This is a nit though, and otherwise the updates look good.

Author (Contributor):
Oh yeah, I think I did correct this, but I must've copy-pasted the wrong version.

Author (Contributor):
Also, `! wait $pid` should only be true if the exit status failed, so it's fine. We shouldn't just exit here, since I wanted to print all the failing ones.

This actually gets the exit status of the echo statement, so I need to move this line above.

Contributor:
It looks like we now loop over all jobs and the failed pids. I feel we could still print failed jobs while looping over all the jobs, but what you did keeps the script simple, and ideally we shouldn't have that many failed jobs.

```diff
+  fi
+done

-parallel -j0 process_dir {} "$current_dir" ::: ${dirs_to_process[@]}
-
-echo "Rebuild Succeeded!"
+# Exit with a failure status if any job failed.
+if [ ! -z "$exit_status" ]; then
+  echo "Rebuild failed!"
+  exit $exit_status
+fi
+echo "Rebuild Succeeded!"
```
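The fix the author describes (moving the capture above the `echo`) comes down to how `$?` works: it always holds the status of the most recent command, so reading it after the `echo` records the echo's (almost always zero) status and loses the failure. A minimal sketch, using `false` as a stand-in for a failing job:

```shell
#!/usr/bin/env bash
# $? holds the status of the MOST RECENT command, so capture order matters.
false                   # stand-in for a failing job (exit status 1)
echo "job failed"       # this echo succeeds...
after_echo=$?           # ...so $? is now 0 and the failure is lost

false
captured=$?             # capture BEFORE printing: status 1 is preserved
echo "job failed with status $captured"
```

Capturing first and printing second is what lets the script report every failing directory and still exit with a meaningful status at the end.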