chore: Measure block building times, history processing times, and db sizes #2733

Merged: 7 commits, merged on Oct 11, 2023
23 changes: 18 additions & 5 deletions .circleci/config.yml
@@ -955,9 +955,20 @@ jobs:
name: "Benchmark"
command: cond_run_script end-to-end ./scripts/run_tests_local benchmarks/bench_publish_rollup.test.ts
environment:
{
DEBUG: "aztec:benchmarks:*,aztec:sequencer,aztec:world_state,aztec:merkle_trees",
}
DEBUG: "aztec:benchmarks:*,aztec:sequencer,aztec:sequencer:*,aztec:world_state,aztec:merkle_trees"

bench-process-history:
machine:
image: ubuntu-2204:2023.07.2
resource_class: large
steps:
- *checkout
- *setup_env
- run:
name: "Benchmark"
command: cond_run_script end-to-end ./scripts/run_tests_local benchmarks/bench_process_history.test.ts
environment:
DEBUG: "aztec:benchmarks:*,aztec:sequencer,aztec:sequencer:*,aztec:world_state,aztec:merkle_trees"

build-docs:
machine:
@@ -1307,6 +1318,7 @@ workflows:
- guides-sample-dapp: *e2e_test
- guides-up-quick-start: *e2e_test
- bench-publish-rollup: *e2e_test
- bench-process-history: *e2e_test

- e2e-end:
requires:
@@ -1344,15 +1356,16 @@ workflows:
- guides-dapp-testing
- guides-sample-dapp
- guides-up-quick-start
- bench-publish-rollup
<<: *defaults

- bench-summary:
requires:
- e2e-end
- bench-publish-rollup
- bench-process-history
<<: *defaults

# Deployment and Canary tests
# Deployment and Canary tests
- deploy-dockerhub:
requires:
- e2e-end
62 changes: 57 additions & 5 deletions scripts/ci/aggregate_e2e_benchmark.js
@@ -27,14 +27,26 @@ const {
NOTE_SUCCESSFUL_DECRYPTING_TIME,
NOTE_TRIAL_DECRYPTING_TIME,
NOTE_PROCESSOR_CAUGHT_UP,
L2_BLOCK_BUILT,
L2_BLOCK_BUILD_TIME,
L2_BLOCK_ROLLUP_SIMULATION_TIME,
L2_BLOCK_PUBLIC_TX_PROCESS_TIME,
NODE_HISTORY_SYNC_TIME,
NODE_SYNCED_CHAIN,
NOTE_HISTORY_TRIAL_DECRYPTING_TIME,
NOTE_HISTORY_SUCCESSFUL_DECRYPTING_TIME,
PXE_DB_SIZE,
ROLLUP_SIZES,
CHAIN_LENGTHS,
BENCHMARK_FILE_JSON,
BLOCK_SIZE,
NODE_DB_SIZE,
} = require("./benchmark_shared.js");

// Folder where to load logs from
const logsDir = process.env.LOGS_DIR ?? `log`;

// Appends a datapoint to the final results for the given metric in the given bucket
// Appends a data point to the final results for the given metric in the given bucket
function append(results, metric, bucket, value) {
if (value === undefined) {
console.error(`Undefined value for ${metric} in bucket ${bucket}`);
@@ -79,13 +91,49 @@ function processCircuitSimulation(entry, results) {
}

// Processes an entry with event name 'note-processor-caught-up' and updates results
// Buckets are rollup sizes
// Buckets are rollup sizes for NOTE_DECRYPTING_TIME, or chain sizes for NOTE_HISTORY_DECRYPTING_TIME
function processNoteProcessorCaughtUp(entry, results) {
const { seen, decrypted } = entry;
const { seen, decrypted, blocks, duration, dbSize } = entry;
if (ROLLUP_SIZES.includes(decrypted))
append(results, NOTE_SUCCESSFUL_DECRYPTING_TIME, decrypted, entry.duration);
append(results, NOTE_SUCCESSFUL_DECRYPTING_TIME, decrypted, duration);
if (ROLLUP_SIZES.includes(seen) && decrypted === 0)
append(results, NOTE_TRIAL_DECRYPTING_TIME, seen, entry.duration);
append(results, NOTE_TRIAL_DECRYPTING_TIME, seen, duration);
if (CHAIN_LENGTHS.includes(blocks) && decrypted > 0) {
append(results, NOTE_HISTORY_SUCCESSFUL_DECRYPTING_TIME, blocks, duration);
append(results, PXE_DB_SIZE, blocks, dbSize);
}
if (CHAIN_LENGTHS.includes(blocks) && decrypted === 0)
append(results, NOTE_HISTORY_TRIAL_DECRYPTING_TIME, blocks, duration);
}

// Processes an entry with event name 'l2-block-built' and updates results
// Buckets are rollup sizes
function processL2BlockBuilt(entry, results) {
const bucket = entry.txCount;
if (!ROLLUP_SIZES.includes(bucket)) return;
append(results, L2_BLOCK_BUILD_TIME, bucket, entry.duration);
append(
results,
L2_BLOCK_ROLLUP_SIMULATION_TIME,
bucket,
entry.rollupCircuitsDuration
);
append(
results,
L2_BLOCK_PUBLIC_TX_PROCESS_TIME,
bucket,
entry.publicProcessDuration
);
}

// Processes entries with event name node-synced-chain-history emitted by benchmark tests
// Buckets are chain lengths
function processNodeSyncedChain(entry, results) {
const bucket = entry.blockCount;
if (!CHAIN_LENGTHS.includes(bucket)) return;
if (entry.txsPerBlock !== BLOCK_SIZE) return;
append(results, NODE_HISTORY_SYNC_TIME, bucket, entry.duration);
append(results, NODE_DB_SIZE, bucket, entry.dbSize);
}

// Processes a parsed entry from a logfile and updates results
@@ -99,6 +147,10 @@ function processEntry(entry, results) {
return processCircuitSimulation(entry, results);
case NOTE_PROCESSOR_CAUGHT_UP:
return processNoteProcessorCaughtUp(entry, results);
case L2_BLOCK_BUILT:
return processL2BlockBuilt(entry, results);
case NODE_SYNCED_CHAIN:
return processNodeSyncedChain(entry, results);
default:
return;
}
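The processing functions added above all funnel into the same metric/bucket accumulator. A minimal sketch of that pattern (simplified; the `append` body is truncated in this diff, so the array-accumulation shown here is an assumption based on how the averages are later computed):

```javascript
// Sketch of the aggregator's accumulation pattern: datapoints for the
// same metric and bucket are collected into an array so a later step
// can average them.
function append(results, metric, bucket, value) {
  if (value === undefined) {
    console.error(`Undefined value for ${metric} in bucket ${bucket}`);
    return;
  }
  if (!results[metric]) results[metric] = {};
  if (!results[metric][bucket]) results[metric][bucket] = [];
  results[metric][bucket].push(Number(value));
}

const results = {};
append(results, "l2_block_building_time_in_ms", 8, 1200);
append(results, "l2_block_building_time_in_ms", 8, 1400);
console.log(results["l2_block_building_time_in_ms"][8]); // [1200, 1400]
```

Note how `processL2BlockBuilt` and `processNodeSyncedChain` guard on `ROLLUP_SIZES` / `CHAIN_LENGTHS` membership before calling `append`, so stray entries from non-benchmark runs are dropped.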
28 changes: 25 additions & 3 deletions scripts/ci/benchmark_shared.js
@@ -1,8 +1,16 @@
// Rollup sizes to track (duplicated from yarn-project/end-to-end/src/benchmarks/bench_publish_rollup.test.ts)
// Block sizes to track (duplicated from yarn-project/end-to-end/src/benchmarks/bench_publish_rollup.test.ts)
const ROLLUP_SIZES = process.env.ROLLUP_SIZES
? process.env.ROLLUP_SIZES.split(",").map(Number)
: [8, 32, 128];

// Block size to use for building chains of multiple length (duplicated from yarn-project/end-to-end/src/benchmarks/bench_process_history.test.ts)
const BLOCK_SIZE = process.env.BLOCK_SIZE ? +process.env.BLOCK_SIZE : 16;

// Chain lengths to test (duplicated from yarn-project/end-to-end/src/benchmarks/bench_process_history.test.ts)
const CHAIN_LENGTHS = process.env.CHAIN_LENGTHS
? process.env.CHAIN_LENGTHS.split(",").map(Number)
: [10, 20, 30];

// Output files
const BENCHMARK_FILE_JSON = process.env.BENCHMARK_FILE_JSON ?? "benchmark.json";

@@ -15,14 +23,28 @@ module.exports = {
CIRCUIT_SIMULATION_TIME: "circuit_simulation_time_in_ms",
CIRCUIT_INPUT_SIZE: "circuit_input_size_in_bytes",
CIRCUIT_OUTPUT_SIZE: "circuit_output_size_in_bytes",
NOTE_SUCCESSFUL_DECRYPTING_TIME: "note_successful_decrypting_time",
NOTE_TRIAL_DECRYPTING_TIME: "note_unsuccessful_decrypting_time",
NOTE_SUCCESSFUL_DECRYPTING_TIME: "note_successful_decrypting_time_in_ms",
NOTE_TRIAL_DECRYPTING_TIME: "note_trial_decrypting_time_in_ms",
L2_BLOCK_BUILD_TIME: "l2_block_building_time_in_ms",
L2_BLOCK_ROLLUP_SIMULATION_TIME: "l2_block_rollup_simulation_time_in_ms",
L2_BLOCK_PUBLIC_TX_PROCESS_TIME: "l2_block_public_tx_process_time_in_ms",
NODE_HISTORY_SYNC_TIME: "node_history_sync_time_in_ms",
NOTE_HISTORY_SUCCESSFUL_DECRYPTING_TIME:
"note_history_successful_decrypting_time_in_ms",
NOTE_HISTORY_TRIAL_DECRYPTING_TIME:
"note_history_trial_decrypting_time_in_ms",
NODE_DB_SIZE: "node_database_size_in_bytes",
PXE_DB_SIZE: "pxe_database_size_in_bytes",
// Events to track
L2_BLOCK_PUBLISHED_TO_L1: "rollup-published-to-l1",
L2_BLOCK_SYNCED: "l2-block-handled",
L2_BLOCK_BUILT: "l2-block-built",
CIRCUIT_SIMULATED: "circuit-simulation",
NOTE_PROCESSOR_CAUGHT_UP: "note-processor-caught-up",
NODE_SYNCED_CHAIN: "node-synced-chain-history",
// Other
ROLLUP_SIZES,
BLOCK_SIZE,
CHAIN_LENGTHS,
BENCHMARK_FILE_JSON,
};
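The new `BLOCK_SIZE` and `CHAIN_LENGTHS` constants follow the same env-var-with-default pattern already used by `ROLLUP_SIZES`. A standalone sketch of that pattern:

```javascript
// Sketch of the override pattern in benchmark_shared.js: a
// comma-separated env var is parsed into numbers, falling back to a
// hardcoded default list when the variable is unset.
function parseSizes(envValue, defaults) {
  return envValue ? envValue.split(",").map(Number) : defaults;
}

console.log(parseSizes(undefined, [8, 32, 128])); // [8, 32, 128]
console.log(parseSizes("4,16", [8, 32, 128])); // [4, 16]
```

This keeps the CI scripts configurable without editing code, at the cost of the noted duplication with the defaults in the end-to-end test files.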
134 changes: 118 additions & 16 deletions scripts/ci/comment_e2e_benchmark.js
@@ -5,34 +5,136 @@
const https = require("https");
const fs = require("fs");

const GITHUB_TOKEN = process.env.GITHUB_TOKEN;
const GITHUB_TOKEN = process.env.AZTEC_BOT_COMMENTER_GITHUB_TOKEN;
const OWNER = "AztecProtocol";
const REPO = "aztec3-packages";
const COMMENT_MARK = "<!-- AUTOGENERATED BENCHMARK COMMENT -->";

const { ROLLUP_SIZES, BENCHMARK_FILE_JSON } = require("./benchmark_shared.js");
const {
ROLLUP_SIZES,
BLOCK_SIZE,
BENCHMARK_FILE_JSON,
L1_ROLLUP_CALLDATA_SIZE_IN_BYTES,
L1_ROLLUP_CALLDATA_GAS,
L1_ROLLUP_EXECUTION_GAS,
L2_BLOCK_PROCESSING_TIME,
CIRCUIT_SIMULATION_TIME,
CIRCUIT_INPUT_SIZE,
CIRCUIT_OUTPUT_SIZE,
NOTE_SUCCESSFUL_DECRYPTING_TIME,
NOTE_TRIAL_DECRYPTING_TIME,
L2_BLOCK_BUILD_TIME,
L2_BLOCK_ROLLUP_SIMULATION_TIME,
L2_BLOCK_PUBLIC_TX_PROCESS_TIME,
NODE_HISTORY_SYNC_TIME,
NOTE_HISTORY_SUCCESSFUL_DECRYPTING_TIME,
NOTE_HISTORY_TRIAL_DECRYPTING_TIME,
NODE_DB_SIZE,
PXE_DB_SIZE,
} = require("./benchmark_shared.js");

const METRICS_GROUPED_BY_ROLLUP_SIZE = [
L1_ROLLUP_CALLDATA_SIZE_IN_BYTES,
L1_ROLLUP_CALLDATA_GAS,
L1_ROLLUP_EXECUTION_GAS,
L2_BLOCK_PROCESSING_TIME,
NOTE_SUCCESSFUL_DECRYPTING_TIME,
NOTE_TRIAL_DECRYPTING_TIME,
L2_BLOCK_BUILD_TIME,
L2_BLOCK_ROLLUP_SIMULATION_TIME,
L2_BLOCK_PUBLIC_TX_PROCESS_TIME,
];

const METRICS_GROUPED_BY_CHAIN_LENGTH = [
NODE_HISTORY_SYNC_TIME,
NOTE_HISTORY_SUCCESSFUL_DECRYPTING_TIME,
NOTE_HISTORY_TRIAL_DECRYPTING_TIME,
NODE_DB_SIZE,
PXE_DB_SIZE,
];

const METRICS_GROUPED_BY_CIRCUIT_NAME = [
CIRCUIT_SIMULATION_TIME,
CIRCUIT_INPUT_SIZE,
CIRCUIT_OUTPUT_SIZE,
];

function formatValue(value) {
return value;
}

// Returns the md content to post
function getContent() {
const benchmark = JSON.parse(fs.readFileSync(BENCHMARK_FILE_JSON, "utf-8"));
delete benchmark.timestamp;
function transpose(obj) {
const transposed = {};
for (const outerKey in obj) {
const innerObj = obj[outerKey];
for (const innerKey in innerObj) {
if (!transposed[innerKey]) transposed[innerKey] = {};
transposed[innerKey][outerKey] = innerObj[innerKey];
}
}
return transposed;
}

function pick(benchmark, keys) {
const result = {};
for (const key of keys) {
result[key] = benchmark[key];
}
return result;
}

const sizes = ROLLUP_SIZES;
const header = `| Metric | ${sizes.map((i) => `${i} txs`).join(" | ")} |`;
const separator = `| - | ${sizes.map(() => "-").join(" | ")} |`;
const rows = Object.keys(benchmark).map((key) => {
function getTableContent(benchmark, groupUnit = "", col1Title = "Metric") {
const rowKeys = Object.keys(benchmark);
const groups = [
...new Set(rowKeys.flatMap((key) => Object.keys(benchmark[key]))),
];
console.log(groups);
const header = `| ${col1Title} | ${groups
.map((i) => `${i} ${groupUnit}`)
.join(" | ")} |`;
const separator = `| - | ${groups.map(() => "-").join(" | ")} |`;
const rows = rowKeys.map((key) => {
const metric = benchmark[key];
return `${key} | ${sizes.map((i) => metric[i]).join(" | ")} |`;
return `${key} | ${groups
.map((i) => formatValue(metric[i]))
.join(" | ")} |`;
});

return `
## Benchmark results

### Rollup published to L1

${header}
${separator}
${rows.join("\n")}
`;
}

// Returns the md content to post
function getPostContent() {
const benchmark = JSON.parse(fs.readFileSync(BENCHMARK_FILE_JSON, "utf-8"));
delete benchmark.timestamp;

return `
## Benchmark results

All benchmarks are run on txs on the \`Benchmarking\` contract on the repository. Each tx consists of a batch call to \`create_note\` and \`increment_balance\`, which guarantees that each tx has a private call, a nested private call, a public call, and a nested public call, as well as an emitted private note, an unencrypted log, and public storage read and write.

### L2 block published to L1

Each column represents the number of txs on an L2 block published to L1.
${getTableContent(pick(benchmark, METRICS_GROUPED_BY_ROLLUP_SIZE), "txs")}

### L2 chain processing

Each column represents the number of blocks on the L2 chain where each block has ${BLOCK_SIZE} txs.
${getTableContent(pick(benchmark, METRICS_GROUPED_BY_CHAIN_LENGTH), "blocks")}

### Circuits stats

Stats on running time and I/O sizes collected for every circuit run across all benchmarks.
${getTableContent(
transpose(pick(benchmark, METRICS_GROUPED_BY_CIRCUIT_NAME)),
"",
"Circuit"
)}

${COMMENT_MARK}
`;
@@ -61,7 +163,7 @@ async function getExistingComment() {
// Function to create or update a comment
async function upsertComment(existingCommentId) {
try {
const commentContent = getContent();
const commentContent = getPostContent();
const commentData = { body: commentContent };

const requestMethod = existingCommentId ? "PATCH" : "POST";
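The `transpose` helper added above flips the `{ metric: { circuit: value } }` aggregate so that circuits become table rows and metrics become columns. A self-contained sketch (circuit names and values are illustrative, not taken from real benchmark output):

```javascript
// Sketch of the transpose used for the circuits table: swaps the outer
// (metric) and inner (circuit) keys of a nested object.
function transpose(obj) {
  const transposed = {};
  for (const outerKey in obj) {
    for (const innerKey in obj[outerKey]) {
      if (!transposed[innerKey]) transposed[innerKey] = {};
      transposed[innerKey][outerKey] = obj[outerKey][innerKey];
    }
  }
  return transposed;
}

// Hypothetical aggregated data, keyed metric-first as the aggregator emits it.
const byMetric = {
  circuit_simulation_time_in_ms: { private_kernel: 400, rollup_base: 900 },
  circuit_input_size_in_bytes: { private_kernel: 57344 },
};
console.log(transpose(byMetric).private_kernel);
// { circuit_simulation_time_in_ms: 400, circuit_input_size_in_bytes: 57344 }
```

Combined with `pick`, this lets one `getTableContent` routine render all three report sections from differently shaped slices of the same benchmark JSON.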
3 changes: 3 additions & 0 deletions yarn-project/circuits.js/src/structs/complete_address.ts
@@ -27,6 +27,9 @@ export class CompleteAddress {
public partialAddress: PartialAddress,
) {}

/** Size in bytes of an instance */
static readonly SIZE_IN_BYTES = 32 * 4;

static async create(
address: AztecAddress,
publicKey: PublicKey,
1 change: 1 addition & 0 deletions yarn-project/end-to-end/.gitignore
@@ -1,2 +1,3 @@
addresses.json
/log
/data
1 change: 1 addition & 0 deletions yarn-project/end-to-end/package.json
@@ -50,6 +50,7 @@
"@types/lodash.zipwith": "^4.2.7",
"@types/memdown": "^3.0.3",
"@types/node": "^18.7.23",
"glob": "^10.3.10",
"jest": "^29.5.0",
"koa": "^2.14.2",
"koa-static": "^5.0.0",