
updated aw #756

Merged

lpcox merged 1 commit into main from lpcox/fix-large-file-test on Feb 6, 2026

Conversation

@lpcox (Collaborator) commented on Feb 6, 2026

No description provided.

Copilot AI review requested due to automatic review settings February 6, 2026 17:53
@lpcox lpcox merged commit b844442 into main Feb 6, 2026
@lpcox lpcox deleted the lpcox/fix-large-file-test branch February 6, 2026 17:54
Copilot AI (Contributor) left a comment


Pull request overview

Updates the “Large Payload Tester” agentic workflow to allow container network access and adds validation that the generated test secret is present in the created files.

Changes:

  • Allow the containers network group in the workflow frontmatter and propagate it to the compiled lock file (see the sketch after this list).
  • Add grep checks intended to validate the secret is present in both the large JSON payload and the secret file.
  • Expand the firewall/allowed-domain configuration in the lock workflow to include Docker/container registries and related domains.
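
Based on that description, the frontmatter change presumably looks something like the sketch below. This is an assumption: the actual frontmatter diff isn't shown on this page, and the `containers` identifier is taken from the bullet above rather than verified against gh-aw's network configuration docs.

```yaml
---
# Hypothetical sketch only: assumes gh-aw accepts a "containers" network
# group that the compiler expands into Docker/container-registry domains
# in the generated lock file.
network:
  allowed:
    - defaults
    - containers
---
```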

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| .github/workflows/large-payload-tester.md | Adds containers network access and new grep-based secret-presence checks during environment setup. |
| .github/workflows/large-payload-tester.lock.yml | Regenerates the compiled workflow to reflect frontmatter changes; expands allowed domains and includes the new grep checks. |


Comment on lines 105 to +108:

```diff
  echo "Large file stored in: $TEST_FS/$LARGE_PAYLOAD_FILE"
+ grep -H $TEST_SECRET $TEST_FS/$LARGE_PAYLOAD_FILE
  echo "Secret stored in $TEST_FS/$SECRET_FILE"
+ grep -H $TEST_SECRET $TEST_FS/$SECRET_FILE
```

Copilot AI Feb 6, 2026


The added `grep -H $TEST_SECRET ...` lines will print the full secret and (for the large JSON) potentially thousands of matching lines into the workflow logs, which can bloat logs and even cause failures or timeouts. Prefer a non-verbose assertion (e.g., `grep -Fq`/`grep -c` with a small, bounded output) and quote variables and paths (`"$TEST_SECRET"`, `"$TEST_FS/$LARGE_PAYLOAD_FILE"`) to avoid word-splitting and regex interpretation.
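
A minimal sketch of the suggested quiet check, reusing the workflow's own variable names (the error messages and `exit` handling here are illustrative, not part of the PR):

```shell
# Assert the secret is present without echoing it or the matching lines:
# -F treats the pattern as a fixed string, -q suppresses all output.
grep -Fq "$TEST_SECRET" "$TEST_FS/$LARGE_PAYLOAD_FILE" \
  || { echo "ERROR: secret not found in large payload" >&2; exit 1; }
grep -Fq "$TEST_SECRET" "$TEST_FS/$SECRET_FILE" \
  || { echo "ERROR: secret not found in secret file" >&2; exit 1; }

# If a visible confirmation is wanted, log a bounded count instead of the matches:
echo "Matching lines in payload: $(grep -Fc "$TEST_SECRET" "$TEST_FS/$LARGE_PAYLOAD_FILE")"
```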

The corresponding step in the compiled lock file, shown with the escaped one-line `run:` string expanded for readability. The two `grep -H` lines at the end are the additions in this PR; the pre-change version of the same string (identical except for those two lines) is omitted here.

```yaml
        run: bash /opt/gh-aw/actions/create_gh_aw_tmp_dir.sh
      - name: Setup Test Environment
        run: |
          TEST_FS="/tmp/mcp-test-fs"
          SECRET_FILE="secret.txt"
          LARGE_PAYLOAD_FILE="large-test-file.json"
          # Create test data directory (payload directory will be created by gateway on-demand)
          mkdir -p $TEST_FS

          # Generate a unique secret for this test run
          # Use uuidgen if available, otherwise use timestamp with nanoseconds for better entropy
          if command -v uuidgen >/dev/null 2>&1; then
            TEST_SECRET="test-secret-$(uuidgen)"
          else
            TEST_SECRET="test-secret-$(date +%s%N)-$$"
          fi
          echo $TEST_SECRET > $TEST_FS/$SECRET_FILE
          # Create a large test file (~500KB) with the secret embedded in JSON
          # This file will be read by the filesystem MCP server, causing a large payload
          cat > $TEST_FS/$LARGE_PAYLOAD_FILE <<'EOF'
          {
            "test_run_id": "PLACEHOLDER_RUN_ID",
            "test_timestamp": "PLACEHOLDER_TIMESTAMP",
            "purpose": "Testing large MCP payload storage and retrieval",
            "data": {
              "large_array": [],
              "metadata": {
                "generated_by": "large-payload-tester workflow",
                "repository": "PLACEHOLDER_REPO",
                "workflow_run_url": "PLACEHOLDER_URL"
              }
            },
            "padding": ""
          }
          EOF

          # Use jq to properly populate the JSON with dynamic values and generate large array
          # Generating 2000 items + 400KB padding to create ~500KB file
          jq --arg secret "$TEST_SECRET" \
            --arg run_id "${{ github.run_id }}" \
            --arg timestamp "$(date -Iseconds)" \
            --arg repo "${{ github.repository }}" \
            --arg url "${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}" \
            '.test_run_id = $run_id |
             .test_timestamp = $timestamp |
             .data.metadata.repository = $repo |
             .data.metadata.workflow_run_url = $url |
             .data.large_array = [range(2000) | {id: ., value: ("item-" + tostring), secret_reference: $secret, extra_data: ("data-" + tostring + "-" * 50)}] |
             .padding = ("X" * 400000)' \
            $TEST_FS/$LARGE_PAYLOAD_FILE > $TEST_FS/$LARGE_PAYLOAD_FILE.tmp

          mv $TEST_FS/$LARGE_PAYLOAD_FILE.tmp $TEST_FS/$LARGE_PAYLOAD_FILE

          # Verify file was created and is large enough
          FILE_SIZE=$(wc -c < $TEST_FS/$LARGE_PAYLOAD_FILE)
          echo "Created $LARGE_PAYLOAD_FILE with size: $FILE_SIZE bytes (~$(($FILE_SIZE / 1024))KB)"
          if [ "$FILE_SIZE" -lt 512000 ]; then
            echo "WARNING: Test file is smaller than expected ($FILE_SIZE bytes < 500KB)"
            echo "Continuing with test anyway..."
          fi

          echo "Test environment setup complete"
          echo "Large file stored in: $TEST_FS/$LARGE_PAYLOAD_FILE"
          grep -H $TEST_SECRET $TEST_FS/$LARGE_PAYLOAD_FILE
          echo "Secret stored in $TEST_FS/$SECRET_FILE"
          grep -H $TEST_SECRET $TEST_FS/$SECRET_FILE
```

Copilot AI Feb 6, 2026


In the generated workflow, the new `grep -H $TEST_SECRET ...` commands are embedded in the `run:` string and will dump the secret and a large amount of matching JSON into the logs (the secret appears in each array element). This can significantly increase log volume and risks hitting log limits. Consider changing the source `.md` to use a quiet check (`grep -Fq`/`grep -c` plus bounded output) and proper quoting so the compiled lock file stays safe.

