# Concurrency Safety Issue in add_labels

**Severity:** CRITICAL
**Tool:** `add_labels`
**File:** `actions/setup/js/add_labels.cjs:30`
**Analysis Date:** 2026-02-11
**Pattern:** Appears in 28+ other tool handlers
## Summary
The add_labels tool handler uses a closure variable processedCount to track the number of processed items against a maxCount limit. This counter is shared mutable state that is not protected by any synchronization mechanism, creating a race condition when multiple concurrent tool invocations occur. The max limit can be bypassed, and the actual count can be undercounted due to lost updates.
## Issue Details

**Type:** Shared Mutable State / Race Condition
**Location:** `actions/setup/js/add_labels.cjs:30`

**Code Pattern:**
```javascript
async function main(config = {}) {
  const allowedLabels = config.allowed || [];
  const maxCount = config.max || 10;

  // ❌ UNSAFE: Mutable counter shared across all handler invocations
  let processedCount = 0;

  return async function handleAddLabels(message, resolvedTemporaryIds) {
    // ❌ RACE CONDITION: Check-then-act without synchronization
    if (processedCount >= maxCount) {
      core.warning(`Skipping add_labels: max count of ${maxCount} reached`);
      return { success: false, error: `Max count of ${maxCount} reached` };
    }

    // ❌ NON-ATOMIC: Increment not protected
    processedCount++;

    // ... rest of handler logic
  };
}
```

**Race Condition Scenario:**
- Thread A calls `handleAddLabels` at time T (`processedCount=5`)
- Thread B calls `handleAddLabels` at time T+1ms (`processedCount=5`)
- Both threads read `processedCount=5` and pass the `< 10` check
- Thread A increments: `processedCount=6`
- Thread B increments: `processedCount=6` ← Lost update! Should be 7
- Both operations succeed, but the count is incorrect
**Worse case - max limit bypass:**

```javascript
// Scenario: maxCount=10, processedCount=9
// T=0ms: Call 1 reads processedCount=9, check passes (9 < 10) ✓
// T=1ms: Call 2 reads processedCount=9, check passes (9 < 10) ✓
// T=2ms: Call 3 reads processedCount=9, check passes (9 < 10) ✓
// T=3ms: All 3 calls increment and succeed
// Result: 12 total operations when max was 10 ❌
```

## Detailed Analysis
### Root Cause
This is a classic check-then-act race condition combined with a lost update problem:

- **Check-then-act:** The pattern `if (processedCount >= maxCount)` followed by `processedCount++` is not atomic
- **Lost updates:** Multiple threads can read the same value before any increment occurs
- **No synchronization:** JavaScript's single-threaded event loop doesn't help here because:
  - Each handler invocation is an `async` function
  - The `await` points (e.g., GitHub API calls) create async boundaries
  - Multiple handlers can be "in flight" concurrently
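The async-boundary interleaving described above can be reproduced with a minimal, self-contained sketch (all names here are hypothetical; the `setTimeout` stands in for an awaited GitHub API call):

```javascript
// Handler factory with the same shared-closure-counter shape as the report.
function makeHandler(maxCount) {
  let processedCount = 0; // shared mutable closure state
  return async function handle() {
    // Check-then-act: both concurrent calls read the same stale value
    if (processedCount >= maxCount) return { success: false };
    // Simulated async boundary (e.g. a GitHub API call) between check and act
    await new Promise((resolve) => setTimeout(resolve, 10));
    processedCount++;
    return { success: true };
  };
}

async function demo() {
  const handler = makeHandler(1);
  // Both calls pass the check before either increments
  const results = await Promise.all([handler(), handler()]);
  const succeeded = results.filter((r) => r.success).length;
  console.log(succeeded); // 2 — the max of 1 was bypassed
  return succeeded;
}

demo();
```

Running this under Node prints `2`, demonstrating that a limit of 1 admits two in-flight calls once an `await` sits between the check and the increment.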
### Concurrent Execution Example
```javascript
// With maxCount=10, processedCount=8:
// Concurrent Call Timeline:
// ========================
// T=0ms:  Call A starts, reads processedCount=8, check passes
// T=1ms:  Call B starts, reads processedCount=8, check passes
// T=2ms:  Call C starts, reads processedCount=8, check passes
// T=10ms: Call A increments: processedCount=9
// T=11ms: Call B increments: processedCount=9 (lost update!)
// T=12ms: Call C increments: processedCount=9 (lost update!)
//
// Expected: processedCount=10 (Call C should have been rejected)
// Actual:   processedCount=9 (2 lost updates, all 3 calls succeeded)
```
### Impact Assessment
- **Data Integrity**: The `processedCount` does not accurately reflect the number of operations
- **Security**: Max limits can be bypassed, potentially allowing resource exhaustion
- **Reliability**: Non-deterministic behavior depending on timing of concurrent calls
- **Widespread**: This pattern appears in **28+ tool handlers** (see grep output below)
### Affected Files
All these handlers use the same `processedCount` pattern:
```
add_labels.cjs                    link_sub_issue.cjs
add_comment.cjs                   missing_data.cjs
add_reviewer.cjs                  missing_tool.cjs
assign_milestone.cjs              noop_handler.cjs
assign_to_user.cjs                push_to_pull_request_branch.cjs
autofix_code_scanning_alert.cjs   remove_labels.cjs
close_discussion.cjs              update_project.cjs
close_issue.cjs                   create_code_scanning_alert.cjs
close_pull_request.cjs            create_discussion.cjs
create_issue.cjs                  create_missing_data_issue.cjs
create_missing_tool_issue.cjs     create_pr_review_comment.cjs
create_project.cjs                create_project_status_update.cjs
create_pull_request.cjs           dispatch_workflow.cjs
hide_comment.cjs                  mark_pull_request_as_ready_for_review.cjs
```

## Recommended Fix
**Approach:** Redesign to use per-invocation state tracking instead of shared mutable counters.

### Option 1: Pass context with operation count (RECOMMENDED)
```javascript
// ✅ SAFE: Track count in the calling context, not in handler closure
async function main(config = {}) {
  const allowedLabels = config.allowed || [];
  const maxCount = config.max || 10;

  core.info(`Add labels configuration: max=${maxCount}`);
  if (allowedLabels.length > 0) {
    core.info(`Allowed labels: ${allowedLabels.join(", ")}`);
  }

  // No mutable closure variable!

  /**
   * Message handler function that processes a single add_labels message
   * @param {Object} message - The add_labels message to process
   * @param {Object} resolvedTemporaryIds - Map of temporary IDs to {repo, number}
   * @param {Object} context - Execution context with operation tracking
   * @returns {Promise<Object>} Result with success/error status
   */
  return async function handleAddLabels(message, resolvedTemporaryIds, context = {}) {
    // Check if we've hit the max limit using context counter
    const currentCount = context.operationCounts?.add_labels || 0;
    if (currentCount >= maxCount) {
      core.warning(`Skipping add_labels: max count of ${maxCount} reached`);
      return {
        success: false,
        error: `Max count of ${maxCount} reached`,
      };
    }

    // Increment in the shared context (caller's responsibility to synchronize)
    if (context.operationCounts) {
      context.operationCounts.add_labels = currentCount + 1;
    }

    // ... rest of handler logic (unchanged)
  };
}
```

**Explanation:**
- The count is tracked in a `context` object passed by the caller
- The caller orchestrates all handler invocations and manages synchronization
- Each handler is stateless - no shared mutable closure variables
- The caller can use sequential processing or proper locking if concurrent processing is needed
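A minimal sketch of what the caller side could look like under Option 1 (the `runHandlers` name and shape are assumptions for illustration, not the project's actual orchestrator):

```javascript
// Hypothetical orchestrator: owns the shared context and awaits each handler
// before starting the next, so no synchronization primitive is needed.
async function runHandlers(messages, handlers, resolvedTemporaryIds) {
  const context = { operationCounts: {} };
  const results = [];
  for (const message of messages) {
    const handler = handlers[message.type];
    if (!handler) continue; // unknown message type, skip
    // Sequential awaiting: at most one handler is in flight at a time
    results.push(await handler(message, resolvedTemporaryIds, context));
  }
  return results;
}
```

Because each handler is awaited before the next starts, the `operationCounts` reads and writes can never interleave, and the per-type limits hold without locks.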
### Option 2: Pre-validate message count before processing
```javascript
// ✅ SAFE: Count messages upfront, reject excess before processing
async function processAllMessages(messages, handlers, config, resolvedTemporaryIds) {
  // Group messages by type
  const messagesByType = {};
  for (const msg of messages) {
    const type = msg.type || "unknown";
    if (!messagesByType[type]) {
      messagesByType[type] = [];
    }
    messagesByType[type].push(msg);
  }

  // Validate counts against limits BEFORE processing
  for (const [type, msgs] of Object.entries(messagesByType)) {
    const maxCount = config[type]?.max || 10;
    if (msgs.length > maxCount) {
      core.warning(`Too many ${type} messages: ${msgs.length} > ${maxCount}`);
      // Truncate or reject excess messages
      messagesByType[type] = msgs.slice(0, maxCount);
    }
  }

  // Process messages sequentially (or with proper synchronization)
  const results = [];
  for (const [type, msgs] of Object.entries(messagesByType)) {
    const handler = handlers[type];
    for (const msg of msgs) {
      const result = await handler(msg, resolvedTemporaryIds);
      results.push(result);
    }
  }
  return results;
}
```

**Explanation:**
- Count all messages by type upfront
- Enforce limits before any processing begins
- No need for runtime counters - the limit is enforced statically
- Handlers become completely stateless
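The grouping-and-truncation step can be illustrated in isolation (a standalone sketch with assumed names, separated from handler dispatch so it is easy to test):

```javascript
// Group messages by type and truncate each group to its configured max
// before any handler runs. Unconfigured types default to a max of 10.
function enforceLimits(messages, config) {
  const byType = {};
  for (const msg of messages) {
    const type = msg.type || "unknown";
    (byType[type] = byType[type] || []).push(msg);
  }
  for (const [type, msgs] of Object.entries(byType)) {
    const maxCount = config[type]?.max ?? 10;
    if (msgs.length > maxCount) {
      // Keep only the first maxCount messages of this type
      byType[type] = msgs.slice(0, maxCount);
    }
  }
  return byType;
}
```

Since the limit is applied to a plain in-memory array before any async work starts, there is no window in which concurrent calls can observe a stale count.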
## Alternative Solutions

### Option 3: Use atomic counters (Node.js Atomics)
```javascript
// Uses SharedArrayBuffer and Atomics for true atomic operations
const sab = new SharedArrayBuffer(4);
const counter = new Int32Array(sab);

async function main(config = {}) {
  const maxCount = config.max || 10;

  return async function handleAddLabels(message, resolvedTemporaryIds) {
    // Atomic fetch-and-increment (returns the previous value)
    const currentCount = Atomics.add(counter, 0, 1);
    if (currentCount >= maxCount) {
      core.warning(`Skipping add_labels: max count of ${maxCount} reached`);
      return { success: false, error: `Max count of ${maxCount} reached` };
    }

    // ... rest of handler logic
  };
}
```

**Pros:**
- True atomic operations
- Minimal code changes

**Cons:**

- Requires SharedArrayBuffer (may not be available in all environments)
- Adds complexity for a simple counter
- The counter is incremented even for rejected calls, so it no longer reflects the number of successful operations
### Option 4: Mutex/Lock Pattern
```javascript
const { Mutex } = require("async-mutex");

const mutex = new Mutex();

async function main(config = {}) {
  const maxCount = config.max || 10;
  let processedCount = 0;

  return async function handleAddLabels(message, resolvedTemporaryIds) {
    return await mutex.runExclusive(async () => {
      if (processedCount >= maxCount) {
        core.warning(`Skipping add_labels: max count of ${maxCount} reached`);
        return { success: false, error: `Max count of ${maxCount} reached` };
      }
      processedCount++;
      // ... rest of handler logic
    });
  };
}
```

**Pros:**
- Guarantees mutual exclusion
- Prevents all race conditions

**Cons:**

- Requires external dependency (`async-mutex`)
- Serializes ALL handler calls (performance impact)
- Overkill for a simple counter
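If the dependency is the main objection, a few lines of promise chaining give the same mutual exclusion without `async-mutex` (a sketch under the same assumptions, not a drop-in replacement for the project's handlers):

```javascript
// Minimal promise-chain mutex: tasks run one at a time, in submission order.
function createMutex() {
  let tail = Promise.resolve();
  return function runExclusive(task) {
    // Start this task only after every previously submitted task has settled
    const result = tail.then(() => task());
    // Keep the chain alive even if the task rejects
    tail = result.catch(() => {});
    return result;
  };
}
```

Each `runExclusive(task)` call appends to the chain, so a check-then-increment wrapped in it is serialized exactly as with `async-mutex`, and it shares the same performance trade-off.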
## Recommended Implementation Steps

1. Choose **Option 1** (context-based tracking) as the cleanest solution
2. Modify the handler factory pattern to accept and update a context object
3. Update the message processor to create and pass the context
4. Apply the same fix to all 28+ affected handlers
5. Add concurrency tests to verify the fix
## Testing Strategy
```javascript
// Test concurrent execution
describe("add_labels concurrency safety", () => {
  test("handles concurrent calls without race conditions", async () => {
    const config = { max: 10 };
    const handler = await main(config);

    // Create shared context for tracking
    const context = { operationCounts: {} };

    // Launch 15 concurrent calls (max is 10)
    const promises = Array(15)
      .fill(0)
      .map((_, i) =>
        handler({ labels: [`label-${i}`], item_number: 1 }, {}, context)
      );
    const results = await Promise.all(promises);

    // Verify exactly 10 succeeded and 5 failed
    const succeeded = results.filter((r) => r.success).length;
    const failed = results.filter((r) => !r.success).length;
    expect(succeeded).toBe(10);
    expect(failed).toBe(5);
    expect(context.operationCounts.add_labels).toBe(10);
  });

  test("sequential execution respects max limit", async () => {
    const config = { max: 5 };
    const handler = await main(config);
    const context = { operationCounts: {} };

    // Process 10 items sequentially
    const results = [];
    for (let i = 0; i < 10; i++) {
      const result = await handler(
        { labels: [`label-${i}`], item_number: 1 },
        {},
        context
      );
      results.push(result);
    }

    // First 5 succeed, last 5 fail
    expect(results.slice(0, 5).every((r) => r.success)).toBe(true);
    expect(results.slice(5).every((r) => !r.success)).toBe(true);
  });
});
```

## References
- JavaScript Concurrency: [MDN - Concurrency model and Event Loop](developer.mozilla.org/redacted)
- Async/Await Pitfalls: Node.js Best Practices - Async Patterns
- Race Condition Patterns: [OWASP - Race Conditions](owasp.org/redacted)
**Priority:** P0-Critical
**Effort:** Large (requires updating 28+ handler files)
**Expected Impact:** Prevents race conditions, ensures accurate max limit enforcement, improves data integrity across all safe-output handlers

> AI generated by Daily MCP Tool Concurrency Analysis. Expires on Feb 18, 2026, 9:23 AM UTC.