Increase maximum retry count for deploy run controller #1450
Conversation
🦋 Changeset detected

Latest commit: 1b023f6

The changes in this PR will be included in the next version bump. This PR includes changesets to release 10 packages.
Walkthrough

The changes in this pull request increase the maximum retry count for deploy run controller operations in `packages/cli-v3/src/entryPoints/deploy-run-controller.ts`.
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (1)

`.changeset/poor-parrots-attend.md` (1)

Lines 1-5: Enhance the changeset description with more details.

While the current description captures the main change, it would be more helpful to include additional details about the changes:

```diff
 ---
 "trigger.dev": patch
 ---

-Increase max retry count for deploy run controller operations
+Increase max retry count for deploy run controller operations from 5 to 7
+
+- Add jitter and operation timeouts
+- Implement retry window of 2-6 minutes
+- Enhance error logging for failures
+- Improve task run process to prevent memory leaks
```
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📒 Files selected for processing (2)

- `.changeset/poor-parrots-attend.md` (1 hunks)
- `packages/cli-v3/src/entryPoints/deploy-run-controller.ts` (2 hunks)
🔇 Additional comments (2)

`packages/cli-v3/src/entryPoints/deploy-run-controller.ts` (2)

Line 421: LGTM: Enhanced retry mechanism for lazy attempts.

The increase in `maxRetries` from 3 to 7 with a minimum delay of 10ms provides better resilience for lazy attempt operations.

Let's verify the retry timing calculations:

```bash
#!/bin/bash
# Description: Check for other retry configurations in lazy attempt related code
# Test: Search for other lazy attempt retry configurations
rg -A 5 'readyForLazyAttempt|READY_FOR_LAZY_ATTEMPT'
```
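For context, a minimum-delay floor is usually applied after jitter so that a random draw near zero cannot trigger an effectively immediate retry. A minimal sketch of that interaction, assuming the clamp happens last (illustrative only, not the actual `ExponentialBackoff` internals):

```ts
// Full-jitter delay with a minimum floor. The 10ms floor mirrors the
// value noted in the review; baseMs and the clamp order are assumptions.
function jitteredDelayWithFloor(
  attempt: number, // 0-based retry attempt
  baseMs = 1_000,
  minDelayMs = 10
): number {
  const cap = baseMs * 2 ** attempt; // exponential growth per attempt
  const jittered = Math.random() * cap; // full jitter: uniform in [0, cap)
  return Math.max(minDelayMs, jittered); // never retry sooner than the floor
}
```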
Line 46: LGTM: Increased retry attempts for better reliability.

The increase in `maxRetries` from 5 to 7 with full jitter aligns with the PR objectives to implement longer retry durations.

Let's verify the retry configuration is consistent across the codebase:
✅ Verification successful

Let me gather more information to verify the retry configurations.

Retry configuration is consistent with the codebase patterns.

The increase in `maxRetries` from 5 to 7 in `defaultBackoff` is appropriate, as each usage of `ExponentialBackoff` in the codebase follows its own retry strategy suited to its use case:

- The default configuration (7 retries with FullJitter) for general operations
- Task replay with 3 retries and FullJitter
- Checkpointing with EqualJitter and time-based limits
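The two jitter strategies mentioned above differ in how much of the exponential delay they randomize: full jitter draws from the entire range, while equal jitter keeps half the delay fixed. A minimal sketch of the difference (illustrative only; not the trigger.dev `ExponentialBackoff` implementation, and the defaults are assumed):

```ts
// Compare full vs. equal jitter for exponential backoff.
// baseMs and maxMs are assumed defaults, not the codebase's constants.
type Jitter = "full" | "equal";

function backoffDelayMs(
  attempt: number, // 0-based retry attempt
  jitter: Jitter,
  baseMs = 1_000,
  maxMs = 60_000
): number {
  const exp = Math.min(maxMs, baseMs * 2 ** attempt);
  if (jitter === "full") {
    return Math.random() * exp; // anywhere in [0, exp)
  }
  return exp / 2 + Math.random() * (exp / 2); // half fixed, half random
}

// Example: sample the delays a 7-retry full-jitter schedule might produce.
for (let attempt = 0; attempt < 7; attempt++) {
  console.log(`attempt ${attempt}: ${backoffDelayMs(attempt, "full").toFixed(0)}ms`);
}
```

Equal jitter trades some decorrelation for a guaranteed lower bound, which suits time-limited operations like checkpointing; full jitter spreads retries more aggressively, which suits general operations.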
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

```bash
#!/bin/bash
# Description: Check for other ExponentialBackoff configurations that might need similar updates
# Test: Search for ExponentialBackoff configurations
rg -A 3 'new ExponentialBackoff\('
```

Length of output: 7717

Script:

```bash
#!/bin/bash
# Search for ExponentialBackoff configurations and their retry settings
rg -A 5 'new ExponentialBackoff\(' --type ts
```

Length of output: 1934
- @trigger.dev/build
- @trigger.dev/sdk
- @trigger.dev/react-hooks
- trigger.dev
- @trigger.dev/core
With jitter and operation timeouts, we should now retry for up to 2-6 minutes. This will improve resilience to any intermittent issues up and down the chain.
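As a rough sanity check of that window, the cumulative delay for 7 exponential retries can be estimated from the per-attempt caps. The base delay below is an assumption for illustration; the actual constants live in `deploy-run-controller.ts`:

```ts
// Estimate the total retry window for 7 retries with full jitter.
// baseMs is assumed; full jitter makes the expected delay half the cap.
const baseMs = 2_000;
const retries = 7;

let maxTotalMs = 0;
let expectedTotalMs = 0;
for (let attempt = 0; attempt < retries; attempt++) {
  const capMs = baseMs * 2 ** attempt;
  maxTotalMs += capMs; // worst case: jitter draws the full cap each time
  expectedTotalMs += capMs / 2; // mean of uniform(0, cap)
}

console.log(`expected total delay: ~${(expectedTotalMs / 60_000).toFixed(1)}m`); // ~2.1m
console.log(`max total delay: ~${(maxTotalMs / 60_000).toFixed(1)}m`); // ~4.2m
// With per-operation timeouts added on top, the practical window lands
// in roughly the 2-6 minute range described above.
```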
Summary by CodeRabbit

New Features

- Increased the maximum retry count for deploy run controller operations, with jitter and operation timeouts for a retry window of roughly 2-6 minutes.

Bug Fixes

- Enhanced error logging for failures.
- Improved the task run process to prevent memory leaks.