
feat: retry requests by default #104

Merged
1 commit merged into master on Mar 29, 2019
Conversation

JustinBeckwith (Contributor) commented on Mar 23, 2019

BREAKING CHANGE: This change enables HTTP retries by default. The retry logic matches the defaults for gaxios:

{
  // The amount of time to initially delay the retry, in milliseconds.
  retryDelay: 100,

  // The HTTP methods that will be automatically retried.
  httpMethodsToRetry: ['GET', 'PUT', 'HEAD', 'OPTIONS', 'DELETE'],

  // The HTTP response status code ranges that will automatically be retried.
  statusCodesToRetry: [[100, 199], [429, 429], [500, 599]],
}

The behavior can be disabled by setting retry to false in the request config.

Fixes googleapis/google-api-nodejs-client#482.
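
For concreteness, here is a minimal sketch of opting out per request, assuming gaxios is called directly; the URL below is illustrative and not part of this change:

import {request} from 'gaxios';

async function fetchWithoutRetries() {
  // Setting retry to false turns off the automatic retry behavior for this
  // single request, overriding the new default described above.
  const res = await request({
    url: 'https://www.googleapis.com/drive/v3/about', // illustrative endpoint
    retry: false,
  });
  return res.data;
}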

googlebot added the cla: yes (This human has signed the Contributor License Agreement.) label on Mar 23, 2019
JustinBeckwith requested review from bcoe and jkwlui on March 23, 2019 04:21
codecov bot commented Mar 23, 2019

Codecov Report

Merging #104 into master will increase coverage by 0.23%.
The diff coverage is 0%.


@@           Coverage Diff            @@
##           master   #104      +/-   ##
========================================
+ Coverage   76.76%    77%   +0.23%     
========================================
  Files           2      2              
  Lines          99    100       +1     
  Branches       19     20       +1     
========================================
+ Hits           76     77       +1     
  Misses         16     16              
  Partials        7      7
Impacted Files Coverage Δ
src/apirequest.ts 76.53% <0%> (+0.24%) ⬆️


JustinBeckwith requested a review from a team on March 29, 2019 03:56
bcoe (Contributor) left a comment:

LGTM with two caveats:

  • I used to use restify with retry = true in conjunction with nock; it made tests really hard to write -- your unit tests will potentially be happily retrying operations over and over instead of exercising the failure scenario.
    • it might be worth disabling this behavior by default in tests, unless you explicitly write a test around it (see the sketch after this comment).
  • it would also be nice if a warning were output to the console when a retry happens; otherwise I've seen situations where the CLI feels hung (when it's actually happily retrying a broken API endpoint over and over again).

All this said, I love that the npm CLI retries failed operations; it ultimately feels like a much more resilient API.
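
To illustrate the testing caveat above, here is a minimal sketch assuming mocha, nock, and gaxios; the hostname and path are hypothetical. Only one 500 response is mocked, so the test disables retries to keep the failure deterministic.

import assert from 'assert';
import nock from 'nock';
import {request, GaxiosError} from 'gaxios';

it('surfaces a 500 instead of silently retrying it', async () => {
  // Mock exactly one failing response; with retries left on, follow-up
  // attempts would not hit this interceptor and the test becomes flaky.
  const scope = nock('https://example.googleapis.com').get('/v1/widgets').reply(500);

  try {
    await request({url: 'https://example.googleapis.com/v1/widgets', retry: false});
    assert.fail('expected the request to reject');
  } catch (err) {
    assert.strictEqual((err as GaxiosError).response?.status, 500);
  }
  scope.done();
});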

JustinBeckwith merged commit 1defd62 into master on Mar 29, 2019
stephenplusplus deleted the retry branch on March 29, 2019 18:28
mattcobb commented:

This does not address 403 "... Limit Exceeded ..." retries. Is there a way to get at the gaxios retryConfig from the googleapis client?

JustinBeckwith (Contributor, Author) commented:

Greetings, could I trouble you to open a new issue so we don't lose track of this?

GSDan commented on Jul 7, 2020

Hi - curious if there's a reason why 503 codes aren't being retried?

JustinBeckwith (Contributor, Author) commented:

It just doesn't by default 🤷 At some point we should look into generalized Retry-After header support. You can configure the list of retryable status codes, though: you can pass in your own GaxiosOptions, which includes a retry config.
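
For example, here is a minimal sketch of such a config, again assuming gaxios is called directly; the extra 403 range, the warning callback, and the URL are illustrative choices, not library defaults:

import {request, GaxiosError} from 'gaxios';

async function fetchWithCustomRetries() {
  return request({
    url: 'https://example.googleapis.com/v1/widgets', // illustrative endpoint
    retryConfig: {
      retry: 3,
      retryDelay: 100,
      // Extend the default ranges so 403 rate-limit responses are retried too.
      statusCodesToRetry: [[100, 199], [403, 403], [429, 429], [500, 599]],
      // Surface a warning on each retry so the process does not look hung.
      onRetryAttempt: (err: GaxiosError) => {
        console.warn(`retrying after HTTP ${err.response?.status ?? 'no response'}`);
      },
    },
  });
}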

Development

Successfully merging this pull request may close these issues.

Implement exponential backoff when executing request
5 participants