Does not handle retries or backoff for Gitlab.com API endpoints #630
Comments
@bleggett I'm not against adding this, but I think it would be good to first describe the approach you want to take before spending too much time on it. I recently implemented something similar for another Go client, so I have some thoughts about how I would like to add support for this.
@svanharmelen Sure. Since the response headers for 429 Too Many Requests aren't really standardized (the ones above are Gitlab-specific), I was just going to do something simple.
If you've got something planned or can just port over something smarter, that would be preferable.
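The exact snippet bleggett had in mind isn't preserved in the thread. Purely as an illustration of this kind of "simple" handling, a minimal sketch might look like the following, assuming a hypothetical `retry429` helper wrapped around any go-gitlab call and using the `github.com/xanzy/go-gitlab` import path current at the time of this thread:

```go
package example

import (
	"net/http"
	"time"

	gitlab "github.com/xanzy/go-gitlab"
)

// retry429 is a hypothetical helper (not part of go-gitlab, and not the
// snippet referred to above): it runs call up to maxAttempts times, sleeping
// with a doubling delay whenever GitLab answers 429 Too Many Requests, and
// returns any other error immediately.
func retry429(maxAttempts int, call func() (*gitlab.Response, error)) error {
	delay := time.Second
	var err error
	for attempt := 0; attempt < maxAttempts; attempt++ {
		var resp *gitlab.Response
		resp, err = call()
		if err == nil || resp == nil || resp.StatusCode != http.StatusTooManyRequests {
			return err // success, or an error that is not a rate limit
		}
		time.Sleep(delay) // back off before trying again
		delay *= 2
	}
	return err
}
```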
Please note that some usage patterns require limiting the sleep time, e.g. the user specifies a hard maximum amount of time which the call may take. We had similar problems with the GitHub API, and we wrapped around the rate limit error by operating on a Context which can be cancelled.
This is what I mean:

```go
// Inside a retry loop in a function returning (string, string, error);
// `continue` re-issues the search on the next iteration.
result, _, err := m.client.Search.Users(ctx, email+" in:email", searchOpts)
if err != nil {
	if rl, ok := err.(*github.RateLimitError); ok {
		logrus.Warnf("rate limit was hit, waiting until %s", rl.Rate.Reset)
		select {
		case <-time.After(rl.Rate.Reset.Sub(time.Now().UTC())):
			// rate limit window has reset; try again
		case <-ctx.Done():
			return "", "", context.Canceled
		}
		continue
	}
	return "", "", err
}
```

This works perfectly for GitHub. IDK much about GitLab, but I guess it works similarly.
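A hedged adaptation of the same pattern to go-gitlab might look like the sketch below. It assumes a hypothetical `callWithRetry` helper (not part of go-gitlab) and relies on `gitlab.Response` embedding `*http.Response`, so the 429 status and the Retry-After header can be read directly; the fallback wait duration is arbitrary.

```go
package example

import (
	"context"
	"net/http"
	"strconv"
	"time"

	gitlab "github.com/xanzy/go-gitlab"
)

// callWithRetry is a hypothetical helper: on a 429 it waits for the
// Retry-After duration (or a fallback) before retrying, and it stops early
// if the caller's context is cancelled or its deadline passes.
func callWithRetry(ctx context.Context, call func() (*gitlab.Response, error)) error {
	for {
		resp, err := call()
		if err == nil {
			return nil
		}
		if resp == nil || resp.StatusCode != http.StatusTooManyRequests {
			return err // not a rate-limit problem; surface it
		}
		wait := 30 * time.Second // fallback when the header is missing or unparsable
		if s := resp.Header.Get("Retry-After"); s != "" {
			if secs, perr := strconv.Atoi(s); perr == nil {
				wait = time.Duration(secs) * time.Second
			}
		}
		select {
		case <-time.After(wait):
			// window should have reset; loop and retry
		case <-ctx.Done():
			return ctx.Err()
		}
	}
}
```

A caller that needs a hard upper bound on total elapsed time, as described above, can pass in a context created with context.WithTimeout so the wait is cut short when the deadline expires.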
I have a solution for this that I can port from another project I've built recently. I just need to find a little bit of time to do so, but hopefully I can add this in the coming week or so.
I'm getting 429s while using the Terraform GitLab provider, which uses go-gitlab.
@bleggett and @vmarkovtsev I just created PR #800 in order to solve this. Would be great if you could confirm that the PR doesn't break anything and fixes the issue... Thanks!
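For reference only (this is not a reproduction of PR #800, whose contents aren't shown here), one common way Go API clients gain retry/backoff is to route requests through hashicorp/go-retryablehttp and hand the resulting *http.Client to the GitLab client. A sketch, assuming the pre-v2 gitlab.NewClient(httpClient, token) signature that was current when this issue was open:

```go
package example

import (
	"time"

	retryablehttp "github.com/hashicorp/go-retryablehttp"
	gitlab "github.com/xanzy/go-gitlab"
)

// newRetryingClient is a sketch, not the code from PR #800: it builds a
// go-gitlab client whose underlying transport transparently retries with
// backoff via hashicorp/go-retryablehttp. Depending on the library version,
// a custom CheckRetry may be needed to also treat 429 as retryable.
func newRetryingClient(token string) *gitlab.Client {
	rc := retryablehttp.NewClient()
	rc.RetryMax = 5 // arbitrary example values
	rc.RetryWaitMax = 30 * time.Second

	// StandardClient wraps the retrying client in a plain *http.Client,
	// which the (pre-v2) gitlab.NewClient signature accepts directly.
	return gitlab.NewClient(rc.StandardClient(), token)
}
```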
For those of us using Gitlab SaaS, we're quite frequently hitting the Gitlab rate limits. Gitlab returns "standard" rate limit retry headers in its 429 Too Many Requests responses.

It would be ideal if go-gitlab properly interpreted these and implemented standard backoff/retry logic for Gitlab requests, as node-gitlab does. Obviously this can be reimplemented every time in the surrounding code that invokes go-gitlab, but I think logic this basic belongs in this library.

I am going to start work on a PR for this; however, if there is disagreement about whether this functionality belongs in go-gitlab upstream or not, I'd like to hear it.