fix #311: RetryingChannel propagates cancelation #368

Merged
carterkozak merged 3 commits into ckozak/fix_retry from ckozak/fix_retry_pt2 on Feb 18, 2020

Conversation

carterkozak (Contributor):

After this PR

==COMMIT_MSG==
RetryingChannel propagates cancelation
==COMMIT_MSG==

Generally safer by using existing utility functionality to chain
futures instead of reimplementing async-transformation via
callbacks.
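
For concreteness, a minimal sketch of the two styles (illustrative only, not this PR's diff; Channel, Endpoint, Request, and Response are the dialogue interfaces that appear in the excerpts below, with their imports omitted). With the callback style, cancelling the returned future never reaches the in-flight call, so the callback has to detect the race and close the response itself; with Guava's Futures.catchingAsync, the returned future forwards cancellation to its input.

    import com.google.common.util.concurrent.FutureCallback;
    import com.google.common.util.concurrent.Futures;
    import com.google.common.util.concurrent.ListenableFuture;
    import com.google.common.util.concurrent.MoreExecutors;
    import com.google.common.util.concurrent.SettableFuture;

    final class CancellationStyles {

        // Callback style: the caller only sees 'result', and cancelling 'result' does
        // not cancel 'call'. If a response arrives after cancellation, set() returns
        // false and the response must be closed manually to avoid leaking it.
        static ListenableFuture<Response> callbackStyle(ListenableFuture<Response> call) {
            SettableFuture<Response> result = SettableFuture.create();
            Futures.addCallback(call, new FutureCallback<Response>() {
                @Override
                public void onSuccess(Response response) {
                    if (!result.set(response)) {
                        response.close();
                    }
                }

                @Override
                public void onFailure(Throwable throwable) {
                    result.setException(throwable);
                }
            }, MoreExecutors.directExecutor());
            return result;
        }

        // Chained style: Futures.catchingAsync returns a future that forwards
        // cancellation to 'call', so the in-flight request is released and no
        // orphaned Response can be delivered to an already-cancelled future.
        static ListenableFuture<Response> chainedStyle(ListenableFuture<Response> call) {
            return Futures.catchingAsync(
                    call,
                    Throwable.class,
                    Futures::immediateFailedFuture,
                    MoreExecutors.directExecutor());
        }

        private CancellationStyles() {}
    }
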
@@ -46,61 +44,59 @@

    @Override
    public ListenableFuture<Response> execute(Endpoint endpoint, Request request) {
        SettableFuture<Response> future = SettableFuture.create();

        IntFunction<ListenableFuture<Response>> callSupplier = attempt -> {
carterkozak (Contributor Author):

Replaced this abstraction by passing the channel, endpoint, and request into the retryer. We still have the attempt index available for instrumentation (RetryingCallback.failures), but I find it's easier to spot bugs without the indirection.
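
Roughly the shape being described, as an illustrative sketch (the class and field names, the maxRetries limit, and the recursion are assumptions, not the actual diff):

    import com.google.common.util.concurrent.Futures;
    import com.google.common.util.concurrent.ListenableFuture;
    import com.google.common.util.concurrent.MoreExecutors;

    // Channel, Endpoint, Request, and Response are the dialogue interfaces; imports omitted.
    final class RetryingCallbackSketch {
        private final Channel delegate;
        private final Endpoint endpoint;
        private final Request request;
        private final int maxRetries;
        private int failures = 0;

        RetryingCallbackSketch(Channel delegate, Endpoint endpoint, Request request, int maxRetries) {
            this.delegate = delegate;
            this.endpoint = endpoint;
            this.request = request;
            this.maxRetries = maxRetries;
        }

        ListenableFuture<Response> execute() {
            // Each attempt is chained with catchingAsync, so cancelling the returned
            // future propagates back to whichever attempt is currently in flight.
            return Futures.catchingAsync(
                    delegate.execute(endpoint, request),
                    Throwable.class,
                    this::retry,
                    MoreExecutors.directExecutor());
        }

        private ListenableFuture<Response> retry(Throwable throwable) {
            failures++;
            if (failures > maxRetries) {
                return Futures.immediateFailedFuture(throwable);
            }
            return execute();
        }
    }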

                boolean setSuccessfully = delegate.set(response);
                if (!setSuccessfully) {
                    response.close();
                }
carterkozak (Contributor Author):

No longer have to worry about this case because cancel is propagated through the future chain.
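
A hypothetical snippet (not a test from this PR) showing the behavior relied on here: cancelling the chained future reaches the in-flight delegate call, so no Response is ever handed to an already-cancelled future.

    static void cancellationPropagates(Channel delegate, Endpoint endpoint, Request request) {
        ListenableFuture<Response> call = delegate.execute(endpoint, request);
        ListenableFuture<Response> wrapped = Futures.catchingAsync(
                call, Throwable.class, Futures::immediateFailedFuture, MoreExecutors.directExecutor());
        wrapped.cancel(false);
        // Guava forwards the cancellation to the input future.
        assert call.isCancelled();
    }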

            }
            return Futures.immediateFailedFuture(
                    throwable.orElseGet(() -> new SafeRuntimeException("Retries exhausted")));
Reviewer (Contributor):

It might be nice to know what caused us to retry

carterkozak (Contributor Author):

Good call, might as well add some helpful info logging prior to each retry while I'm here.
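
For illustration, extending the hypothetical retry() sketch above (SLF4J and the message wording are assumptions; the PR's actual logging may differ):

    private static final org.slf4j.Logger log =
            org.slf4j.LoggerFactory.getLogger(RetryingChannel.class);

    private ListenableFuture<Response> retry(Throwable throwable) {
        failures++;
        if (failures > maxRetries) {
            return Futures.immediateFailedFuture(throwable);
        }
        // Record what caused this attempt to fail before scheduling the next one.
        log.info("Retrying call after failure: attempt {} of {}", failures, maxRetries, throwable);
        return execute();
    }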

    }

    private ListenableFuture<Response> wrap(ListenableFuture<Response> input) {
        return Futures.catchingAsync(
Reviewer (Contributor):

this is really neat!

carterkozak merged commit fbdb33f into ckozak/fix_retry on Feb 18, 2020
carterkozak deleted the ckozak/fix_retry_pt2 branch on February 18, 2020 at 20:46