
Await on graphql.execute() before calling executionDidEnd handler (AS-59) #2298

Merged: 2 commits into master from chang/as-59-inaccurate-tracing-duration on Feb 12, 2019

Conversation

@cheapsteak (Member) commented Feb 11, 2019

extensionStack's executionDidEnd handler (the callback function returned from executionDidStart) was being called before graphql.execute finished running.

This PR adds an `await` before the `return` so that the code in the `finally` block runs after `graphql.execute` finishes.
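For illustration, a minimal sketch of the shape of the change, with a hypothetical `executionDidStart` standing in for the extension stack hook (not the exact apollo-server source):

```ts
import { execute as graphqlExecute, ExecutionArgs, ExecutionResult } from 'graphql';

// Hypothetical stand-in for the extension stack's hook; it returns
// the executionDidEnd callback.
declare function executionDidStart(): () => void;

async function executeWithExtensions(
  args: ExecutionArgs,
): Promise<ExecutionResult> {
  const executionDidEnd = executionDidStart();
  try {
    // Before the fix: `return graphqlExecute(args)` exited the try block
    // as soon as the still-pending promise was returned, so the `finally`
    // block (and executionDidEnd) ran before execution actually finished.
    return await graphqlExecute(args);
  } finally {
    // With the `await`, this runs only after graphqlExecute has settled.
    executionDidEnd();
  }
}
```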

TODO:

  • Update CHANGELOG.md with your change (include reference to issue & this PR)
  • Make sure all of the significant new logic is covered by tests
  • Rebase your changes on master so that they can be merged easily
  • Make sure all tests and linter rules pass

@cheapsteak cheapsteak changed the title Await on graphql.execute() before calling executionDidEnd handler Await on graphql.execute() before calling executionDidEnd handler (AS-59) Feb 11, 2019
@benjamn (Member) left a comment

Makes sense!

It's usually not necessary to `await` a promise before returning it from an `async` function, because it will just get rolled into the final promise, but in this case the timing of the `finally` block is what matters.
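A standalone illustration of that timing difference (not Apollo code):

```ts
function delay(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function withoutAwait(): Promise<string> {
  try {
    // The try block completes as soon as the pending promise is
    // returned, so `finally` runs before the work finishes.
    return delay(10).then(() => 'done');
  } finally {
    console.log('finally: runs before the work completes');
  }
}

async function withAwait(): Promise<string> {
  try {
    // `await` keeps the try block open until the promise settles,
    // so `finally` runs after the work finishes.
    return await delay(10).then(() => 'done');
  } finally {
    console.log('finally: runs after the work completes');
  }
}
```

The resolved value and any rejection propagate identically in both versions; only the moment the `finally` block observes is different.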

@abernix (Member) left a comment

Great find and LGTM.

As we discussed on our call: were this `execute` used anywhere other than this one call site, it might have been tempting to suggest wrapping the execution in a returned `Promise.resolve` with a `.then(onFulfilled, onRejected)` where both fulfillment and rejection called `executionDidEnd()`, preserving the async dynamics. Given the limited scope, though, this is great.
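For reference, a rough sketch of that alternative, reusing the hypothetical `executionDidStart` stand-in from the earlier sketch:

```ts
import { execute as graphqlExecute, ExecutionArgs, ExecutionResult } from 'graphql';

// Hypothetical stand-in for the extension stack's hook.
declare function executionDidStart(): () => void;

function executeTracked(args: ExecutionArgs): Promise<ExecutionResult> {
  const executionDidEnd = executionDidStart();
  return Promise.resolve(graphqlExecute(args)).then(
    (result) => {
      // Fulfillment path: end the timing before passing the result on.
      executionDidEnd();
      return result;
    },
    (error) => {
      // Rejection path: end the timing, then re-throw so callers still
      // observe the failure.
      executionDidEnd();
      throw error;
    },
  );
}
```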

Also as discussed, a test which ensures that the duration amounts in the traces mathematically match up would be welcomed either in this PR or as a follow-up (since we could use more of those in general). Your call!

@cheapsteak cheapsteak force-pushed the chang/as-59-inaccurate-tracing-duration branch from 82f7138 to 8b9af1c on February 12, 2019 18:27
@cheapsteak (Member, Author) commented Feb 12, 2019

Thanks for the reviews!
And I appreciate the leeway on whether to do the tests as part of this PR or separately. I think I'd prefer to split the addition of the tests into a separate PR (working on that now), as there might be quite a few things in the tests that could warrant further discussion, whereas the fix itself is fairly straightforward.

@abernix abernix merged commit bf12001 into master Feb 12, 2019
@abernix abernix deleted the chang/as-59-inaccurate-tracing-duration branch February 12, 2019 18:35
cheapsteak added three commits that referenced this pull request Feb 12, 2019
abernix pushed a commit that referenced this pull request Feb 13, 2019
* test: ensure "total duration" > "duration of resolvers"

Add test for #2298

* Update tracing duration test to be more comprehensive

Previously we were only guaranteeing that no individual resolver had a duration longer than the total duration. The updated test guarantees that the span from when the first resolver started to when the last resolver ended is less than the total duration (see the sketch after this list).

* Add another resolver to tracing test

* Prefer `const` over `let` for variables which aren't mutated.
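For context, a hedged sketch of the kind of consistency check that commit message describes, assuming an apollo-tracing-style result shape (a total `duration` plus per-resolver `startOffset`/`duration`, all in nanoseconds); the names here are illustrative, not the actual test code:

```ts
interface ResolverTiming {
  startOffset: number; // ns from the start of the request
  duration: number;    // ns spent in this resolver
}

interface TracingData {
  duration: number; // total traced duration of the request, ns
  execution: { resolvers: ResolverTiming[] };
}

function assertDurationsConsistent(tracing: TracingData): void {
  const resolvers = tracing.execution.resolvers;
  const firstStart = Math.min(...resolvers.map((r) => r.startOffset));
  const lastEnd = Math.max(
    ...resolvers.map((r) => r.startOffset + r.duration),
  );
  // The span from the first resolver's start to the last resolver's end
  // must fit inside the total traced duration.
  if (lastEnd - firstStart > tracing.duration) {
    throw new Error('Resolver span exceeds total traced duration');
  }
}
```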
@dragonfriend0013 commented

Glad you guys found and fixed this so quickly! I was scratching my head for hours on this before I reached out about the issue. Eagerly awaiting the new version for my own testing!!

@abernix (Member) commented Feb 15, 2019

@dragonfriend0013 This has been released in 2.4.2!
