Fixes #294. #321
Conversation
I've now signed the CLA ^.
can you run
No worries! That's done, and I also created #322 to add this info to
awesome! @thekevinbrown I have recently been running into some
No worries, I'm working off my own local copy anyway, so no stress there. If there's anything I can do to help, just let me know!
Hey @saihaj, just wanted to check in and see how the publish is going. Is there anything I can do to help?
I am waiting for @leebyron to update some settings in the GH org
@leebyron, any progress?
I'm not sure why the version deploy is broken, but I added @saihaj as an NPM publisher, so hopefully you can help debug
Yes, I received NPM access, thanks @leebyron. Will try to get a release out sometime this week @thekevinbrown
@saihaj, sorry to pester, just want to check in and see if there's anything I can do to help?
Let's use this issue to share updates, since there are a few things that we will roll out: #328 (comment)
Awesome, thanks for your work @saihaj! Much appreciated.
…e now that graphql/dataloader#321 was merged and released.
This PR fixes #294.

Note that it also amends the "max batch size respects cached results" test. A side effect of this change is that it slightly shifts the moment at which the first batch is resolved. DataLoader still respects cached results, and the load call in that test is still only invoked for request2, so I didn't think this was a problem.

Happy to do this a different / better way if anyone has any ideas, but this seemed the most straightforward approach to me.
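To illustrate the behavior under discussion, here is a minimal, self-contained sketch (not the real DataLoader implementation, and the `TinyLoader` name and internals are my own invention for illustration) of a loader that caches results and batches loads up to a maximum batch size. The key point from #294 is that a cache hit resolves from the cached promise and never joins a batch, so maxBatchSize only counts keys that actually need fetching:

```javascript
// Minimal sketch of batching with a memoization cache. Cached keys resolve
// from the cache and do not occupy a slot in the pending batch, so
// maxBatchSize only limits keys that genuinely need to be fetched.
class TinyLoader {
  constructor(batchFn, { maxBatchSize = Infinity } = {}) {
    this.batchFn = batchFn;       // (keys) => Promise<values>, same order as keys
    this.maxBatchSize = maxBatchSize;
    this.cache = new Map();       // key -> Promise<value>
    this.queue = [];              // pending { key, resolve, reject } jobs
  }

  load(key) {
    // Cache hit: return the memoized promise; no new batch entry is created.
    if (this.cache.has(key)) return this.cache.get(key);
    const promise = new Promise((resolve, reject) => {
      this.queue.push({ key, resolve, reject });
      if (this.queue.length === 1) {
        // First job in an empty queue: dispatch after the current tick so
        // that loads made in the same tick coalesce into one batch.
        process.nextTick(() => this.dispatch());
      } else if (this.queue.length >= this.maxBatchSize) {
        // Batch is full: flush immediately rather than waiting for the tick.
        this.dispatch();
      }
    });
    this.cache.set(key, promise);
    return promise;
  }

  dispatch() {
    const batch = this.queue.splice(0, this.maxBatchSize);
    if (batch.length === 0) return; // already flushed early; nothing to do
    this.batchFn(batch.map((job) => job.key)).then(
      (values) => batch.forEach((job, i) => job.resolve(values[i])),
      (err) => batch.forEach((job) => job.reject(err))
    );
  }
}
```

With `maxBatchSize: 2`, calling `load(1)`, `load(1)`, `load(2)` in one tick produces a single batch of `[1, 2]`: the repeated `load(1)` is a cache hit, so it neither fills the batch nor triggers an early flush on its own.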