
Adds Variable Batching Proposal #307

Open · wants to merge 4 commits into main
Conversation

michaelstaib (Member)

Within the Composite Schema WG we have discussed a new batching format that is primarily intended for subgraphs/source schemas in a federated graph. Variable batching allows the distributed GraphQL executor to execute the same operation with multiple sets of variables.

graphql/composite-schemas-spec#25
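The proposed spec text is not quoted in this thread, but a variable batching request along these lines would send one operation together with a list of variable sets (the exact payload shape here is an illustrative assumption, not taken from the proposal):

```json
{
  "query": "query($id: ID!) { user(id: $id) { name } }",
  "variables": [
    { "id": "1" },
    { "id": "2" },
    { "id": "3" }
  ]
}
```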

@Shane32 (Contributor) commented Aug 29, 2024

Does the distributed GraphQL executor currently support 'regular' batching requests?

@michaelstaib (Member, Author)

@Shane32 We will specify this as well; at the moment we call it request batching. The idea is that a request batch can also consist of variable batches.

@Shane32 (Contributor) commented Aug 29, 2024

So it would support variable batching within request batching then?

@michaelstaib (Member, Author) commented Aug 29, 2024

Yes, this is the current discussion. There are a lot of constraints we will put in place for the first iteration, and we have also explored this in combination with subscriptions. But for this initial appendix we are focusing on variable batching first, as this will be the minimum requirement for the composite schema spec.

@Shane32 (Contributor) commented Aug 29, 2024

Ok. I would suggest that we do not use the jsonl format.

  1. If there is variable batching within request batching, there is no clear way to format such a response. Obviously wrapping a jsonl response in a JSON list produces unparsable JSON due to the missing commas:
[
{"data":{"hello":"world-1"}}
{"data":{"hello":"world-2"}}
,
{"data":{"name":"john-1"}}
{"data":{"name":"john-2"}}
]

Perhaps it could be specified that when request batching is layered on top of variable batching, the lists are flattened. I'm not sure this is the best approach, but it's feasible.

  2. If this scenario is not supported, implying that a request uses either request batching OR variable batching, then there is no reason to differentiate the response formats.

  3. Parsing jsonl is not common in JSON environments, at least not the ones I'm familiar with. Attempting to parse such a response within .NET will throw an exception upon encountering the next line of the jsonl response. Similarly, the JSON.parse method in JavaScript does not support jsonl and will throw. In any of these environments, one would have to write another layer to separate the responses before decoding the results, adding code complexity on top of a likely slower implementation. It seems much cleaner and easier to parse the JSON and then iterate through the response list.

Let's use the response format that is commonplace now and supported by various servers and clients alike.

Perhaps as a separate appendix, the jsonl format is described as an optional response format for batching requests (request batching or variable batching). There it can state that if multiple batching approaches are used, the lists are flattened. But I just don't see the benefit of adding another response format.
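The parsing objection in point 3 can be sketched as follows (the payloads are illustrative, not taken from the proposal):

```javascript
// jsonl: two JSON documents separated by a newline -- not a single JSON value.
const jsonl = '{"data":{"hello":"world-1"}}\n{"data":{"hello":"world-2"}}';

// JSON.parse rejects the payload as soon as it hits the second line.
let threw = false;
try {
  JSON.parse(jsonl);
} catch (e) {
  threw = true;
}

// An extra layer is needed: split into lines, then parse each one.
const viaJsonl = jsonl.split("\n").map((line) => JSON.parse(line));

// The plain JSON array form parses in a single step with any JSON library.
const viaArray = JSON.parse(
  '[{"data":{"hello":"world-1"}},{"data":{"hello":"world-2"}}]'
);

console.log(threw); // true
console.log(viaJsonl[1].data.hello, viaArray[1].data.hello); // world-2 world-2
```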

@michaelstaib (Member, Author) commented Aug 29, 2024

There actually is a benefit: we have specified that there is a requestIndex and a variableIndex in the response structure. We need these because @defer and @stream are involved; every execution could yield a stream of responses. We also already have this implemented in reference implementations. I will get through the spec text this week, and then you will see how this plays out.

@michaelstaib (Member, Author)

BTW, we will also introduce requestIndex and variableIndex because the server should be able to reorder responses and return completed results as they finish, rather than making the consumer wait for in-order delivery.
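Under that model, a stream of responses might look as follows, with each payload tagged so results can arrive out of order. Only the requestIndex and variableIndex field names come from the discussion above; the surrounding shape is an illustrative assumption:

```jsonl
{"requestIndex": 0, "variableIndex": 1, "data": {"hello": "world-2"}}
{"requestIndex": 1, "variableIndex": 0, "data": {"name": "john-1"}}
{"requestIndex": 0, "variableIndex": 0, "data": {"hello": "world-1"}}
```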

@michaelstaib (Member, Author)

@Shane32 I have added a bit more about the response format.

@michaelstaib michaelstaib self-assigned this Aug 29, 2024
@@ -0,0 +1,170 @@
## B. Appendix: Variable Batching
michaelstaib (Member, Author)

After discussion in the composite schema working group, we will have a single appendix about batching, as this allows for better examples that show both request batching and variable batching in combination.

michaelstaib (Member, Author)

@benjie shall we also add some examples here with defer and stream? I ask because defer and stream are not yet handled in the spec.

Member

For now: no; but would be good to have the text handy anyway.
