how to avoid n + 1 query #296
Hi @wangfeipeng, in the JS world you'd probably use Facebook's dataloader library. Although there are at least 2 implementations of a similar library in Go, it would not be of much use since in graphql-go all fields are resolved sequentially. There is a PR for parallel field resolution but its current state is somewhat unclear. See #213
I believe this issue is a duplicate of #106
Parallel queries to the database do not solve the n + 1 problem. Any suggestions?
Hi @danielspk, thanks for reaching out with a question. That's correct: parallel/concurrent queries alone don't solve the n + 1 problem, but parallel/concurrent resolvers plus a batching library like dataloader do.
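For illustration, here is a minimal, library-free sketch of that batching idea: per-item resolvers register the keys they need and receive thunks, and the first thunk that runs fires a single query for all registered keys. The `ProfileLoader`, `Profile`, and `fetch` names are assumptions for the example, not graphql-go or dataloader APIs.

```go
// Minimal batching sketch: Load registers a key and returns a thunk; the
// first thunk that runs dispatches ONE query for every key collected so far.
package main

import (
	"fmt"
	"sync"
)

type Profile struct {
	NickName string
}

type ProfileLoader struct {
	mu    sync.Mutex
	keys  []int
	cache map[int]*Profile
	// fetch runs a single query for all collected IDs,
	// e.g. SELECT ... FROM customer_profiles WHERE customer_id IN (?, ?, ...)
	fetch func(ids []int) (map[int]*Profile, error)
}

// Load registers an ID and returns a thunk that yields the profile later.
func (l *ProfileLoader) Load(id int) func() (*Profile, error) {
	l.mu.Lock()
	l.keys = append(l.keys, id)
	l.mu.Unlock()

	return func() (*Profile, error) {
		l.mu.Lock()
		defer l.mu.Unlock()
		if l.cache == nil {
			// Keys registered after this point would be missed; real
			// dataloader libraries schedule the dispatch more carefully.
			res, err := l.fetch(l.keys)
			if err != nil {
				return nil, err
			}
			l.cache = res
		}
		return l.cache[id], nil
	}
}

func main() {
	calls := 0
	loader := &ProfileLoader{
		fetch: func(ids []int) (map[int]*Profile, error) {
			calls++ // counts how often the "database" is hit
			out := make(map[int]*Profile, len(ids))
			for _, id := range ids {
				out[id] = &Profile{NickName: fmt.Sprintf("nick-%d", id)}
			}
			return out, nil
		},
	}

	// Per-customer resolvers register their keys first...
	thunks := []func() (*Profile, error){loader.Load(1), loader.Load(12)}

	// ...then the profiles are materialized with a single batched fetch.
	for _, t := range thunks {
		p, _ := t()
		fmt.Println(p.NickName)
	}
	fmt.Println("fetch calls:", calls) // prints 1, not N
}
```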
Closing this one since it was addressed via #389
@chris-ramon Thanks for your example. I am wondering (not sure if this is the right place) how we would pass the graphQLArgs to the dataloader. For example, if I wanted to get a list of customers with a limit.
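One common approach (a sketch, not an official answer from this project) is to fold the arguments that affect the result into the loader key itself, so that, for example, different `limit` values are batched and cached separately. The `customersKey` type below is hypothetical; in a graphql-go resolver the values would typically come from the resolver's `Args` map.

```go
// Sketch: encode GraphQL arguments into a composite loader key so entries
// with different args do not collide in the loader's cache.
package main

import (
	"encoding/json"
	"fmt"
)

// customersKey carries the identifier plus any arguments that change the
// result, e.g. customers(limit: 10) vs customers(limit: 20).
type customersKey struct {
	CustomerID int `json:"customerId"`
	Limit      int `json:"limit"`
}

// String renders a stable form of the key, which is what string-keyed
// dataloader-style caches typically use.
func (k customersKey) String() string {
	b, _ := json.Marshal(k)
	return string(b)
}

func main() {
	// Inside a resolver the values would come from the GraphQL args,
	// e.g. limit, _ := p.Args["limit"].(int) with graphql-go's ResolveParams.
	k := customersKey{CustomerID: 1, Limit: 10}
	fmt.Println(k.String()) // {"customerId":1,"limit":10}
}
```

The key should include anything that influences the query result (limit, filters, sort) and nothing more, so caching stays correct and batching stays effective.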
My data structure looks like this:
```json
{
  "data": {
    "customers": [
      {
        "customerProfile": {
          "nickName": "1"
        },
        "id": 1,
        "sex": 0,
        "status": 1,
        "type": 1
      },
      {
        "customerProfile": {
          "nickName": "12"
        },
        "id": 12,
        "sex": 0,
        "status": 1,
        "type": 0
      }
    ]
  }
}
```
It's a list of customers, but I need to query N times to get each customerProfile. That gets bad when N is a big number. How can I avoid this problem?
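To make the shape of the problem concrete, the per-item lookup described above looks roughly like this (names are illustrative, not code from this project): one query for the customer list plus one query per customer for its profile. Batching, as in the loader sketch earlier in the thread, replaces the N per-customer queries with a single `WHERE customer_id IN (...)` query.

```go
// The n + 1 pattern: one query for the list, then one query per item.
package main

import "fmt"

type Customer struct{ ID int }

// loadProfileByID stands in for a resolver that hits the database once per
// customer, e.g. SELECT ... FROM customer_profiles WHERE customer_id = ?
func loadProfileByID(id int) string {
	fmt.Println("query for customer", id) // fired N times
	return fmt.Sprintf("nick-%d", id)
}

func main() {
	customers := []Customer{{ID: 1}, {ID: 12}} // 1 query for the list
	for _, c := range customers {
		_ = loadProfileByID(c.ID) // + N queries for the profiles
	}
}
```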