Describe the bug
The GraphQL queries USER_LIST and ORGANIZATION_CONNECTION_LIST (and potentially others) do not enforce pagination limits on the first and skip parameters. This allows clients to request excessively large datasets, leading to potential denial-of-service (DoS) attacks by overwhelming server resources.
To Reproduce
Steps to reproduce the behavior:
1. Send a USER_LIST query with first set to an extremely high value (e.g., first=100000).
2. Observe the server attempting to fetch and process a massive number of records.
3. Repeat the same for ORGANIZATION_CONNECTION_LIST or other paginated queries.
4. Monitor server performance degradation or potential crashes due to resource exhaustion.
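The abusive request in step 1 can be sketched as follows. This is an illustrative query shape only; the actual field selection behind USER_LIST may differ in the codebase.

```graphql
# Illustrative only: an oversized `first` with no server-side cap.
query Users {
  users(first: 100000, skip: 0) {
    _id
    firstName
  }
}
```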
Expected behavior
The server should enforce a reasonable maximum limit (e.g., 100 records) for the first parameter and reject invalid values (e.g., negative numbers). Queries requesting more than the maximum limit should automatically be capped, and the server should return a manageable dataset without compromising performance.
Actual behavior
The server processes queries with unrestricted first and skip values, allowing clients to request excessively large datasets. This leads to high resource consumption, slow response times, and potential server crashes, negatively impacting other users and services.
Screenshots
N/A (This is a backend issue related to query handling and resource management.)
Additional details
This issue should be resolved by implementing server-side restrictions on the first and skip parameters in the GraphQL resolvers. The maximum limit for first should be set to a reasonable value (e.g., 100 records), and the skip parameter should be validated to ensure it is non-negative.
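A minimal sketch of such a server-side guard is shown below. The helper name `sanitizePaginationArgs` and the constant `MAX_FIRST` are hypothetical (not from the existing resolvers); the idea is to clamp `first` to a maximum and reject negative or non-integer values before any database query runs.

```typescript
// Hypothetical names: MAX_FIRST and sanitizePaginationArgs are illustrative,
// not identifiers from the existing codebase.
const MAX_FIRST = 100;

interface PaginationArgs {
  first?: number;
  skip?: number;
}

// Validates and normalizes pagination arguments before they reach the
// database layer. Invalid values are rejected; oversized `first` values
// are capped rather than refused outright.
function sanitizePaginationArgs(args: PaginationArgs): Required<PaginationArgs> {
  const { first = MAX_FIRST, skip = 0 } = args;

  if (!Number.isInteger(first) || first < 1) {
    throw new Error(`"first" must be a positive integer, got ${first}`);
  }
  if (!Number.isInteger(skip) || skip < 0) {
    throw new Error(`"skip" must be a non-negative integer, got ${skip}`);
  }

  // Cap oversized requests so a single client cannot exhaust server resources.
  return { first: Math.min(first, MAX_FIRST), skip };
}
```

Each paginated resolver (USER_LIST, ORGANIZATION_CONNECTION_LIST, and any others) would call such a helper on its arguments before building the database query, so the cap is enforced centrally rather than per-resolver.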
Potential internship candidates
Please read this if you are planning to apply for a Palisadoes Foundation internship