Datastores can return large numbers of runs that are processed synchronously #1858
Labels
- area/backend
- branch/master (the master branch)
- branch/stable (the stable branch)
- type/enhancement (an enhancement to an existing feature)
Comments
johnaohara added a commit to johnaohara/Horreum that referenced this issue on Jul 3, 2024
johnaohara added the type/enhancement, branch/master, branch/stable, and area/backend labels on Jul 3, 2024
johnaohara added a commit to johnaohara/Horreum that referenced this issue on Jul 4, 2024
johnaohara added a commit to johnaohara/Horreum that referenced this issue on Jul 24, 2024
johnaohara added a commit to johnaohara/Horreum that referenced this issue on Jul 25, 2024
johnaohara added a commit to johnaohara/Horreum that referenced this issue on Sep 5, 2024
johnaohara added a commit that referenced this issue on Sep 5, 2024
Describe the bug
Some backend datastore integrations can return a large number of runs. Importing all of that run data in a single query causes problems: transaction times grow long, and a failure to process any single run's data fails the entire import. In these cases, run data should be processed asynchronously, with runs offloaded to an external queue.
See #1703 for an example of a datastore that can return a large number of run results.
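As a rough illustration of the behaviour this issue asks for, the sketch below (hypothetical, not Horreum's actual implementation; the class and method names are invented) offloads each run to a worker pool so runs are imported asynchronously and a failure in one run does not abort the rest of the batch:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch: process each run as its own independent unit of
// work, so one malformed run cannot fail the whole import.
public class AsyncRunImporter {
    private final ExecutorService workers = Executors.newFixedThreadPool(4);
    final AtomicInteger imported = new AtomicInteger();
    final AtomicInteger failed = new AtomicInteger();

    // Runs are plain strings here; a real datastore integration would
    // return structured run payloads.
    public void submitAll(List<String> runs) throws InterruptedException {
        for (String run : runs) {
            workers.submit(() -> {
                try {
                    processRun(run);           // short, per-run transaction
                    imported.incrementAndGet();
                } catch (RuntimeException e) {
                    failed.incrementAndGet();  // isolate the failure, keep going
                }
            });
        }
        workers.shutdown();
        workers.awaitTermination(10, TimeUnit.SECONDS);
    }

    private void processRun(String run) {
        if (run.contains("bad")) {
            throw new IllegalArgumentException("malformed run: " + run);
        }
        // persist the run here
    }

    public static void main(String[] args) throws InterruptedException {
        AsyncRunImporter importer = new AsyncRunImporter();
        importer.submitAll(List.of("run-1", "bad-run", "run-2"));
        System.out.println(importer.imported.get() + " imported, "
                + importer.failed.get() + " failed"); // 2 imported, 1 failed
    }
}
```

A production version would replace the in-process thread pool with the external queue the issue proposes, so imports survive restarts and can be retried per run.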