bug: `limit 1` batch query caused OOM #8721
Comments
2023-03-22's longevity test (

@liurenjie1024 PTAL and feel free to assign to others.
After checking recent failures, it's caused by the batch query, so let's close it first.
Recurred in today's longevity test. https://buildkite.com/risingwave-test/longevity-kubebench/builds/274
I think the relation between the crash and the batch query failure is quite clear:
The meta cache size keeps growing, while we only allocated 300 MB to the meta cache. cc @hzxa21
It is inevitable because the entries are all in use. Maybe we should block some operations before allocating memory?
@Little-Wallace has already found a solution. We can wait for his PR.
Any update? cc @soundOfDestiny
FYI, #9517 is merged. |
Let's keep this open for a while; we still haven't enabled `limit 1` in the longevity test to verify the fix.
Describe the bug
In terms of timing, this issue seems to be related to the final results checking stage, which runs a batch query over the result MV:
You may see this from the attached BuildKite log. Before 11:22, everything worked well; then the result check began, and we got several restarts.
I suspect the batch query caused a dramatic memory spike. Any ideas? By the way, why is the "batch query's memory usage" metric always empty?
Slack thread: https://risingwave-labs.slack.com/archives/C0423G2NUF8/p1679395706998939
To Reproduce
No response
Expected behavior
No response
Additional context
No response