[BUG] java.lang.ArithmeticException: divide by zero when spark.sql.ansi.enabled=true #2078
Comments
@viadea I could not reproduce this locally with the SF=100 data set. Which data set were you running against?
It is possible that, due to rounding differences, we get a zero on the GPU but not on the CPU, which could explain the difference here. I can explore more once I know which data set to test with. It would also be good to know whether decimal support was enabled or not.
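For context on why this flag matters: with spark.sql.ansi.enabled=false (the default in Spark 3.1), division by zero returns NULL, while with it enabled Spark throws the exception from this issue's title. A quick spark-shell check:

```scala
// Default behavior: division by zero quietly yields NULL
spark.conf.set("spark.sql.ansi.enabled", "false")
spark.sql("SELECT 1 / 0 AS q").show()
// +----+
// |   q|
// +----+
// |null|
// +----+

// ANSI behavior: the same query fails at runtime
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql("SELECT 1 / 0 AS q").show()
// java.lang.ArithmeticException: divide by zero
```

So any divisor that comes out zero on only one of CPU or GPU would make the two disagree in exactly this way.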
@andygrove I will try to see if I can find a minimal repro for this one. Once I find it, I will update here.
@andygrove I narrowed down Q7 to a minimal reproduction.
And then the result table:
Then I can reproduce the issue using the SQL below:
Or the spark-shell version:
After removing …
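As an illustration of the rounding hypothesis above (invented values, not the actual Q7 repro): a decimal that rounds to exactly zero when its scale is reduced becomes a zero divisor, which is a runtime error once ANSI mode is on.

```scala
spark.conf.set("spark.sql.ansi.enabled", "true")

// A small value rounds to exactly 0.00 when cast to a smaller scale...
spark.sql("SELECT CAST(0.004 AS DECIMAL(3, 2)) AS d").show()
// +----+
// |   d|
// +----+
// |0.00|
// +----+

// ...so using it as a divisor raises the exception from this issue:
spark.sql("SELECT 1.0 / CAST(0.004 AS DECIMAL(3, 2)) AS q").show()
// java.lang.ArithmeticException: divide by zero
```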
Thanks for the repro case @viadea. I have been debugging this with @abellina and we discovered that the issue is that the Spark …
Describe the bug
When spark.sql.ansi.enabled=true, NDS query Q7 fails with java.lang.ArithmeticException: divide by zero.
Steps/Code to reproduce bug
Turn on spark.sql.ansi.enabled=true and run NDS Q7; a sketch of the relevant setup follows.
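A minimal sketch of the setup, assuming a spark-shell session; the jar paths are placeholders, while spark.plugins=com.nvidia.spark.SQLPlugin, spark.rapids.sql.enabled, and spark.rapids.sql.decimalType.enabled are the plugin's standard configuration keys:

```scala
// Launch with the RAPIDS Accelerator on the classpath (placeholder jar paths):
//   spark-shell --jars rapids-4-spark_2.12-0.4.1.jar,cudf.jar \
//     --conf spark.plugins=com.nvidia.spark.SQLPlugin

// These SQL confs can then be set at runtime in the session:
spark.conf.set("spark.sql.ansi.enabled", "true")   // ANSI mode on
spark.conf.set("spark.rapids.sql.enabled", "true") // run supported operators on the GPU
// Only if decimal support is being exercised (experimental in 0.4.x):
spark.conf.set("spark.rapids.sql.decimalType.enabled", "true")

// ...then run NDS Q7 against the target data set.
```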
Expected behavior
NDS Q7 should run fine when spark.sql.ansi.enabled=true in GPU mode.
Environment details
Standalone Spark 3.1.1, single-node cluster.
RAPIDS Accelerator 0.4.1, GA release.