Implement Common Subexpression Elimination optimizer rule #942
Comments
Good finding. I think this kind of optimization should be in the Spark optimizer instead.
I remember that Spark SQL has a corresponding optimization rule, but I am not sure why it doesn't affect the query.
Related to this, it would be nice if we could improve the metrics for CometHashAggregate to show the time spent evaluating the aggregate input expressions. I am not sure how much work that would be, though.
Sounds good. It should be added to the DataFusion hash aggregate operator.
It would make sense for Spark to add this, but I think it could also be beneficial for DataFusion to support this as a physical optimizer rule. I filed apache/datafusion#12599
Spark has it, but not at the plan level. Instead, they do it as part of their code generation: https://github.com/apache/spark/blob/v3.5.3/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/codegen/CodeGenerator.scala#L1064-L1098C1
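For context, a minimal way to check that codegen-level elimination is in play from a Spark SQL session (a sketch only; the conf below is Spark's subexpression-elimination switch and is enabled by default, so its effect does not show up as a rewrite in the optimized plan):

```sql
-- Show the current value of Spark's codegen subexpression elimination setting
SET spark.sql.subexpressionElimination.enabled;
```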
There is now a DataFusion PR to add this feature: apache/datafusion#13046
What is the problem the feature request solves?
When running TPC-H q1 in Spark/Comet, the expression
`l_extendedprice#21 * (1 - l_discount#22)`
appears twice in the query and currently gets evaluated twice. This could be optimized out so that it is evaluated only once. I was able to test this by manually rewriting the query; the attached queries, timings, and Spark UI screenshots are below, followed by an illustrative sketch of the rewrite.

Original Query
Optimized Query
Timings (Original)
Timings (Optimized)
Spark UI (Original)
Spark UI (Optimized)
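To make the rewrite concrete, here is a minimal sketch of the idea against the standard TPC-H q1 shape (an illustration of the general approach, not the exact query in the attachments above): the shared product is projected once in a derived table, so each aggregate references a plain column instead of re-evaluating the multiplication.

```sql
-- Sketch: standard TPC-H q1 computes l_extendedprice * (1 - l_discount)
-- in two aggregates; projecting it once avoids the duplicate evaluation.
SELECT
    l_returnflag,
    l_linestatus,
    SUM(l_quantity)               AS sum_qty,
    SUM(l_extendedprice)          AS sum_base_price,
    SUM(disc_price)               AS sum_disc_price,   -- reuses the projected column
    SUM(disc_price * (1 + l_tax)) AS sum_charge,       -- reuses it again
    AVG(l_quantity)               AS avg_qty,
    AVG(l_extendedprice)          AS avg_price,
    AVG(l_discount)               AS avg_disc,
    COUNT(*)                      AS count_order
FROM (
    -- evaluate the shared subexpression a single time per row
    SELECT
        l.*,
        l_extendedprice * (1 - l_discount) AS disc_price
    FROM lineitem l
    WHERE l_shipdate <= DATE '1998-09-02'
) t
GROUP BY l_returnflag, l_linestatus
ORDER BY l_returnflag, l_linestatus;
```

A Common Subexpression Elimination rule (whether in Spark, Comet, or DataFusion) would in effect perform this kind of projection automatically at the plan or execution level.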
Describe the potential solution
No response
Additional context
No response