SPARK-9210 corrects aggregate function name in exception message #7557
Conversation
Can one of the admins verify this patch?
CC @marmbrus just in case
I don't think this is correct: https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala#L147

```scala
scala> sql("SELECT a FROM test GROUP BY b")
org.apache.spark.sql.AnalysisException: expression 'a' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() if you don't care which value you get.;

scala> sql("SELECT first(a) FROM test GROUP BY b")
res3: org.apache.spark.sql.DataFrame = [_c0: int]
```
@marmbrus you can find a standalone runnable example with complete shell output in this gist. Here is a summary of what happens:

```scala
ctx.sql("select first(num) from test_first group by category").show
// ERROR RetryingHMSHandler: MetaException(message:NoSuchObjectException(message:Function default.first does not exist))
// INFO FunctionRegistry: Unable to lookup UDF in metastore: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:NoSuchObjectException(message:Function default.first does not exist))
// java.lang.RuntimeException: Couldn't find function first

ctx.sql("select first_value(num) from test_first group by category").show
// OK
```

Perhaps the difference is that I'm using […]
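The failure mode described above (a function that resolves through one registry but not another) can be sketched roughly as a two-tier lookup: consult the native function registry first, then fall back to Hive's built-ins, and fail only when both miss. This is a minimal, hypothetical Python sketch of that lookup pattern; the registry contents and names are illustrative assumptions, not Spark's actual `FunctionRegistry` implementation.

```python
# Hypothetical two-tier function lookup, loosely mirroring the behavior the
# thread describes: "first_value" is registered, "first" is not, so resolving
# "first" falls through both tiers and fails.
native_registry = {"first_value": "FirstValue expression"}       # assumed contents
hive_builtins = {"first_value": "Hive GenericUDAFFirstValue"}    # assumed contents

def lookup(name: str) -> str:
    if name in native_registry:
        return native_registry[name]
    if name in hive_builtins:  # fallback tier, analogous to the Hive metastore lookup
        return hive_builtins[name]
    # mirrors the "Couldn't find function first" error from the gist
    raise RuntimeError(f"Couldn't find function {name}")

print(lookup("first_value"))  # resolves via the native registry
try:
    lookup("first")
except RuntimeError as e:
    print(e)  # Couldn't find function first
```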
Which version of Hive/Spark are you running?
@marmbrus you can see the version and full INFO-level shell output in the gist. I'm running 1.4.1.
cc @yhuai since you are working on a related issue.
In Spark 1.4, […]
@yhuai great
Hey @ssimeonov, would you mind closing this PR now that its change has been incorporated into #8113? Thanks!