[SPARK-9378][SQL][HotFix] Remove improper and failed test that checks schema stored by Hive #7693
Conversation
Test build #38535 has finished for PR 7693 at commit
cc @liancheng
I saw this …
Not very sure. I suppose that it is related to removing the old `ParquetRelation`.
I just thought that before, the old … seems to be saved as a … Now it goes through another path in …
Reproduced this test failure locally. A proper fix for this issue can be:

```scala
withSQLConf(HiveContext.CONVERT_METASTORE_PARQUET.key -> "false") {
  checkExistence(sql("DESC EXTENDED ctas5"), true,
    "name:key", "type:string", "name:value", "ctas5",
    "org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat",
    "org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat",
    "org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe",
    "MANAGED_TABLE"
  )
}
```

The reason is that the `MetastoreRelation` gets converted to a Parquet data source relation by default.

@viirya I'm going to open a new PR to fix this issue since it is probably breaking the PR builder. Will attribute the new PR to you. Thanks for bringing up and investigating this issue!
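For reference, here is a minimal, self-contained sketch of the save-override-restore pattern that a `withSQLConf`-style helper follows. This is an illustration only, not Spark's actual `SQLTestUtils` implementation; the `conf` map below is just a stand-in for the session's SQL configuration.

```scala
import scala.collection.mutable

// Stand-in for the session's SQL configuration.
val conf = mutable.Map[String, String]()

// Apply the given overrides, run the body, then restore the previous values.
def withSQLConf(pairs: (String, String)*)(body: => Unit): Unit = {
  val saved = pairs.map { case (k, _) => k -> conf.get(k) }
  pairs.foreach { case (k, v) => conf(k) = v }
  try body
  finally saved.foreach {
    case (k, Some(v)) => conf(k) = v  // key existed before: restore old value
    case (k, None)    => conf -= k    // key was unset before: remove it again
  }
}

// Usage mirroring the fix above: the override is only visible inside the block.
withSQLConf("spark.sql.hive.convertMetastoreParquet" -> "false") {
  assert(conf("spark.sql.hive.convertMetastoreParquet") == "false")
}
assert(!conf.contains("spark.sql.hive.convertMetastoreParquet"))
```

This is why wrapping the `checkExistence` call leaves the default Parquet conversion behavior intact for the tests that follow.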
@liancheng Thanks.

However, what confuses me is that the following PySpark snippet shows the same result under Spark 1.4: … Anyway, I'm fixing this now. Need further investigation into why it just started failing.
This is a proper version of PR #7693, authored by viirya.

The reason why "CTAS with serde" fails is that the `MetastoreRelation` gets converted to a Parquet data source relation by default.

Author: Cheng Lian <lian@databricks.com>

Closes #7700 from liancheng/spark-9378-fix-ctas-test and squashes the following commits:

4413af0 [Cheng Lian] Fixes test case "CTAS with serde"
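As a hedged illustration of the conversion described above, the sketch below toggles the `spark.sql.hive.convertMetastoreParquet` setting, which is the same configuration the fixed test flips via `withSQLConf`. The Spark 1.x-style `HiveContext` setup and the table name `some_parquet_table` are assumptions for the example, not taken from the suite.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Spark 1.x-style setup, matching this era of the codebase.
val sc = new SparkContext(new SparkConf().setAppName("ctas-demo").setMaster("local[2]"))
val hiveContext = new HiveContext(sc)

// With the conversion enabled (the default), Hive Parquet tables are planned as
// Spark SQL Parquet data source relations rather than through the Hive SerDe.
// Disabling it lets DESC EXTENDED surface the Hive-side storage metadata.
hiveContext.setConf("spark.sql.hive.convertMetastoreParquet", "false")

hiveContext.sql("DESC EXTENDED some_parquet_table").collect().foreach(println)

// Restore the default afterwards.
hiveContext.setConf("spark.sql.hive.convertMetastoreParquet", "true")
```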
JIRA: https://issues.apache.org/jira/browse/SPARK-9378
As the old `ParquetRelation` is completely removed from the codebase and `ParquetRelation2` becomes `ParquetRelation`, one test in `org.apache.spark.sql.hive.execution.SQLQuerySuite` that checks the schema stored by Hive will fail, as observed in #7520's recent test report. We should remove the test now.
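For context, here is a hedged sketch of the kind of scenario the failing check exercised: a CTAS table stored as Parquet whose `DESC EXTENDED` output is inspected. It reuses the hypothetical `hiveContext` from the previous sketch and assumes Hive's standard `src` test table; the table name `ctas_parquet` is illustrative, not copied from `SQLQuerySuite`.

```scala
// Create a Parquet-backed table via CTAS; the metastore records the Hive
// Parquet input/output formats and SerDe for it.
hiveContext.sql(
  """CREATE TABLE ctas_parquet STORED AS PARQUET
    |AS SELECT key, value FROM src ORDER BY key, value
  """.stripMargin)

// Whether this output lists the Hive Parquet SerDe classes or a converted
// Parquet data source relation depends on the conversion setting shown above,
// which is why the check became sensitive to the default behavior.
hiveContext.sql("DESC EXTENDED ctas_parquet").collect().foreach(println)
```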