[SPARK-20680][SQL] Spark-sql do not support for void column datatype … #17953
Changes from all commits
@@ -1504,6 +1504,7 @@ class AstBuilder extends SqlBaseBaseVisitor[AnyRef] with Logging {
       case ("decimal", precision :: Nil) => DecimalType(precision.getText.toInt, 0)
       case ("decimal", precision :: scale :: Nil) =>
         DecimalType(precision.getText.toInt, scale.getText.toInt)
+      case ("void", Nil) => NullType
       case (dt, params) =>
         val dtStr = if (params.nonEmpty) s"$dt(${params.mkString(",")})" else dt
         throw new ParseException(s"DataType $dtStr is not supported.", ctx)
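For readers skimming the hunk: the new case maps the Hive keyword `void` onto Spark's `NullType` in the data type parser, instead of falling through to the "not supported" error. A quick way to exercise the change (a hypothetical snippet, not part of this PR's diff) is to parse the type name directly:

```scala
import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
import org.apache.spark.sql.types.NullType

// Before this patch, parsing "void" threw a ParseException
// ("DataType void is not supported."); with the patch it yields NullType.
assert(CatalystSqlParser.parseDataType("void") == NullType)
```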
@@ -1928,4 +1928,17 @@ class HiveDDLSuite
       }
     }
   }
+
+  test("SPARK-20680: Spark-sql do not support for void column datatype of view") {
+    withTable("t", "tabNullType") {
+      val client = spark.sharedState.externalCatalog.asInstanceOf[HiveExternalCatalog].client
+      client.runSqlHive("CREATE TABLE t (t1 int)")
+      client.runSqlHive("INSERT INTO t VALUES (3)")
+      client.runSqlHive("CREATE TABLE tabNullType AS SELECT NULL AS col FROM t")
+      checkAnswer(spark.table("tabNullType"), Row(null))
+      // table description shows "void" representation for NULL type.
+      val desc = spark.sql("DESC tabNullType").collect().toSeq
+      assert(desc.contains(Row("col", "void", null)))
+    }
+  }
 }

Contributor (inline comment on the `client.runSqlHive("CREATE TABLE tabNullType AS SELECT NULL AS col FROM t")` line): IIRC, Hive 2 doesn't support this. Let's test with …
Does this change really resolve your issue?
Apparently Hive can have null typed columns. So this should be the location where you'd want to change this.
Hive 2.x disables it. Could you add some test cases by reading and writing the tables with void types? Thanks!
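A read/write round-trip test along the lines of this request might look like the sketch below. It is hypothetical, not part of this PR's diff; it assumes it sits in HiveDDLSuite next to the test above and uses the same helpers, and whether the Spark write path actually succeeds for a void column is exactly what such a test would establish.

```scala
test("SPARK-20680: read and write tables with a void (NullType) column") {
  withTable("src", "voidSrc", "voidCopy") {
    val client = spark.sharedState.externalCatalog.asInstanceOf[HiveExternalCatalog].client
    client.runSqlHive("CREATE TABLE src (i int)")
    client.runSqlHive("INSERT INTO src VALUES (1)")
    // Hive creates a table whose only column is NULL-typed; Hive describes it as "void".
    client.runSqlHive("CREATE TABLE voidSrc AS SELECT NULL AS col FROM src")
    // Read path: Spark should map the void column to NullType and return a NULL row.
    checkAnswer(spark.table("voidSrc"), Row(null))
    // Write path: copy the void column into a new table through Spark SQL.
    spark.sql("CREATE TABLE voidCopy AS SELECT col FROM voidSrc")
    checkAnswer(spark.table("voidCopy"), Row(null))
  }
}
```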
+1 for the test case.