[SPARK-9757] [SQL] Fixes persistence of Parquet relation with decimal column #8130
Conversation
Force-pushed from 7a9c045 to 194d7db.
Test build #40641 has finished for PR 8130 at commit
Should we use our own hive client's notion of version instead of a regex here?
I will update it.
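A rough illustration of the reviewer's suggestion above (not the code from this PR; `HiveVersion` and `HiveClient` below are simplified stand-ins for Spark's Hive client types, introduced only for this sketch): comparing against a version the client already tracks avoids regex-parsing the raw metastore version string.

```scala
// Hedged sketch, assuming hypothetical types for illustration.
case class HiveVersion(major: Int, minor: Int)

trait HiveClient {
  def version: HiveVersion
}

// Regex-based approach the comment asks about: parse the raw version string.
def isAtLeast12ByRegex(versionString: String): Boolean =
  "^(\\d+)\\.(\\d+)".r.findFirstMatchIn(versionString).exists { m =>
    val (major, minor) = (m.group(1).toInt, m.group(2).toInt)
    major > 1 || (major == 1 && minor >= 2)
  }

// Client-based approach: compare against the version the Hive client already knows.
def isAtLeast12(client: HiveClient): Boolean = {
  val v = client.version
  v.major > 1 || (v.major == 1 && v.minor >= 2)
}
```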
Test build #40650 has finished for PR 8130 at commit
Test build #1501 has finished for PR 8130 at commit
Test build #40660 has finished for PR 8130 at commit
Force-pushed from d42a0ef to f4e4c2c.
Test build #1507 has finished for PR 8130 at commit
Test build 1507 is not for the latest commit. I manually triggered 1525.
Test build #1525 has finished for PR 8130 at commit
(Please ignore this comment, meant to post it on #8158...)
[SPARK-9757] [SQL] Fixes persistence of Parquet relation with decimal column

PR #7967 enables us to save data source relations to metastore in Hive compatible format when possible. But it fails to persist Parquet relations with decimal column(s) to Hive metastore of versions lower than 1.2.0. This is because `ParquetHiveSerDe` in Hive versions prior to 1.2.0 doesn't support decimal. This PR checks for this case and falls back to Spark SQL specific metastore table format.

Author: Yin Huai <yhuai@databricks.com>
Author: Cheng Lian <lian@databricks.com>

Closes #8130 from liancheng/spark-9757/old-hive-parquet-decimal.

(cherry picked from commit 6993031)
Signed-off-by: Cheng Lian <lian@databricks.com>
PR #7967 enables us to save data source relations to metastore in Hive compatible format when possible. But it fails to persist Parquet relations with decimal column(s) to Hive metastore of versions lower than 1.2.0. This is because `ParquetHiveSerDe` in Hive versions prior to 1.2.0 doesn't support decimal. This PR checks for this case and falls back to Spark SQL specific metastore table format.
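A minimal sketch of the fallback decision described above (not the actual change in this PR; `canStoreDecimalAsHiveParquet` and `shouldFallBackToSparkSqlFormat` are hypothetical helper names): if the connected metastore is older than Hive 1.2.0 and the schema contains a decimal column, the relation is persisted in Spark SQL's own metastore table format rather than the Hive-compatible Parquet format.

```scala
import org.apache.spark.sql.types.{DecimalType, StructType}

// Hive 1.2.0 is the first version whose ParquetHiveSerDe supports decimal.
def canStoreDecimalAsHiveParquet(metastoreMajor: Int, metastoreMinor: Int): Boolean =
  metastoreMajor > 1 || (metastoreMajor == 1 && metastoreMinor >= 2)

// Fall back to the Spark SQL specific table format when the schema has a decimal
// column but the connected metastore cannot represent it in Hive-compatible Parquet.
def shouldFallBackToSparkSqlFormat(
    schema: StructType,
    metastoreMajor: Int,
    metastoreMinor: Int): Boolean = {
  val hasDecimalColumn = schema.exists(_.dataType.isInstanceOf[DecimalType])
  hasDecimalColumn && !canStoreDecimalAsHiveParquet(metastoreMajor, metastoreMinor)
}
```

For example, with a Hive 0.13 metastore and a schema containing a `DecimalType(10, 2)` column, `shouldFallBackToSparkSqlFormat` returns true and the relation would be stored in the Spark SQL specific format; with a 1.2.0 metastore it returns false and the Hive-compatible Parquet format can be used.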