Conversation

@amanomer (Contributor) commented Nov 15, 2019

What changes were proposed in this pull request?

This PR makes Spark SQL's cast-to-long behavior consistent with PostgreSQL when spark.sql.dialect is configured as PostgreSQL.

Why are the changes needed?

By default, Spark SQL and PostgreSQL differ considerably in their cast behavior between types. Spark SQL's cast behavior should be consistent with PostgreSQL when spark.sql.dialect is configured as PostgreSQL.

Does this PR introduce any user-facing change?

Yes. If users switch to the PostgreSQL dialect, they will:

  • get an AnalysisException when casting TimestampType or DateType to long.
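A minimal sketch of the user-facing difference, assuming a Spark SQL session with the dialect option available (the exact error message below is illustrative, not taken from the patch):

```sql
-- Default Spark dialect: casting a timestamp to long succeeds and
-- returns the epoch seconds.
SELECT CAST(TIMESTAMP '2019-11-15 00:00:00' AS LONG);

-- With the PostgreSQL dialect proposed by this PR, the same cast is
-- rejected at analysis time:
SET spark.sql.dialect = PostgreSQL;
SELECT CAST(TIMESTAMP '2019-11-15 00:00:00' AS LONG);
-- AnalysisException: cannot cast TimestampType to LongType (illustrative)
```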

How was this patch tested?

Test cases to be added

@AmplabJenkins
Can one of the admins verify this patch?

@dongjoon-hyun (Member) commented

Thank you for the contribution, @amanomer. Unfortunately, we decided to remove the PostgreSQL dialect via SPARK-30125 (#26763), so I'll close this PR. Sorry about that.
