[SPARK-22303][SQL] Handle Oracle specific jdbc types in OracleDialect #19548
Conversation
TIMESTAMP (-101), BINARY_DOUBLE (101) and BINARY_FLOAT (100) are handled in OracleDialect
ok to test
Could you add a test case? Below are the instructions for running the docker-based test suite.
}
case -101 => Some(TimestampType) // Value for Timestamp with Time Zone in Oracle
case 100 => Some(FloatType) // Value for OracleTypes.BINARY_FLOAT
case 101 => Some(DoubleType) // Value for OracleTypes.BINARY_DOUBLE
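As a rough sketch of what this dialect hook does, here is the same mapping with plain strings standing in for Spark's Catalyst type objects (the stand-in names are an assumption for brevity, not Spark's actual classes):

```scala
// Stand-in sketch of the OracleDialect mapping above. Strings replace
// Spark's Catalyst type objects so the example is self-contained.
def getCatalystType(sqlType: Int): Option[String] = sqlType match {
  case -101 => Some("TimestampType") // Oracle TIMESTAMP WITH TIME ZONE
  case 100  => Some("FloatType")     // OracleTypes.BINARY_FLOAT
  case 101  => Some("DoubleType")    // OracleTypes.BINARY_DOUBLE
  case _    => None                  // defer to the default JDBC mapping
}
```

Returning `None` for unmatched codes is the key design point: the dialect only intercepts the Oracle-specific codes and lets everything else fall through to the generic JDBC type mapping.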
Question: is the value range of OracleTypes.BINARY_DOUBLE identical to Spark's Double type? Likewise for OracleTypes.BINARY_FLOAT and Spark's Float type.
This should match Java's double / float definition.
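Concretely: Oracle documents BINARY_FLOAT and BINARY_DOUBLE as 32-bit and 64-bit IEEE 754 floating-point, which is the same representation Java (and therefore Spark's FloatType/DoubleType) uses, so the value ranges coincide. A quick check of the Java side:

```scala
// Java's float and double are IEEE 754 binary32/binary64 — the same
// representation Oracle documents for BINARY_FLOAT and BINARY_DOUBLE.
val floatBits  = java.lang.Float.SIZE   // 32-bit representation
val doubleBits = java.lang.Double.SIZE  // 64-bit representation
```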
Test build #82951 has finished for PR 19548 at commit
Okay, a new integration test has been added. According to their documentation, this should match Java's double / float definition.
srowen
left a comment
Are other DB-specific types supported in Spark?
val props = new Properties()

// write it back to the table (append mode)
val data = spark.sparkContext.parallelize(Seq(Row(1.1D, 2.2F)))
Nit: I think it's more conventional to write "1.1" and "2.2f"
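For reference, the convention in question — fractional literals default to Double in Scala, and Float takes a lowercase `f` suffix:

```scala
// Scala numeric-literal convention: Double is the default for
// fractional literals, so the 'D' suffix is redundant; Float needs 'f'.
val d: Double = 1.1   // no suffix needed; Double by default
val f: Float  = 2.2f  // lowercase 'f' marks a Float literal
```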
updated
conn.commit();

conn.prepareStatement("CREATE TABLE oracle_types (d BINARY_DOUBLE, f BINARY_FLOAT)").executeUpdate();
No semicolon at the end of lines
Removed trailing semicolons throughout this file.
class OracleIntegrationSuite extends DockerJDBCIntegrationSuite with SharedSQLContext {
  import testImplicits._
  // To make === between doubles tolerate inexact values
  implicit val doubleEquality = TolerantNumerics.tolerantDoubleEquality(0.01)
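What Scalactic's `tolerantDoubleEquality` provides is, in essence, an absolute-tolerance comparison; a hand-rolled equivalent (the name `approxEqual` is mine, not Scalactic's) looks like:

```scala
// Minimal stand-in for TolerantNumerics.tolerantDoubleEquality(0.01):
// two doubles compare equal when their absolute difference is within tol.
def approxEqual(a: Double, b: Double, tol: Double = 0.01): Boolean =
  math.abs(a - b) <= tol
```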
Curious why the values would be that unequal somewhere in these tests?
Removed this line and the test still passes.
  Some(DecimalType(DecimalType.MAX_PRECISION, 10)))
assert(oracleDialect.getCatalystType(java.sql.Types.NUMERIC, "numeric", 0, null) ==
  Some(DecimalType(DecimalType.MAX_PRECISION, 10)))
assert(oracleDialect.getCatalystType(100, "BINARY_FLOAT", 0, null) ==
Let's define constants somewhere suitable for 100/101/-101.
Made them constants defined in OracleDialect
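A hypothetical shape for those constants (the actual names chosen in OracleDialect may differ; these mirror the values discussed above):

```scala
// Hypothetical constants mirroring the review suggestion; actual
// names in Spark's OracleDialect may differ.
object OracleTypeCodes {
  val BINARY_FLOAT  = 100  // matches oracle.jdbc.OracleTypes.BINARY_FLOAT
  val BINARY_DOUBLE = 101  // matches oracle.jdbc.OracleTypes.BINARY_DOUBLE
  val TIMESTAMP_TZ  = -101 // Oracle TIMESTAMP WITH TIME ZONE type code
}
```

With named constants, both the dialect's match arms and the test assertions read without magic numbers.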
Test build #82965 has finished for PR 19548 at commit
- No trailing semicolons
- Constants for 100, 101 and -101
- Just compare double / float
Test build #82968 has finished for PR 19548 at commit
LGTM

Thanks! Merged to master.
TIMESTAMP (-101), BINARY_DOUBLE (101) and BINARY_FLOAT (100) are handled in OracleDialect
What changes were proposed in this pull request?
When an Oracle table contains columns whose type is BINARY_FLOAT or BINARY_DOUBLE, Spark SQL fails to load the table with a SQLException.
How was this patch tested?
I updated a UT that covers type-conversion tests for types -101, 100 and 101. On top of that, I tested this change against an actual table with those columns, and it was able to read from and write to the table.