
Conversation

@yaooqinn
Member

What changes were proposed in this pull request?

When storing decimal(p, s) to Derby, if p > 31, s is wrongly hardcoded to 5, which was assumed to be Derby's default decimal scale. In fact, 0 is the default scale and 5 is the default precision: https://db.apache.org/derby/docs/10.13/ref/rrefsqlj15260.html

This PR instead calculates a suitable scale that leaves room for the integral part of the precision, as sketched below.
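The merged change lives in Spark's DerbyDialect; the following is only a minimal Scala sketch of the idea, assuming Derby's documented maximum DECIMAL precision of 31. The object and method names here are hypothetical, and the exact formula in the merged patch may differ:

```scala
// Hypothetical sketch; the real change is in org.apache.spark.sql.jdbc.DerbyDialect.
object DerbyDecimalMapping {
  // Derby's documented maximum DECIMAL precision.
  val MaxPrecision = 31

  /** Map a Catalyst decimal(p, s) to a Derby-compatible (precision, scale). */
  def adjust(precision: Int, scale: Int): (Int, Int) = {
    if (precision <= MaxPrecision) {
      (precision, scale)
    } else {
      // Preserve all integral digits (p - s) and give whatever room is left
      // to the scale, instead of hardcoding scale = 5.
      val integralDigits = precision - scale
      val newScale = math.max(0, MaxPrecision - integralDigits)
      (MaxPrecision, newScale)
    }
  }
}

// Example: decimal(38, 10) has 28 integral digits, so it maps to DECIMAL(31, 3)
// rather than the old hardcoded DECIMAL(31, 5), which could not even hold the
// integral part.
```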

Why are the changes needed?

To avoid precision loss when writing wide decimal values to Derby.

Does this PR introduce any user-facing change?

Yes, but Derby is rare in production environments, and the new mapping is compatible with most use cases.

How was this patch tested?

New tests.

Was this patch authored or co-authored using generative AI tooling?

No.

@github-actions github-actions bot added the SQL label May 28, 2024
@yaooqinn
Member Author

cc @dongjoon-hyun @cloud-fan thanks

@LuciferYang
Contributor

LGTM

@yaooqinn yaooqinn deleted the SPARK-48439 branch May 30, 2024 09:32
@LuciferYang
Contributor

Merged into master for Spark 4.0. Thanks @yaooqinn

@LuciferYang LuciferYang changed the title [SPARK-47361][SQL] Derby: Calculate suitable precision and scale for DECIMAL type [SPARK-48439][SQL] Derby: Calculate suitable precision and scale for DECIMAL type May 30, 2024