
[Spark] Type Widening preview #2937

Merged · 2 commits into delta-io:master · Apr 24, 2024

Conversation

@johanl-db (Collaborator) commented on Apr 22, 2024

Description

Expose the type widening table feature outside of testing and set its user-facing preview name: typeWidening-preview (instead of typeWidening-dev used until now).

Feature description: #2622
The type changes supported for now are byte -> short -> int. Other type changes depend on Spark changes that are going to land in Spark 4.0 and will be available once Delta picks up that Spark version.
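
As an illustration of the supported chain and the new preview name, here is a minimal sketch (the table and column names are hypothetical, and the exact SHOW TBLPROPERTIES output may differ):

CREATE TABLE t (col byte) USING delta TBLPROPERTIES ('delta.enableTypeWidening' = 'true');
-- Widen along the currently supported chain: byte -> short -> int
ALTER TABLE t CHANGE COLUMN col TYPE short;
ALTER TABLE t CHANGE COLUMN col TYPE int;
-- The table feature is expected to surface under its preview name, e.g.
-- delta.feature.typeWidening-preview = supported
SHOW TBLPROPERTIES t;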

How was this patch tested?

Extensive testing in DeltaTypeWidening*Suite.

Does this PR introduce any user-facing changes?

User-facing changes were already covered in the PRs implementing this feature. In short, it allows (a combined sketch of these steps follows the list):

  • Adding the type widening table feature (using a table property)
ALTER TABLE t SET TBLPROPERTIES ('delta.enableTypeWidening' = 'true');
  • Manual type changes:
ALTER TABLE t CHANGE COLUMN col TYPE INT;
  • Automatic type changes via schema evolution:
CREATE TABLE target (id int, value short);
CREATE TABLE source (id int, value int);
SET spark.databricks.delta.schema.autoMerge.enabled = true;
INSERT INTO target SELECT * FROM source;
-- value now has type int in target
  • Dropping the table feature, which rewrites data to make the table readable by all readers:
ALTER TABLE t DROP FEATURE 'typeWidening';
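
For completeness, a hedged sketch that ties the steps above together on hypothetical tables (assuming, as with manual changes, that delta.enableTypeWidening must be set on the target for automatic widening):

CREATE TABLE target (id int, value short) USING delta TBLPROPERTIES ('delta.enableTypeWidening' = 'true');
CREATE TABLE source (id int, value int) USING delta;
SET spark.databricks.delta.schema.autoMerge.enabled = true;
INSERT INTO target SELECT * FROM source;
DESCRIBE TABLE target;  -- value is expected to now have type int
-- Once no further type changes are needed, drop the feature so any reader can read the table
ALTER TABLE target DROP FEATURE 'typeWidening';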

@tdas tdas merged commit 5ace827 into delta-io:master Apr 24, 2024
7 of 8 checks passed