REPLACE COLUMNS unsupported? #702
Comments
Delta Lake does not support deleting a column. This is an opinionated approach we have taken. We believe that deleting and renaming columns in tables lead to a lot of downstream confusion, and it's easy for folks to shoot themselves in the foot with it - incorrect results, data loss, etc. Hence we do not support it as of now.
Ah, ok. Thanks @tdas for clarifying. I misunderstood the 'REPLACE COLUMNS' example in the docs. I thought it was deleting 'colA' but it was actually just reordering it.
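For anyone else who misreads the docs the same way, here is a minimal sketch of the reordering case, using a hypothetical table and column names (not taken from the docs themselves):

```sql
-- Hypothetical Delta table with two columns.
CREATE TABLE events (colA INT, colB STRING) USING DELTA;

-- REPLACE COLUMNS lists the same columns in a new order:
-- nothing is dropped, the schema is only reordered.
ALTER TABLE events REPLACE COLUMNS (colB STRING, colA INT);
```

Listing fewer columns than the table currently has would amount to a delete, which is the case Delta Lake rejects.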
Apologies if I'm missing something basic, but I'm reopening this because I still haven't gotten … I tried creating a unit test that uses …
The test keeps throwing:
It seems like … Does anyone have a concrete example of …?
Wouldn't what TD said earlier explain the behaviour in 1.0.0?
Hey @jaceklaskowski, this example isn't deleting …
I see the following in the code:
My understanding is that the single-column … What's very interesting though is that …
Found it! See this comment in Spark SQL itself (before Delta Lake can do anything to alter / augment the behaviour):
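This also explains why `DeltaCatalog` ever sees a `DeleteColumn` change for a statement that only reorders columns. A sketch of the behaviour, assuming Spark's analyzer semantics for `REPLACE COLUMNS` (hypothetical table and column names):

```sql
-- Spark resolves REPLACE COLUMNS as "delete every existing column,
-- then add the listed columns", before any catalog-specific code runs.
ALTER TABLE events REPLACE COLUMNS (colB STRING, colA INT);
-- is handed to the catalog roughly as the change set:
--   DeleteColumn(colA), DeleteColumn(colB),
--   AddColumn(colB STRING), AddColumn(colA INT)
-- so a catalog that rejects DeleteColumn rejects REPLACE COLUMNS
-- even when the new column list only reorders the old one.
```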
Quick note, we currently have issue #732 to support column drop and rename. Closing this issue for now - thanks!
The Delta Lake 1.0.0 docs contain an example for replacing columns using `ALTER TABLE table_name REPLACE COLUMNS ...`. However, when I try to run this, I'm getting an exception from `DeltaCatalog`. I took a look at `DeltaCatalog` and confirmed that `alterTable()` doesn't handle `DeleteColumn`. Is this actually a supported scenario? I took a look through the unit tests and there seem to be no tests covering this.