allow user-specified schema in read if it's consistent #3929
Conversation
@allisonport-db @cloud-fan will this very important fix be in the 3.3.0 release?
cc @tdas |
Hi, @tdas any news about this fix? Thanks! |
Hi, @cloud-fan @tdas any update? Anything?
Hi @nimrod-doubleverify, apologies for the delay here. Unfortunately this didn't make it into the 3.3.0 release, but we'll work on getting a 3.3.1 release out this week.
Resolved review threads (now outdated):
- spark/src/main/scala/org/apache/spark/sql/delta/sources/DeltaDataSource.scala
- spark/src/test/scala/org/apache/spark/sql/delta/DeltaSourceSuite.scala
Thank you for debugging and fixing this, @cloud-fan
#### Which Delta project/connector is this regarding?

- [x] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)

## Description

A user-specified schema may come from the catalog if the Delta table is stored in an external catalog that syncs the table schema with the Delta log. We should allow it if it's the same as the real Delta table schema. This is already the case for batch reads, see apache/spark#15046.

This PR changes the Delta streaming read to allow it as well.

Note: since Delta uses DS v2 (`TableProvider`) and explicitly declares that user-specified schemas are not supported (`TableProvider#supportsExternalMetadata` returns false by default), end users still can't specify a schema via `spark.read/readStream.schema`. This change is only for advanced Spark plugins that can construct logical plans that trigger the Delta v1 source stream scan.

## How was this patch tested?

A new test.

## Does this PR introduce _any_ user-facing changes?

No
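To make the rule concrete, here is a minimal, self-contained sketch of the consistency check described above. The object and method names are illustrative, not Delta's internal API, and the real check may normalize case and nullability rather than requiring strict equality:

```scala
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

object SchemaConsistencySketch {
  // Accept a user-specified schema only if it equals the schema read from the
  // Delta log; otherwise fail fast, since a mismatched schema would produce a
  // silently wrong read. (Strict equality keeps this sketch simple.)
  def validateUserSchema(
      userSpecifiedSchema: Option[StructType],
      deltaLogSchema: StructType): Unit = {
    userSpecifiedSchema.foreach { userSchema =>
      require(
        userSchema == deltaLogSchema,
        s"Specified schema $userSchema does not match the Delta table schema $deltaLogSchema")
    }
  }

  def main(args: Array[String]): Unit = {
    val tableSchema = StructType(Seq(
      StructField("id", IntegerType),
      StructField("name", StringType)))

    validateUserSchema(None, tableSchema)              // no user schema: always allowed
    validateUserSchema(Some(tableSchema), tableSchema) // consistent: allowed by this PR

    val mismatched = StructType(Seq(StructField("id", StringType)))
    try {
      validateUserSchema(Some(mismatched), tableSchema) // inconsistent: still rejected
    } catch {
      case e: IllegalArgumentException => println(s"Rejected as expected: ${e.getMessage}")
    }
  }
}
```

Note that because `TableProvider#supportsExternalMetadata` returns false, Spark itself rejects `spark.readStream.format("delta").schema(...)` before any Delta code runs, so a check like this is only reachable from plans built by Spark plugins.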
…4125) backport #3929 to 3.3
#### Which Delta project/connector is this regarding?

- [x] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)

## Description

This is a follow-up on #3929.

It should be possible to provide a schema when the stream is created. This is desired in Unity Catalog OSS, which provides the schema (from the catalog) before the source is created; the previous behavior caused the problem described in unitycatalog/unitycatalog#715. A usage sketch of this scenario follows below.

I've also changed the SQL error code and the actual error. I'm not sure the error code is the best one, but it matches similar schema-conflict errors from Spark core.

## How was this patch tested?

Delta integration tests and the integration test for streaming I've written in Unity Catalog (not published yet).

## Does this PR introduce _any_ user-facing changes?

No

Signed-off-by: Artur Owczarek <owczarek.artur@gmail.com>
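Here is a hedged usage sketch of the scenario this follow-up targets: a streaming read of a Delta table resolved through a catalog that supplies the schema when the source is created. The catalog and table names are hypothetical, and a configured Unity Catalog OSS setup is assumed:

```scala
import org.apache.spark.sql.SparkSession

object CatalogStreamReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("catalog-stream-read-sketch")
      .getOrCreate()

    // DataStreamReader.table (Spark 3.1+) resolves the table through the
    // configured catalog, which can attach a catalog-provided schema to the
    // source before it is created. With this follow-up, such a schema is
    // accepted as long as it is consistent with the schema in the Delta log.
    val events = spark.readStream.table("unity.default.events") // hypothetical table

    events.writeStream
      .format("console")
      .start()
      .awaitTermination()
  }
}
```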