
Conversation

@cloud-fan (Contributor) commented Dec 6, 2024

Which Delta project/connector is this regarding?

- [x] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)

Description

A user-specified schema may come from the catalog if the Delta table is stored in an external catalog that syncs the table schema with the Delta log. We should allow it as long as it matches the actual Delta table schema.

This is already the case for batch reads; see apache/spark#15046.

This PR changes the Delta streaming read to allow it as well.

Note: since Delta uses DS v2 (`TableProvider`) and explicitly claims that user-specified schemas are not supported (`TableProvider#supportsExternalMetadata` returns false by default), end users still can't specify a schema via `spark.read`/`readStream.schema`. This change is only for advanced Spark plugins that can construct logical plans to trigger the Delta v1 source stream scan.
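
To make the note concrete, here is a minimal sketch (the path and schema are placeholders, not from this PR) of the public-API call that remains unsupported:

```scala
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

// Hypothetical table path and schema, for illustration only; assumes an
// active SparkSession `spark` (e.g. in spark-shell with Delta configured).
val schema = StructType(Seq(
  StructField("id", IntegerType),
  StructField("value", StringType)))

// Still rejected: Delta's TableProvider reports supportsExternalMetadata =
// false, so Spark refuses a user-specified schema through the public reader
// API even when it matches the table's actual schema.
spark.readStream
  .format("delta")
  .schema(schema)
  .load("/tmp/delta/events")
```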

How was this patch tested?

A new test.

Does this PR introduce any user-facing changes?

No

@ni-mi commented Dec 10, 2024

@allisonport-db @cloud-fan will this very important fix be in the 3.3.0 release?

@ni-mi commented Dec 24, 2024

@allisonport-db @cloud-fan will this very important fix be in the 3.3.0 release?

@cloud-fan (Contributor, Author) commented:

cc @tdas

@ni-mi commented Jan 2, 2025

Hi,

@tdas any news about this fix? It's a blocker for using Delta tables with Structured Streaming via Open Accessibility Databricks and UC...

Thanks!

@carl-olin commented:

Hi,
I think I've experienced issues related to this when streaming from tables in Glue catalogs as well. If the schema is properly set in the table property `spark.sql.sources.schema`, any streaming read will fail. From the looks of it, this is handled by setting the schema to `{"type":"struct","fields":[]}`. The Glue schema is subsequently set to `col: array<byte>`, which breaks the table for tools like Athena.
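
For anyone hitting this, a quick way to check what the catalog stores (the table name below is a placeholder) is to read the property back through Spark SQL:

```scala
// Hypothetical table name. Prints the schema JSON stored in the catalog
// property for the table; in the situation described above it is the empty
// struct `{"type":"struct","fields":[]}`.
spark.sql(
  "SHOW TBLPROPERTIES my_db.my_delta_table ('spark.sql.sources.schema')"
).show(truncate = false)
```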

@nimrod-doubleverify commented:

@cloud-fan @tdas any update? Anything?

@raveeram-db (Collaborator) commented Feb 3, 2025

> @cloud-fan @tdas any update? Anything?

Hi @nimrod-doubleverify, apologies for the delay here. Unfortunately this didn't make it into the 3.3.0 release, but we'll work on getting a 3.3.1 release out this week.

@tdas merged commit 3873570 into delta-io:master on Feb 5, 2025 (16 of 19 checks passed)
@tdas (Contributor) commented Feb 5, 2025

Thank you for debugging and fixing this @cloud-fan

cloud-fan added a commit to cloud-fan/delta that referenced this pull request Feb 6, 2025
tdas pushed a commit that referenced this pull request Feb 6, 2025
…4125)

backport #3929 to 3.3
tdas pushed a commit that referenced this pull request Nov 6, 2025

#### Which Delta project/connector is this regarding?

- [X] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)

## Description


This is a follow-up to #3929.

When the stream is created, it should be allowed to provide a schema.

This feature is desired in Unity Catalog OSS, which provides the schema (from the catalog) when the source is created. This causes the problem described in unitycatalog/unitycatalog#715.

I've also changed the SQL error code and the actual error. I'm not sure the error code is the best one, but it matches similar schema-conflict errors from Spark core.
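
A rough sketch of the scenario (the catalog and table names are assumptions, not from this PR): the catalog resolves the table and hands its own copy of the schema to the Delta source when the stream is created, and with this change the stream starts as long as that schema matches the Delta log:

```scala
// Hypothetical Unity Catalog OSS table; `unity.default.events` is a placeholder.
// The catalog supplies the table schema when the source is created; previously
// the Delta streaming source rejected any externally provided schema, even an
// identical one.
val query = spark.readStream
  .table("unity.default.events")
  .writeStream
  .format("console")
  .start()
```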

## How was this patch tested?

Delta integration tests and the integration test for streaming I've
written in Unity Catalog (not published yet).


## Does this PR introduce _any_ user-facing changes?

No

Signed-off-by: Artur Owczarek <owczarek.artur@gmail.com>
01001101CK pushed a commit to 01001101CK/delta that referenced this pull request Nov 17, 2025