[SPARK-42328][SQL] Remove _LEGACY_ERROR_TEMP_1175 from error classes #45183
Conversation
```diff
-        errorClass = "_LEGACY_ERROR_TEMP_1175",
-        messageParameters = Map("dataType" -> field.dataType.catalogString))
+        errorClass = "UNSUPPORTED_DATATYPE",
+        messageParameters = Map("typeName" -> field.dataType.catalogString))
```
Could you quote the type using `toSQLType()`? (That function uses a slightly different converter of Catalyst types to strings, which is more consistent with other errors.)
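For reference, a minimal sketch of what the suggested change could look like, assuming the raising code mixes in `QueryErrorsBase` (which provides `toSQLType`); the surrounding `throw` is illustrative, not the exact code in the PR:

```scala
// Illustrative sketch only: toSQLType quotes the Catalyst type
// (e.g. "ARRAY<INT>") for consistency with other error messages,
// whereas catalogString renders it unquoted (e.g. array<int>).
throw new AnalysisException(
  errorClass = "UNSUPPORTED_DATATYPE",
  messageParameters = Map("typeName" -> toSQLType(field.dataType)))
```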
```scala
}
checkError(
  exception = intercept[AnalysisException] {
    converter.convertField(StructField("test", dummyDataType))
```
Is it possible to reproduce the error using the public API? If not, we should think of converting it to an internal error.
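For context, converting to an internal error in Spark typically means raising it through `SparkException.internalError`, roughly as in this sketch (the message wording here is an assumption, not the exact change):

```scala
import org.apache.spark.SparkException

// Sketch: an unreachable branch surfaces as INTERNAL_ERROR instead of a
// user-facing error class; toSQLType quotes the offending type.
throw SparkException.internalError(
  s"Cannot convert Spark data type ${toSQLType(field.dataType)} to any Parquet type.")
```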
@MaxGekk Replaced it with `INTERNAL_ERROR`; please check the updated PR description.
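A hedged sketch of how the updated test could assert the internal error; the intercepted exception class and the exact message are assumptions, and `dummyDataType` stands in for a type the converter does not support:

```scala
// Sketch only: errors raised via SparkException.internalError are
// SparkExceptions carrying the INTERNAL_ERROR error class.
checkError(
  exception = intercept[SparkException] {
    converter.convertField(StructField("test", dummyDataType))
  },
  errorClass = "INTERNAL_ERROR",
  parameters = Map("message" ->
    "Cannot convert Spark data type \"DUMMY\" to any Parquet type."))
```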
…ompilationErrors.scala Co-authored-by: Maxim Gekk <max.gekk@gmail.com>
+1, LGTM. Merging to master.

@nikolamand-db Congratulations on your first contribution to Apache Spark!
### What changes were proposed in this pull request?

The only occurrence of `_LEGACY_ERROR_TEMP_1175` appears in the conversion from Spark data types to Parquet. All supported documented [Spark data types](https://spark.apache.org/docs/latest/sql-ref-datatypes.html) are covered in the [conversion function](https://github.com/apache/spark/blob/3e0808c33f185c13808ce2d547ce9ba0057d31a6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala#L517-L745) (`VarcharType` and `CharType` are not present, but they are passed down as string before reaching the conversion function), so under normal circumstances a user can't trigger this error. Convert the error class to `INTERNAL_ERROR`.

### Why are the changes needed?

To remove legacy error classes, as part of the activity in [SPARK-37935](https://issues.apache.org/jira/browse/SPARK-37935).

### Does this PR introduce _any_ user-facing change?

If Spark works correctly, users shouldn't be able to run into `INTERNAL_ERROR` through the public API.

### How was this patch tested?

Added a test to `QueryCompilationErrorsSuite` and ran it with sbt:

```
project sql
testOnly *QueryCompilationErrorsSuite
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#45183 from nikolamand-db/nikolamand-db/SPARK-42328.

Authored-by: Nikola Mandic <nikola.mandic@databricks.com>
Signed-off-by: Max Gekk <max.gekk@gmail.com>
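As a side note on why the error is unreachable from SQL: `CHAR`/`VARCHAR` columns are written to Parquet as strings, so they never reach the converter as `CharType`/`VarcharType`. A hedged illustration of the assumed behavior, not taken from the PR:

```scala
// Illustration (assumed behavior): a CHAR column is stored as a Parquet
// string, so the unsupported-type branch is never reached from SQL.
spark.sql("CREATE TABLE t (c CHAR(5)) USING parquet")
spark.sql("INSERT INTO t VALUES ('abc')") // succeeds; c is written as STRING
```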