[SPARK-42569][CONNECT] Throw unsupported exceptions for non-supported API #40164
Conversation
    new DataFrameWriterV2[T](table, this)
  }

  def unpersist(blocking: Boolean): this.type = {
Did you skip persist for a reason?
persist is implemented on the Python side, I believe.
Oh I now remember:
def persist(): this.type
def persist(newLevel: StorageLevel): this.type
There is a `StorageLevel` class in the signature, which comes from the core module. Can the Scala client reuse classes from core for now? (IIUC the Scala client aims to get rid of the core dependency eventually.)
Yeah, we can use the storage enum. We will move the enum to common/util later on.
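For context, here is a minimal sketch of what the persist follow-up could look like under the approach agreed above. It is an illustration only: the class name SketchDataset, the exception type, and the messages are assumptions, not the actual patch.

import org.apache.spark.storage.StorageLevel

// Hypothetical sketch, not the merged change: a Dataset-like class whose
// persist overloads simply throw until the Connect Scala client supports them.
class SketchDataset[T] {
  def persist(): this.type =
    throw new UnsupportedOperationException("persist is not implemented.")

  // StorageLevel still comes from the core module for now; per the discussion
  // above it is expected to move to a common module later on.
  def persist(newLevel: StorageLevel): this.type =
    throw new UnsupportedOperationException("persist is not implemented.")
}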
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Dataset.scala
I am merging this one. Can you do `persist` in a follow-up?
[SPARK-42569][CONNECT] Throw unsupported exceptions for non-supported API

What changes were proposed in this pull request?
Match https://github.com/apache/spark/blob/6a2433070e60ad02c69ae45706a49cdd0b88a082/python/pyspark/sql/connect/dataframe.py#L1500 to throw unsupported exceptions in the Scala client.

Why are the changes needed?
Better indicates that an API is not yet supported.

Does this PR introduce any user-facing change?
NO

How was this patch tested?
N/A

Closes #40164 from amaliujia/unsupported_op.
Authored-by: Rui Wang <rui.wang@databricks.com>
Signed-off-by: Herman van Hovell <herman@databricks.com>
(cherry picked from commit 2f9e5d5)
Signed-off-by: Herman van Hovell <herman@databricks.com>
…rsist

What changes were proposed in this pull request?
Follow up to #40164: also throw an unsupported operation exception for `persist`. Right now it is OK to depend on `StorageLevel` from the core module, but in the future it should be refactored and moved to a common module.

Why are the changes needed?
Better way to indicate a non-supported API.

Does this PR introduce any user-facing change?
NO

How was this patch tested?
N/A

Closes #40172 from amaliujia/unsupported_op_2.
Authored-by: Rui Wang <rui.wang@databricks.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
What changes were proposed in this pull request?
Match python/pyspark/sql/connect/dataframe.py (line 1500 at commit 6a24330) to throw unsupported exceptions in the Scala client.
Why are the changes needed?
Better indicates that an API is not yet supported.
Does this PR introduce any user-facing change?
NO
How was this patch tested?
N/A
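For illustration, here is a hedged sketch of the pattern this PR applies in the Scala client: mirroring the referenced line of the Python client, a not-yet-supported method such as unpersist throws instead of building a plan. The class name, exception type, and messages below are assumptions and may differ from the merged change.

// Sketch only (assumed exception type and messages): unpersist stubs that
// throw for an API the Connect Scala client does not support yet.
class SketchDataset[T] {
  def unpersist(blocking: Boolean): this.type =
    throw new UnsupportedOperationException("unpersist is not implemented.")

  def unpersist(): this.type =
    throw new UnsupportedOperationException("unpersist is not implemented.")
}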