[SPARK-41600][SPARK-41623][SPARK-41612][CONNECT] Implement Catalog.cacheTable, isCached and uncache #39919
Conversation
cc @zhengruifeng, @grundprinzip, @ueshin FYI
Should change back once #39882 is merged.
  Throw an analysis exception when the table does not exist.
- >>> spark.catalog.isCached("not_existing_table")
+ >>> spark.catalog.isCached("not_existing_table")  # doctest: +SKIP
Should re-enable this too once #39882 is merged.
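The diff above adds a `# doctest: +SKIP` directive, which tells Python's doctest runner to display an example without executing it, so the docstring can keep the example until the underlying behavior is restored. A minimal, self-contained illustration of the directive; `is_cached` here is a toy stand-in, not Spark code:

```python
import doctest

def is_cached(name, _cache=("people",)):
    """Toy stand-in used only to illustrate the +SKIP directive.

    >>> is_cached("people")
    True
    >>> is_cached("not_existing_table")  # doctest: +SKIP
    True
    """
    return name in _cache

# The second example would fail if run (it returns False), but +SKIP
# means doctest never executes it, so no failures are reported.
results = doctest.testmod()
print(results.failed)  # prints 0
```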
dongjoon-hyun left a comment
+1, LGTM (Pending CIs)
Merged to master and branch-3.4.
Thank you guys.
Closes #39919 from HyukjinKwon/SPARK-41600-SPARK-41623-SPARK-41612.
Authored-by: Hyukjin Kwon <gurwls223@apache.org>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
(cherry picked from commit 54b5cf6)
What changes were proposed in this pull request?
This PR adds the three APIs below to Spark Connect:
- Catalog.isCached
- Catalog.cacheTable
- Catalog.uncacheTable

Why are the changes needed?
These were not added initially because of a design concern about their behaviour. However, we should provide the same API compatibility and behaviour as regular PySpark in any event, so they are proposed back here.

Does this PR introduce any user-facing change?
No to end users. Yes to developers, because it adds three new APIs to Spark Connect.

How was this patch tested?
Unit tests were added.
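As a rough sketch of the semantics these three APIs follow (per the docstring in the diff above, each call throws an analysis exception when the table does not exist), here is a toy plain-Python stand-in. `ToyCatalog`, `create_table`, and this `AnalysisException` class are illustrative assumptions, not the actual Spark Connect implementation:

```python
class AnalysisException(Exception):
    """Stand-in for PySpark's AnalysisException."""

class ToyCatalog:
    """Toy model of the cache-related Catalog calls, not Spark code."""

    def __init__(self):
        self._tables = set()
        self._cached = set()

    def create_table(self, name):
        self._tables.add(name)

    def _check_exists(self, name):
        # All three cache APIs reject tables that do not exist.
        if name not in self._tables:
            raise AnalysisException(f"Table or view not found: {name}")

    def cacheTable(self, name):
        self._check_exists(name)
        self._cached.add(name)

    def isCached(self, name):
        self._check_exists(name)
        return name in self._cached

    def uncacheTable(self, name):
        self._check_exists(name)
        self._cached.discard(name)

catalog = ToyCatalog()
catalog.create_table("people")
catalog.cacheTable("people")
print(catalog.isCached("people"))   # True
catalog.uncacheTable("people")
print(catalog.isCached("people"))   # False
```

In real PySpark the calls are invoked the same way, as `spark.catalog.cacheTable(...)`, `spark.catalog.isCached(...)`, and `spark.catalog.uncacheTable(...)`; this PR wires those methods through the Spark Connect client.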