Allow using UnityCatalogTable in DataFrame.write_deltalake #3336
Comments
Hey @kevinzwang, I am planning to work on this issue over the next couple of days and believe it can be implemented with at least a couple of API choices. I would like to get your opinion, from Eventual's perspective, on which approach might be better.

Option A:

Option B:

Interested to hear your thoughts.
Hi @anilmenon14, thanks for offering to take this on! I was thinking the API would look something like this:

```python
from daft.unity_catalog import UnityCatalog

unity = UnityCatalog(endpoint=DATABRICKS_HOST, token=PAT_TOKEN)
table = unity.load_table("tbl_name")

df.write_deltalake(table, mode="overwrite")
```
Thanks for the direction @kevinzwang. This is indeed intuitive, and I had not looked into using the loaded table object this way.
Great! Looking forward to the PR.
I have just logged a PR adding Daft support for Unity Catalog table writes.
Thank you @anilmenon14, will be sure to give it a review soon!
Is your feature request related to a problem?
You can create a Unity Catalog table in Daft using daft.unity_catalog.UnityCatalog.load_table(tbl). At the moment, you can only use that table with daft.read_deltalake.
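For reference, a minimal sketch of the current read path, assuming a reachable Unity Catalog endpoint; the endpoint, token, and table name below are placeholders:

```python
import daft
from daft.unity_catalog import UnityCatalog

# Placeholders: substitute a real workspace endpoint, access token, and table name.
unity = UnityCatalog(endpoint="https://<workspace-host>", token="<personal-access-token>")
table = unity.load_table("catalog.schema.my_table")

# Reading already accepts the loaded Unity Catalog table object directly.
df = daft.read_deltalake(table)
```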
Describe the solution you'd like
We should also be able to similarly use it for DataFrame.write_deltalake.

Describe alternatives you've considered
We can extract the table URI and IO config from the Unity Catalog table and pass those into write_deltalake manually, as in the sketch below, but that is not preferred.
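A rough sketch of that workaround; the attribute names table_uri and io_config are assumptions about what the loaded table exposes, and the endpoint, token, and table name are placeholders:

```python
import daft
from daft.unity_catalog import UnityCatalog

# Placeholders: substitute a real workspace endpoint, access token, and table name.
unity = UnityCatalog(endpoint="https://<workspace-host>", token="<personal-access-token>")
table = unity.load_table("catalog.schema.my_table")
df = daft.read_deltalake(table)

# Instead of passing `table` directly, pull out the Delta table URI and the IO config
# by hand (attribute names assumed) and pass them to write_deltalake.
df.write_deltalake(table.table_uri, mode="overwrite", io_config=table.io_config)
```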
Additional Context

No response
Would you like to implement a fix?
No