
Rearrange imports in databricks.sdk.runtime to improve local editor experience #219

Merged
merged 1 commit into databricks:main from ide-typing on Jul 17, 2023

Conversation

judahrand
Contributor

@judahrand judahrand commented Jul 7, 2023

Changes

The type hinting here means that VSCode does not give useful syntax highlights / code completions.

This is the current experience on main:
(two screenshots; see the image links in the merge commit message below)

With these changes this becomes:
(two screenshots; see the image links in the merge commit message below)

Tests

  • make test run locally
  • make fmt applied
  • relevant integration tests applied

@judahrand judahrand force-pushed the ide-typing branch 2 times, most recently from 71769c5 to 66c2fe9 on July 7, 2023 at 10:17
@nfx
Contributor

nfx commented Jul 10, 2023

@MrBago @kartikgupta-db could you take a look and check that it doesn't break anything?

Contributor

@kartikgupta-db kartikgupta-db left a comment


LGTM for the OSS part.

@nfx
Contributor

nfx commented Jul 10, 2023

@MrBago @xiaochen-db need review/approval from the DBR side

    except (ImportError, NameError):
        from databricks.sdk.dbutils import RemoteDbUtils

        # this assumes that all environment variables are set
        dbutils = RemoteDbUtils()
        dbutils_type = RemoteDbUtils


Can you enlighten me: if the following branch

    try:
        from . import stub
        from .stub import *
        dbutils_type = Type[stub.dbutils]

doesn't throw any exception, then what is dbutils? From stub.py it's a class, so why are you casting dbutils (a class) to dbutils_type (a Type)?

Contributor Author

@judahrand judahrand Jul 12, 2023


We've defined dbutils_type in two places, so it ends up being a Union of the class from the stub and the RemoteDbUtils class. We then cast to this Union.
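For reference, here is a minimal sketch of the arrangement under discussion, stitched together from the two snippets quoted above; the surrounding module layout is assumed, and the final `cast` line just spells out the explanation above rather than quoting the actual diff:

```python
from typing import Type, cast

try:
    # inside a Databricks runtime: the stub module supplies a fully typed dbutils
    from . import stub
    from .stub import *
    dbutils_type = Type[stub.dbutils]
except (ImportError, NameError):
    # locally: fall back to the OSS implementation
    from databricks.sdk.dbutils import RemoteDbUtils

    # this assumes that all environment variables are set
    dbutils = RemoteDbUtils()
    dbutils_type = RemoteDbUtils

# dbutils_type is assigned in both branches, so a type checker treats it as the
# union Type[stub.dbutils] | Type[RemoteDbUtils]; dbutils is then cast against it
dbutils = cast(dbutils_type, dbutils)
```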



Do we really want the union type? It seems like the `RemoteDbUtils` type doesn't have useful info for the user; can we just suppress it?

@nfx nfx changed the title from "Rearrange imports to improve local editor experience" to "Rearrange imports in databricks.sdk.runtime to improve local editor experience" on Jul 17, 2023
@nfx nfx merged commit 6f93548 into databricks:main Jul 17, 2023
MichaelSpece pushed a commit to MichaelSpece/databricks-sdk-py that referenced this pull request Jul 17, 2023
… experience (databricks#219)

## Changes

The type hinting here means that VSCode does not give useful syntax
highlights / code completions.

This is the current experience on `main`:
<img width="428" alt="image"
src="https://github.com/databricks/databricks-sdk-py/assets/17158624/72d2c3eb-cc3a-4f95-9f09-7d43a8f2815e">
<img width="428" alt="image"
src="https://github.com/databricks/databricks-sdk-py/assets/17158624/34b04de0-5996-4a2e-a0d2-b26a8c9d3da9">

With these changes this becomes:
<img width="428" alt="image"
src="https://github.com/databricks/databricks-sdk-py/assets/17158624/99a91d82-f06a-4883-b131-7f96a82edd80">
<img width="818" alt="image"
src="https://github.com/databricks/databricks-sdk-py/assets/17158624/ce684fd0-f550-4afe-bc46-1187b6dd4b49">

## Tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
MichaelSpece pushed a commit to MichaelSpece/databricks-sdk-py that referenced this pull request Jul 17, 2023
[DECO-1115] Add local implementation for `dbutils.widgets` (databricks#93)

* Added a new install group (`pip install 'databricks-sdk[notebook]'`).
This allows us to safely pin ipywidgets for local installs. DBR can
safely continue using `pip install databricks-sdk` or directly using the
default build from master without conflicting dependencies.
* OSS implementation of widgets is imported only on first use (possible
only through the OSS implementation of dbutils - `RemoteDbUtils`).
* Add a wrapper for ipywidgets to enable interactive widgets when in
interactive **IPython** notebooks.

https://user-images.githubusercontent.com/88345179/236443693-1c804107-ba21-4296-ba40-2b1e8e062d16.mov

* Add a default widgets implementation that returns a default value when
not in an interactive environment (a rough sketch follows below).

https://user-images.githubusercontent.com/88345179/236443729-51185404-4d28-49c6-ade0-a665e154e092.mov
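The default, non-interactive behavior from the last bullet can be pictured with a rough sketch like the following; the class and method names here are illustrative only, not the SDK's actual implementation:

```python
class _DefaultWidgets:
    """Fallback used outside interactive IPython notebooks: getters simply return
    the default value that was registered when the widget was declared."""

    def __init__(self):
        self._defaults = {}

    def text(self, name, defaultValue, label=None):
        # declaring a widget outside a notebook only records its default value
        self._defaults[name] = defaultValue

    def get(self, name):
        # there is no UI to read from, so the recorded default is returned
        return self._defaults[name]
```

In other words, `dbutils.widgets.get("x")` outside a notebook just hands back whatever default was supplied when the widget was declared.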


- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied

Fix error message, ExportFormat -> ImportFormat (databricks#220)

The proper argument is `ImportFormat.AUTO`, not `ExportFormat.AUTO`.

Correct the error message when `ImportFormat` is not provided to
`workspace.upload`.

Signed-off-by: Jessica Smith <8505845+NodeJSmith@users.noreply.github.com>
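For context, the corrected usage described in databricks#220 looks roughly like this; the keyword arguments are an assumption based on the description above, so check the SDK reference for the exact `workspace.upload` signature:

```python
import io

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ImportFormat

w = WorkspaceClient()

# ImportFormat (not ExportFormat) is the enum that workspace.upload expects
w.workspace.upload(
    "/Users/someone@example.com/notes.txt",
    io.BytesIO(b"hello"),
    format=ImportFormat.AUTO,
    overwrite=True,
)
```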

Regenerate Python SDK using recent OpenAPI Specification (databricks#229)

Spec commit sha: 17a3f7fe6 (7 July 2023)

Breaking Changes:
* Use CONSTANT_CASE for Enum constants. Many enums already use constant
case in their definition, but some constants (like the SCIM Patch schema
name) include symbols `:` and numbers, so the SDK cannot use the enum
value as the member name (see the example after this list).
* Replace Query type with AlertQuery in sql.Alert class.
* Removal of User.is_db_admin and User.profile_image_url.
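To illustrate the naming constraint from the first bullet (the member name below is hypothetical; the value is the standard SCIM PatchOp schema URN):

```python
from enum import Enum

class PatchSchema(Enum):
    # the raw value contains ':' and digits, so it cannot double as a Python
    # identifier; the generator emits a CONSTANT_CASE member name instead
    URN_IETF_PARAMS_SCIM_API_MESSAGES_2_0_PATCH_OP = "urn:ietf:params:scim:api:messages:2.0:PatchOp"
```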

Changes:
* Introduce CleanRooms API
* Introduce TablesAPI.update()
* Introduce Group.meta property
* Fix SCIM Patch implementation
* Introduce BaseRun.job_parameters and BaseRun.trigger_info
* Introduce CreateJob.parameters
* Fix spelling in file arrival trigger configuration
* Introduce GitSource.job_source
* Introduce RepairRun.rerun_dependent_tasks
* Introduce Resolved*Values classes, RunIf, and RunJobTask
* Introduce TaskNotificationSettings

Later follow-up:
* Names should split on Pascal-case word boundaries (see
CloudProviderNodeStatus). This is an OpenAPI code gen change that needs
to be made.

Make workspace client also return runtime dbutils when in dbr (databricks#210)

* `workspace_client.dbutils` always returned the OSS implementation of
dbutils. We want it to also use the DBR implementation when running in DBR
(a sketch follows below).

* [x] Manually Test in dbr
* [x] Test locally
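A hypothetical sketch of the selection logic; the `DATABRICKS_RUNTIME_VERSION` check and the attribute names are assumptions, not the SDK's actual code:

```python
import os

class WorkspaceClient:
    @property
    def dbutils(self):
        if "DATABRICKS_RUNTIME_VERSION" in os.environ:
            # running inside DBR: hand back the runtime-provided dbutils
            from databricks.sdk.runtime import dbutils as runtime_dbutils
            return runtime_dbutils
        # otherwise keep the previous behavior: the OSS implementation
        from databricks.sdk.dbutils import RemoteDbUtils
        return RemoteDbUtils(self._config)  # hypothetical config attribute
```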

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied

Use .ConstantName defining target enum states for waiters (databricks#230)

Uses of enums in generated code need to be updated to use
`{{.ConstantName}}` instead of `{{.Content}}`.

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

Fix enum deserialization (databricks#234)

In databricks#230, enums were changed so that enum field names did not necessarily
match the enum value itself. However, the `_enum` helper method used
during deserialization of a response containing an enum was not updated
to handle this case. This PR corrects this method to check through the
values of the `__members__` of an enum, as opposed to the keys.
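A sketch of what the corrected lookup amounts to; the helper's name and signature are taken from the description above, not from the SDK's exact code:

```python
from enum import Enum
from typing import Optional, Type, TypeVar

E = TypeVar("E", bound=Enum)

def _enum(d: dict, field: str, cls: Type[E]) -> Optional[E]:
    raw = d.get(field)
    if raw is None:
        return None
    # compare against member *values*, not member names: after databricks#230 the
    # CONSTANT_CASE member names no longer necessarily match the wire values
    for member in cls.__members__.values():
        if member.value == raw:
            return member
    return None
```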


- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

Fix enum deserialization, take 2 (databricks#235)

We jumped the gun on databricks#234. This is the actual change that
fixes the integration tests.

- [x] The two failing integration tests (test_submitting_jobs and
test_proxy_dbfs_mounts) both pass on this PR.

Added toolchain configuration to `.codegen.json` (databricks#236)

- Added toolchain config for automated releases
- Added `CHANGELOG.md` template with OpenAPI SHA

prep release changes

Make OpenAPI spec location configurable (databricks#237)

Introducing the `DATABRICKS_OPENAPI_SPEC` environment variable to hold the
filesystem location of the `all-internal.json` spec.

@nfx nfx mentioned this pull request Jul 17, 2023
nfx added a commit that referenced this pull request Jul 18, 2023
* Add Issue Templates ([#208](#208)).
* Fixed notebook native auth for jobs ([#209](#209)).
* Replace `datatime.timedelta()` with `datetime.timedelta()` in codebase ([#207](#207)).
* Support dod in python sdk ([#212](#212)).
* [DECO-1115] Add local implementation for `dbutils.widgets` ([#93](#93)).
* Fix error message, ExportFormat -> ImportFormat ([#220](#220)).
* Regenerate Python SDK using recent OpenAPI Specification ([#229](#229)).
* Make workspace client also return runtime dbutils when in dbr ([#210](#210)).
* Use .ConstantName defining target enum states for waiters ([#230](#230)).
* Fix enum deserialization ([#234](#234)).
* Fix enum deserialization, take 2 ([#235](#235)).
* Added toolchain configuration to `.codegen.json` ([#236](#236)).
* Make OpenAPI spec location configurable ([#237](#237)).
* Rearrange imports in `databricks.sdk.runtime` to improve local editor experience ([#219](#219)).

API Changes:

 * Removed `maintenance()` method for [w.metastores](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/metastores.html) workspace-level service.
 * Added `enable_optimization()` method for [w.metastores](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/metastores.html) workspace-level service.
 * Added `update()` method for [w.tables](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/tables.html) workspace-level service.
 * Added `force` field for `databricks.sdk.service.catalog.DeleteAccountMetastoreRequest`.
 * Added `force` field for `databricks.sdk.service.catalog.DeleteAccountStorageCredentialRequest`.
 * Removed `databricks.sdk.service.catalog.UpdateAutoMaintenance` dataclass.
 * Removed `databricks.sdk.service.catalog.UpdateAutoMaintenanceResponse` dataclass.
 * Added `databricks.sdk.service.catalog.UpdatePredictiveOptimization` dataclass.
 * Added `databricks.sdk.service.catalog.UpdatePredictiveOptimizationResponse` dataclass.
 * Added `databricks.sdk.service.catalog.UpdateTableRequest` dataclass.
 * Added `schema` field for `databricks.sdk.service.iam.PartialUpdate`.
 * Added `databricks.sdk.service.iam.PatchSchema` dataclass.
 * Added `trigger_info` field for `databricks.sdk.service.jobs.BaseRun`.
 * Added `health` field for `databricks.sdk.service.jobs.CreateJob`.
 * Added `job_source` field for `databricks.sdk.service.jobs.GitSource`.
 * Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.JobEmailNotifications`.
 * Added `health` field for `databricks.sdk.service.jobs.JobSettings`.
 * Added `trigger_info` field for `databricks.sdk.service.jobs.Run`.
 * Added `run_job_output` field for `databricks.sdk.service.jobs.RunOutput`.
 * Added `run_job_task` field for `databricks.sdk.service.jobs.RunTask`.
 * Added `email_notifications` field for `databricks.sdk.service.jobs.SubmitRun`.
 * Added `health` field for `databricks.sdk.service.jobs.SubmitRun`.
 * Added `email_notifications` field for `databricks.sdk.service.jobs.SubmitTask`.
 * Added `health` field for `databricks.sdk.service.jobs.SubmitTask`.
 * Added `notification_settings` field for `databricks.sdk.service.jobs.SubmitTask`.
 * Added `health` field for `databricks.sdk.service.jobs.Task`.
 * Added `run_job_task` field for `databricks.sdk.service.jobs.Task`.
 * Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.TaskEmailNotifications`.
 * Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.WebhookNotifications`.
 * Added `databricks.sdk.service.jobs.JobSource` dataclass.
 * Added `databricks.sdk.service.jobs.JobSourceDirtyState` dataclass.
 * Added `databricks.sdk.service.jobs.JobsHealthMetric` dataclass.
 * Added `databricks.sdk.service.jobs.JobsHealthOperator` dataclass.
 * Added `databricks.sdk.service.jobs.JobsHealthRule` dataclass.
 * Added `databricks.sdk.service.jobs.JobsHealthRules` dataclass.
 * Added `databricks.sdk.service.jobs.RunJobOutput` dataclass.
 * Added `databricks.sdk.service.jobs.RunJobTask` dataclass.
 * Added `databricks.sdk.service.jobs.TriggerInfo` dataclass.
 * Added `databricks.sdk.service.jobs.WebhookNotificationsOnDurationWarningThresholdExceededItem` dataclass.
 * Removed `whl` field for `databricks.sdk.service.pipelines.PipelineLibrary`.
 * Changed `delete_personal_compute_setting()` method for [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html) account-level service with new required argument order.
 * Changed `read_personal_compute_setting()` method for [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html) account-level service with new required argument order.
 * Changed `etag` field for `databricks.sdk.service.settings.DeletePersonalComputeSettingRequest` to be required.
 * Changed `etag` field for `databricks.sdk.service.settings.ReadPersonalComputeSettingRequest` to be required.
 * Added [w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html) workspace-level service.
 * Added `databricks.sdk.service.sharing.CentralCleanRoomInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomAssetInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomCatalog` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomCatalogUpdate` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomCollaboratorInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomNotebookInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomTableInfo` dataclass.
 * Added `databricks.sdk.service.sharing.ColumnInfo` dataclass.
 * Added `databricks.sdk.service.sharing.ColumnMask` dataclass.
 * Added `databricks.sdk.service.sharing.ColumnTypeName` dataclass.
 * Added `databricks.sdk.service.sharing.CreateCleanRoom` dataclass.
 * Added `databricks.sdk.service.sharing.DeleteCleanRoomRequest` dataclass.
 * Added `databricks.sdk.service.sharing.GetCleanRoomRequest` dataclass.
 * Added `databricks.sdk.service.sharing.ListCleanRoomsResponse` dataclass.
 * Added `databricks.sdk.service.sharing.UpdateCleanRoom` dataclass.
 * Changed `query` field for `databricks.sdk.service.sql.Alert` to `databricks.sdk.service.sql.AlertQuery` dataclass.
 * Changed `value` field for `databricks.sdk.service.sql.AlertOptions` to `any` dataclass.
 * Removed `is_db_admin` field for `databricks.sdk.service.sql.User`.
 * Removed `profile_image_url` field for `databricks.sdk.service.sql.User`.
 * Added `databricks.sdk.service.sql.AlertQuery` dataclass.

OpenAPI SHA: 36bb2292d778b9955eb3b799a39be94a83049b43, Date: 2023-07-18
nfx added a commit that referenced this pull request Jul 18, 2023
* Add Issue Templates ([#208](#208)).
* Fixed notebook native auth for jobs ([#209](#209)).
* Replace `datatime.timedelta()` with `datetime.timedelta()` in codebase ([#207](#207)).
* Support dod in python sdk ([#212](#212)).
* [DECO-1115] Add local implementation for `dbutils.widgets` ([#93](#93)).
* Fix error message, ExportFormat -> ImportFormat ([#220](#220)).
* Regenerate Python SDK using recent OpenAPI Specification ([#229](#229)).
* Make workspace client also return runtime dbutils when in dbr ([#210](#210)).
* Use .ConstantName defining target enum states for waiters ([#230](#230)).
* Fix enum deserialization ([#234](#234)).
* Fix enum deserialization, take 2 ([#235](#235)).
* Added toolchain configuration to `.codegen.json` ([#236](#236)).
* Make OpenAPI spec location configurable ([#237](#237)).
* Rearrange imports in `databricks.sdk.runtime` to improve local editor experience ([#219](#219)).
* Updated account-level and workspace-level user management examples ([#241](#241)).

API Changes:

 * Removed `maintenance()` method for [w.metastores](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/metastores.html) workspace-level service.
 * Added `enable_optimization()` method for [w.metastores](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/metastores.html) workspace-level service.
 * Added `update()` method for [w.tables](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/tables.html) workspace-level service.
 * Added `force` field for `databricks.sdk.service.catalog.DeleteAccountMetastoreRequest`.
 * Added `force` field for `databricks.sdk.service.catalog.DeleteAccountStorageCredentialRequest`.
 * Removed `databricks.sdk.service.catalog.UpdateAutoMaintenance` dataclass.
 * Removed `databricks.sdk.service.catalog.UpdateAutoMaintenanceResponse` dataclass.
 * Added `databricks.sdk.service.catalog.UpdatePredictiveOptimization` dataclass.
 * Added `databricks.sdk.service.catalog.UpdatePredictiveOptimizationResponse` dataclass.
 * Added `databricks.sdk.service.catalog.UpdateTableRequest` dataclass.
 * Added `schema` field for `databricks.sdk.service.iam.PartialUpdate`.
 * Added `databricks.sdk.service.iam.PatchSchema` dataclass.
 * Added `trigger_info` field for `databricks.sdk.service.jobs.BaseRun`.
 * Added `health` field for `databricks.sdk.service.jobs.CreateJob`.
 * Added `job_source` field for `databricks.sdk.service.jobs.GitSource`.
 * Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.JobEmailNotifications`.
 * Added `health` field for `databricks.sdk.service.jobs.JobSettings`.
 * Added `trigger_info` field for `databricks.sdk.service.jobs.Run`.
 * Added `run_job_output` field for `databricks.sdk.service.jobs.RunOutput`.
 * Added `run_job_task` field for `databricks.sdk.service.jobs.RunTask`.
 * Added `email_notifications` field for `databricks.sdk.service.jobs.SubmitRun`.
 * Added `health` field for `databricks.sdk.service.jobs.SubmitRun`.
 * Added `email_notifications` field for `databricks.sdk.service.jobs.SubmitTask`.
 * Added `health` field for `databricks.sdk.service.jobs.SubmitTask`.
 * Added `notification_settings` field for `databricks.sdk.service.jobs.SubmitTask`.
 * Added `health` field for `databricks.sdk.service.jobs.Task`.
 * Added `run_job_task` field for `databricks.sdk.service.jobs.Task`.
 * Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.TaskEmailNotifications`.
 * Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.WebhookNotifications`.
 * Added `databricks.sdk.service.jobs.JobSource` dataclass.
 * Added `databricks.sdk.service.jobs.JobSourceDirtyState` dataclass.
 * Added `databricks.sdk.service.jobs.JobsHealthMetric` dataclass.
 * Added `databricks.sdk.service.jobs.JobsHealthOperator` dataclass.
 * Added `databricks.sdk.service.jobs.JobsHealthRule` dataclass.
 * Added `databricks.sdk.service.jobs.JobsHealthRules` dataclass.
 * Added `databricks.sdk.service.jobs.RunJobOutput` dataclass.
 * Added `databricks.sdk.service.jobs.RunJobTask` dataclass.
 * Added `databricks.sdk.service.jobs.TriggerInfo` dataclass.
 * Added `databricks.sdk.service.jobs.WebhookNotificationsOnDurationWarningThresholdExceededItem` dataclass.
 * Removed `whl` field for `databricks.sdk.service.pipelines.PipelineLibrary`.
 * Changed `delete_personal_compute_setting()` method for [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html) account-level service with new required argument order.
 * Changed `read_personal_compute_setting()` method for [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html) account-level service with new required argument order.
 * Changed `etag` field for `databricks.sdk.service.settings.DeletePersonalComputeSettingRequest` to be required.
 * Changed `etag` field for `databricks.sdk.service.settings.ReadPersonalComputeSettingRequest` to be required.
 * Added [w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html) workspace-level service.
 * Added `databricks.sdk.service.sharing.CentralCleanRoomInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomAssetInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomCatalog` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomCatalogUpdate` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomCollaboratorInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomNotebookInfo` dataclass.
 * Added `databricks.sdk.service.sharing.CleanRoomTableInfo` dataclass.
 * Added `databricks.sdk.service.sharing.ColumnInfo` dataclass.
 * Added `databricks.sdk.service.sharing.ColumnMask` dataclass.
 * Added `databricks.sdk.service.sharing.ColumnTypeName` dataclass.
 * Added `databricks.sdk.service.sharing.CreateCleanRoom` dataclass.
 * Added `databricks.sdk.service.sharing.DeleteCleanRoomRequest` dataclass.
 * Added `databricks.sdk.service.sharing.GetCleanRoomRequest` dataclass.
 * Added `databricks.sdk.service.sharing.ListCleanRoomsResponse` dataclass.
 * Added `databricks.sdk.service.sharing.UpdateCleanRoom` dataclass.
 * Changed `query` field for `databricks.sdk.service.sql.Alert` to `databricks.sdk.service.sql.AlertQuery` dataclass.
 * Changed `value` field for `databricks.sdk.service.sql.AlertOptions` to `any` dataclass.
 * Removed `is_db_admin` field for `databricks.sdk.service.sql.User`.
 * Removed `profile_image_url` field for `databricks.sdk.service.sql.User`.
 * Added `databricks.sdk.service.sql.AlertQuery` dataclass.

OpenAPI SHA: 0a1949ba96f71680dad30e06973eaae85b1307bb, Date: 2023-07-18