
Update databricks-sdk requirement from ~=0.9.0 to ~=0.10.0 #362

Merged
1 commit merged into main from dependabot/pip/databricks-sdk-approx-eq-0.10.0 on Oct 3, 2023

Conversation

dependabot[bot] (Contributor) commented on behalf of github on Oct 3, 2023

Updates the requirements on databricks-sdk to permit the latest version.
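
For context on the version range in the title, here is a small, self-contained illustration of what the compatible-release specifier ~=0.10.0 permits. It uses the third-party packaging library, which is an assumption about the reader's environment rather than anything this PR depends on.

```python
# Illustrative only: which databricks-sdk versions does the "~=0.10.0" pin accept?
# Requires the third-party `packaging` library (pip install packaging).
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet("~=0.10.0")  # equivalent to >=0.10.0,<0.11
for candidate in ["0.9.0", "0.10.0", "0.10.3", "0.11.0"]:
    verdict = "allowed" if Version(candidate) in spec else "blocked"
    print(candidate, verdict)
```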

Release notes

Sourced from databricks-sdk's releases.

v0.10.0

  • Respect retry_timeout_seconds config setting and align retry implementation with Go SDK (#337).
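
A minimal sketch of supplying the retry_timeout_seconds setting mentioned above; routing it through Config and the 600-second value are assumptions used for illustration, not taken from the release notes.

```python
# Hedged sketch: set the overall retry timeout that 0.10.0 starts respecting.
# Exact accepted kwargs can vary between SDK versions; treat this as illustrative.
from databricks.sdk import WorkspaceClient
from databricks.sdk.core import Config

cfg = Config(retry_timeout_seconds=600)  # placeholder: give up on retries after 10 minutes
w = WorkspaceClient(config=cfg)
print(cfg.retry_timeout_seconds)
```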

Breaking API Changes:

  • Changed list() method for a.account_metastore_assignments account-level service to return databricks.sdk.service.catalog.ListAccountMetastoreAssignmentsResponse dataclass.
  • Removed owner field for databricks.sdk.service.catalog.CreateConnection. Instead, use the owner field of UpdateConnection.
  • Removed options field for databricks.sdk.service.catalog.UpdateCatalog.
  • Changed job_parameters field for databricks.sdk.service.jobs.RunNow to databricks.sdk.service.jobs.ParamPairs dataclass.
  • Changed query() method for w.serving_endpoints workspace-level service. New request type is databricks.sdk.service.serving.QueryEndpointInput dataclass.
  • Renamed databricks.sdk.service.serving.QueryRequest dataclass to QueryEndpointInput.
  • Changed list() method for w.clean_rooms workspace-level service to require request of databricks.sdk.service.sharing.ListCleanRoomsRequest dataclass.
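
For the renames in the list above, a hedged migration sketch limited to imports; the release notes do not spell out the dataclass fields, so constructing or passing these objects is left out to avoid guessing.

```python
# Hedged sketch: adjust imports for the 0.10.0 renames listed above.
# Before 0.10.0 this dataclass was importable as QueryRequest:
#   from databricks.sdk.service.serving import QueryRequest
from databricks.sdk.service.serving import QueryEndpointInput

# clean_rooms list() now has a dedicated request dataclass as well:
from databricks.sdk.service.sharing import ListCleanRoomsRequest
```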

API Changes:

  • Added databricks.sdk.service.catalog.ListAccountMetastoreAssignmentsResponse dataclass.
  • Added job_parameters field for databricks.sdk.service.jobs.RepairRun.
  • Added job_parameters field for databricks.sdk.service.jobs.RunParameters.
  • Added notifications field for databricks.sdk.service.pipelines.CreatePipeline.
  • Added notifications field for databricks.sdk.service.pipelines.EditPipeline.
  • Added notifications field for databricks.sdk.service.pipelines.PipelineSpec.
  • Added databricks.sdk.service.pipelines.Notifications dataclass.
  • Added databricks.sdk.service.serving.DataframeSplitInput dataclass.
  • Added w.settings workspace-level service.
  • Added databricks.sdk.service.settings.DefaultNamespaceSetting dataclass.
  • Added databricks.sdk.service.settings.DeleteDefaultWorkspaceNamespaceRequest dataclass.
  • Added databricks.sdk.service.settings.DeleteDefaultWorkspaceNamespaceResponse dataclass.
  • Added databricks.sdk.service.settings.ReadDefaultWorkspaceNamespaceRequest dataclass.
  • Added databricks.sdk.service.settings.StringMessage dataclass.
  • Added databricks.sdk.service.settings.UpdateDefaultWorkspaceNamespaceRequest dataclass.
  • Added next_page_token field for databricks.sdk.service.sharing.ListCleanRoomsResponse.
  • Added databricks.sdk.service.sharing.ListCleanRoomsRequest dataclass.
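
One of the additions above, sketched briefly: passing job_parameters when triggering a job run. The job ID is a placeholder, and the plain dict of string key/value pairs is an assumption about the ParamPairs shape, which the notes do not define.

```python
# Hedged sketch: trigger a run with the job_parameters support surfaced in 0.10.0.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
waiter = w.jobs.run_now(
    job_id=123,  # placeholder job ID
    job_parameters={"input_path": "/tmp/example", "mode": "dry-run"},  # shape is an assumption
)
# waiter.result() would block until the run finishes; omitted here.
```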

OpenAPI SHA: bcbf6e851e3d82fd910940910dd31c10c059746c, Date: 2023-10-02

Changelog

Sourced from databricks-sdk's changelog.

0.10.0

  • Respect retry_timeout_seconds config setting and align retry implementation with Go SDK (#337).

Breaking API Changes:

  • Changed list() method for a.account_metastore_assignments account-level service to return databricks.sdk.service.catalog.ListAccountMetastoreAssignmentsResponse dataclass.
  • Removed owner field for databricks.sdk.service.catalog.CreateConnection. Instead, use the owner field of UpdateConnection.
  • Removed options field for databricks.sdk.service.catalog.UpdateCatalog.
  • Changed job_parameters field for databricks.sdk.service.jobs.RunNow to databricks.sdk.service.jobs.ParamPairs dataclass.
  • Changed query() method for w.serving_endpoints workspace-level service. New request type is databricks.sdk.service.serving.QueryEndpointInput dataclass.
  • Renamed databricks.sdk.service.serving.QueryRequest dataclass to QueryEndpointInput.
  • Changed list() method for w.clean_rooms workspace-level service to require request of databricks.sdk.service.sharing.ListCleanRoomsRequest dataclass.

API Changes:

  • Added databricks.sdk.service.catalog.ListAccountMetastoreAssignmentsResponse dataclass.
  • Added job_parameters field for databricks.sdk.service.jobs.RepairRun.
  • Added job_parameters field for databricks.sdk.service.jobs.RunParameters.
  • Added notifications field for databricks.sdk.service.pipelines.CreatePipeline.
  • Added notifications field for databricks.sdk.service.pipelines.EditPipeline.
  • Added notifications field for databricks.sdk.service.pipelines.PipelineSpec.
  • Added databricks.sdk.service.pipelines.Notifications dataclass.
  • Added databricks.sdk.service.serving.DataframeSplitInput dataclass.
  • Added w.settings workspace-level service.
  • Added databricks.sdk.service.settings.DefaultNamespaceSetting dataclass.
  • Added databricks.sdk.service.settings.DeleteDefaultWorkspaceNamespaceRequest dataclass.
  • Added databricks.sdk.service.settings.DeleteDefaultWorkspaceNamespaceResponse dataclass.
  • Added databricks.sdk.service.settings.ReadDefaultWorkspaceNamespaceRequest dataclass.
  • Added databricks.sdk.service.settings.StringMessage dataclass.
  • Added databricks.sdk.service.settings.UpdateDefaultWorkspaceNamespaceRequest dataclass.
  • Added next_page_token field for databricks.sdk.service.sharing.ListCleanRoomsResponse.
  • Added databricks.sdk.service.sharing.ListCleanRoomsRequest dataclass.

OpenAPI SHA: bcbf6e851e3d82fd910940910dd31c10c059746c, Date: 2023-10-02

0.9.0

  • Don't try to import runtime_auth when not in runtime (#327).
  • Handled Azure authentication when WorkspaceResourceID is provided (#328).
  • Added ErrorInfo to API errors (#347).
  • Fixed eager default argument evaluation in DatabricksError (#353).
  • Fixed code generation of primitive types (#354).
  • Updated SDK to changes in OpenAPI specification (#355).

API Changes:

  • Changed list() method for a.account_metastore_assignments account-level service to return databricks.sdk.service.catalog.WorkspaceIdList dataclass.
  • Changed artifact_matchers field for databricks.sdk.service.catalog.ArtifactAllowlistInfo to databricks.sdk.service.catalog.ArtifactMatcherList dataclass.
  • Changed artifact_matchers field for databricks.sdk.service.catalog.SetArtifactAllowlist to databricks.sdk.service.catalog.ArtifactMatcherList dataclass.
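
A hedged sketch of the error-handling surface touched by #347 and #353 above; the DatabricksError import path matched the SDK around these releases but is stated here as an assumption, and the cluster ID is a placeholder.

```python
# Hedged sketch: DatabricksError carries the API error details (ErrorInfo was added in 0.9.0).
# The import path below is an assumption for this era of the SDK.
from databricks.sdk import WorkspaceClient
from databricks.sdk.core import DatabricksError

w = WorkspaceClient()
try:
    w.clusters.get(cluster_id="0000-000000-placeholder")
except DatabricksError as err:
    print(f"cluster lookup failed: {err}")
```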

... (truncated)

Commits

  • 90f6fab Release v0.10.0 (#372)
  • bbe81c1 Update OpenAPI spec to 2 Oct 2023 (#373)
  • 6254119 Respect retry_timeout_seconds config setting and align retry implementation...
  • See full diff in compare view (v0.9.0...v0.10.0)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [databricks-sdk](https://github.com/databricks/databricks-sdk-py) to permit the latest version.
- [Release notes](https://github.com/databricks/databricks-sdk-py/releases)
- [Changelog](https://github.com/databricks/databricks-sdk-py/blob/main/CHANGELOG.md)
- [Commits](databricks/databricks-sdk-py@v0.9.0...v0.10.0)

---
updated-dependencies:
- dependency-name: databricks-sdk
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot bot added the dependencies label (Pull requests that update a dependency file) on Oct 3, 2023
codecov bot commented on Oct 3, 2023

Codecov Report

Merging #362 (5ce2a06) into main (cdc8b52) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main     #362   +/-   ##
=======================================
  Coverage   83.56%   83.56%           
=======================================
  Files          30       30           
  Lines        2263     2263           
  Branches      394      394           
=======================================
  Hits         1891     1891           
  Misses        287      287           
  Partials       85       85           

nfx added this pull request to the merge queue on Oct 3, 2023
Merged via the queue into main with commit 2b88750 on Oct 3, 2023
5 checks passed
dependabot bot deleted the dependabot/pip/databricks-sdk-approx-eq-0.10.0 branch on October 3, 2023 at 12:20
zpappa pushed a commit that referenced this pull request Oct 3, 2023
nfx linked an issue on Oct 3, 2023 that may be closed by this pull request
nfx added a commit that referenced this pull request Oct 3, 2023
* Added `inventory_database` name check during installation ([#275](#275)).
* Added a column to `$inventory.tables` to specify if a table might have been synchronised to Unity Catalog already or not ([#306](#306)).
* Added a migration state to skip already migrated tables ([#325](#325)).
* Fixed appending to tables by adding filtering of `None` rows ([#356](#356)).
* Fixed handling of missing but linked cluster policies. ([#361](#361)).
* Ignore errors for Redash widgets and queries redeployment during installation ([#367](#367)).
* Remove exception and added proper logging for groups in the list that… ([#357](#357)).
* Skip group migration when no groups are available after preparation step. ([#363](#363)).
* Update databricks-sdk requirement from ~=0.9.0 to ~=0.10.0 ([#362](#362)).
nfx mentioned this pull request on Oct 3, 2023
nfx added a commit that referenced this pull request Oct 3, 2023
zpappa pushed a commit that referenced this pull request Oct 4, 2023
Labels: dependencies (Pull requests that update a dependency file)
Projects: Archived in project
Development: Tool is failing creating new groups (successfully merging this pull request may close this issue)
1 participant