Added `bundle deployment bind` and `unbind` command #1131

Conversation
cmd/bundle/bind.go (outdated)

ctx := cmd.Context()
if !autoApprove {
    answer, err := cmdio.AskYesOrNo(ctx, "Binding to existing resource means that the resource will be managed by the bundle which can lead to changes in the resource. Do you want to continue?")
We should confirm that we would not recreate the resource on the TF side. This is especially dangerous for pipelines, where the underlying tables get dropped if we do.
@pietern I've added a step which shows the terraform plan after the import and asks the user to confirm it; otherwise it reverts the import.
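For readers following along, the import-then-confirm-or-revert flow described here could look roughly like the sketch below. It is a minimal illustration that drives terraform-exec directly; the resource address, the plain-stdin prompt (the CLI itself uses cmdio.AskYesOrNo, as shown in the hunk above), and the error wording are placeholders rather than the PR's actual implementation.

package sketch

import (
    "bufio"
    "context"
    "fmt"
    "os"
    "strings"

    "github.com/hashicorp/terraform-exec/tfexec"
)

// bindWithConfirmation imports an existing resource into the Terraform state,
// shows the resulting plan, and reverts the import if the user declines.
func bindWithConfirmation(ctx context.Context, tf *tfexec.Terraform, resourceAddr, resourceID string) error {
    // Import the existing resource into the local Terraform state.
    if err := tf.Import(ctx, resourceAddr, resourceID); err != nil {
        return fmt.Errorf("terraform import failed: %w", err)
    }

    // Print the plan so the user can see what a subsequent deploy would change.
    tf.SetStdout(os.Stdout)
    if _, err := tf.Plan(ctx); err != nil {
        return err
    }

    // Ask for confirmation; plain stdin is used here only for the sketch.
    fmt.Print("Do you want to continue? [y/N]: ")
    answer, err := bufio.NewReader(os.Stdin).ReadString('\n')
    if err != nil {
        return err
    }
    if strings.ToLower(strings.TrimSpace(answer)) != "y" {
        // Revert the import so aborting leaves no trace in the state.
        if rmErr := tf.StateRm(ctx, resourceAddr); rmErr != nil {
            return fmt.Errorf("failed to revert import: %w", rmErr)
        }
        return fmt.Errorf("binding aborted by user")
    }
    return nil
}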
The PR title was changed from "bundle bind command" to "bundle deployment bind command", and later to "bundle deployment bind and unbind command".
Codecov Report

Attention:

Additional details and impacted files

@@            Coverage Diff             @@
##             main    #1131      +/-   ##
==========================================
- Coverage   51.85%   51.28%    -0.57%
==========================================
  Files         299      301        +2
  Lines       16537    16737      +200
==========================================
+ Hits         8575     8584        +9
- Misses       7340     7519      +179
- Partials      622      634       +12

☔ View full report in Codecov by Sentry.
bundle/deploy/terraform/import.go (outdated)
// This is necessary because the import operation requires the resource to be defined in the Terraform configuration.
_, err = os.Stat(filepath.Join(dir, "bundle.tf.json"))
if err != nil {
    err = bundle.Apply(ctx, b, Write())
Can't we always write it? Note that the Terraform interpolation mutator also needs to run.
@pietern doing this will include all local changes to the bundle config as well, which I believe we should not include, especially considering we push the state remotely.
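For context on the trade-off discussed above, here is a minimal standalone sketch of the "write only if missing" behavior: the config file is generated only when it is absent, so a bind does not pick up unrelated local configuration changes. ensureConfigFile and the writeConfig callback are hypothetical stand-ins for the bundle mutator pipeline, not the PR's actual API.

package sketch

import (
    "errors"
    "os"
    "path/filepath"
)

// ensureConfigFile writes bundle.tf.json only when it is not already present,
// so that an import does not include unrelated local configuration changes.
func ensureConfigFile(dir string, writeConfig func(path string) error) error {
    path := filepath.Join(dir, "bundle.tf.json")
    _, err := os.Stat(path)
    if err == nil {
        // File already exists: leave it untouched.
        return nil
    }
    if !errors.Is(err, os.ErrNotExist) {
        // Surface other stat errors (permissions, etc.) instead of overwriting.
        return err
    }
    return writeConfig(path)
}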
    ResourceKey: args[0],
    ResourceId:  args[1],
}),
))
If this succeeds we should display a message (if in text output mode) to confirm it did.
This can also include a call to action to hint at running a deploy next.
@juliacrawf-db Could you shine your light on how to best convey this?
Oh no! I missed this ping! That string looks great. But wouldn't we want a similar error message in unbind.go if the unbinding fails?
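A hypothetical sketch of what the messaging could look like on both the success and failure paths. The real CLI routes output through its cmdio layer and the exact strings differ; the plain fmt printing, function name, and wording below are illustrative only.

package sketch

import "fmt"

// reportBindResult prints a confirmation after a successful bind (text output
// mode only) and hints at running a deploy next; on failure it returns a
// descriptive error instead.
func reportBindResult(textOutput bool, resourceKey string, bindErr error) error {
    if bindErr != nil {
        return fmt.Errorf("failed to bind the resource %q: %w", resourceKey, bindErr)
    }
    if textOutput {
        fmt.Printf("Successfully bound %s to the existing resource.\n", resourceKey)
        fmt.Println("Run 'bundle deploy' to deploy changes to your workspace.")
    }
    return nil
}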
I propose we also add an integration test to check that aborting the import works as expected and doesn't have side effects.
Everything else LGTM.
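A hypothetical sketch of such an integration test: invoke the bind command against a test bundle, answer "n" at the prompt, and verify that nothing was left behind in the Terraform state. The CLI invocation, fixture paths, and state inspection below are placeholders, not the repository's actual test harness.

package sketch

import (
    "bytes"
    "os/exec"
    "strings"
    "testing"
)

func TestBindAbortHasNoSideEffects(t *testing.T) {
    // Run the bind command against a test bundle and decline the confirmation prompt.
    cmd := exec.Command("databricks", "bundle", "deployment", "bind", "my_job", "12345")
    cmd.Dir = "testdata/basic_bundle" // hypothetical fixture
    cmd.Stdin = strings.NewReader("n\n")
    var out bytes.Buffer
    cmd.Stdout = &out
    cmd.Stderr = &out

    // Declining the prompt should make the command exit with an error.
    if err := cmd.Run(); err == nil {
        t.Fatalf("expected bind to fail after declining the prompt, output: %s", out.String())
    }

    // The resource must not appear in the state after the aborted import.
    state := exec.Command("terraform", "state", "list")
    state.Dir = "testdata/basic_bundle/.databricks/bundle/default/terraform" // hypothetical state dir
    listing, err := state.CombinedOutput()
    if err != nil {
        t.Fatalf("failed to list terraform state: %v", err)
    }
    if strings.Contains(string(listing), "my_job") {
        t.Fatalf("resource was imported despite aborting: %s", listing)
    }
}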
if len(found) > 1 {
    keys := make([]string, 0, len(found))
    for _, r := range found {
        keys = append(keys, fmt.Sprintf("%s:%s", r.TerraformResourceName(), key))
For later: we shouldn't externalize the TF resource name, as it doesn't correspond with the name in the configuration. In error messages we should refer to jobs.job_key, like we do for the run args.
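A hedged sketch of the suggested error wording: list configuration-style keys (jobs.<job_key>, pipelines.<pipeline_key>) instead of Terraform resource names when a key is ambiguous. The function name and the group names passed in are illustrative, not the code under review.

package sketch

import (
    "fmt"
    "strings"
)

// ambiguousKeyError builds an error that lists fully qualified configuration
// keys (e.g. "jobs.my_key", "pipelines.my_key") for the user to choose from.
func ambiguousKeyError(key string, groups []string) error {
    keys := make([]string, 0, len(groups))
    for _, group := range groups {
        keys = append(keys, fmt.Sprintf("%s.%s", group, key))
    }
    return fmt.Errorf("multiple resources match %q, please use one of: %s", key, strings.Join(keys, ", "))
}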
CLI:
 * Ignore environment variables for `auth profiles` ([#1189](#1189)).
 * Update LICENSE ([#1013](#1013)).

Bundles:
 * Added `bundle deployment bind` and `unbind` command ([#1131](#1131)).
 * Use allowlist for Git-related fields to include in metadata ([#1187](#1187)).
 * Added `--restart` flag for `bundle run` command ([#1191](#1191)).
 * Generate correct YAML if `custom_tags` or `spark_conf` is used for pipeline or job cluster configuration ([#1210](#1210)).

Internal:
 * Move folders package into libs ([#1184](#1184)).
 * Log time it takes for profile to load ([#1186](#1186)).
 * Use mockery to generate mocks compatible with testify/mock ([#1190](#1190)).
 * Retain partially valid structs in `convert.Normalize` ([#1203](#1203)).
 * Skip `for_each_task` when generating the bundle schema ([#1204](#1204)).
 * Regenerate the CLI using the same OpenAPI spec as the SDK ([#1205](#1205)).
 * Avoid race-conditions while executing sub-commands ([#1201](#1201)).

API Changes:
 * Changed `databricks connections delete` command with new required argument order.
 * Changed `databricks connections get` command with new required argument order.
 * Changed `databricks connections update` command with new required argument order.
 * Added `databricks tables exists` command.
 * Changed `databricks volumes delete` command with new required argument order.
 * Changed `databricks volumes read` command with new required argument order.
 * Changed `databricks volumes update` command with new required argument order.
 * Added `databricks lakehouse-monitors` command group.
 * Removed `databricks files get-status` command.
 * Added `databricks files create-directory` command.
 * Added `databricks files delete-directory` command.
 * Added `databricks files get-directory-metadata` command.
 * Added `databricks files get-metadata` command.
 * Added `databricks files list-directory-contents` command.
 * Removed `databricks pipelines reset` command.
 * Changed `databricks account settings delete-personal-compute-setting` command with new required argument order.
 * Removed `databricks account settings read-personal-compute-setting` command.
 * Changed `databricks account settings update-personal-compute-setting` command with new required argument order.
 * Added `databricks account settings get-personal-compute-setting` command.
 * Removed `databricks settings delete-default-workspace-namespace` command.
 * Removed `databricks settings read-default-workspace-namespace` command.
 * Removed `databricks settings update-default-workspace-namespace` command.
 * Added `databricks settings delete-default-namespace-setting` command.
 * Added `databricks settings delete-restrict-workspace-admins-setting` command.
 * Added `databricks settings get-default-namespace-setting` command.
 * Added `databricks settings get-restrict-workspace-admins-setting` command.
 * Added `databricks settings update-default-namespace-setting` command.
 * Added `databricks settings update-restrict-workspace-admins-setting` command.
 * Changed `databricks token-management create-obo-token` command with new required argument order.
 * Changed `databricks token-management get` command to return .
 * Changed `databricks clean-rooms delete` command with new required argument order.
 * Changed `databricks clean-rooms get` command with new required argument order.
 * Changed `databricks clean-rooms update` command with new required argument order.
 * Changed `databricks dashboards create` command . New request type is .
 * Added `databricks dashboards update` command.

OpenAPI commit c40670f5a2055c92cf0a6aac92a5bccebfb80866 (2024-02-14)

Dependency updates:
 * Bump github.com/hashicorp/hc-install from 0.6.2 to 0.6.3 ([#1200](#1200)).
 * Bump golang.org/x/term from 0.16.0 to 0.17.0 ([#1197](#1197)).
 * Bump golang.org/x/oauth2 from 0.16.0 to 0.17.0 ([#1198](#1198)).
 * Bump github.com/databricks/databricks-sdk-go from 0.30.1 to 0.32.0 ([#1199](#1199)).
Changes
Added `bundle deployment bind` and `unbind` commands. These commands allow binding bundle-defined resources to existing resources in the Databricks workspace so that they become managed by DABs.

Tests
Tested manually and added an E2E test.