[ISSUE] Issue with databricks_mws_workspaces resource regarding the key_id #4491

Open
LittleWat opened this issue Feb 12, 2025 · 0 comments
LittleWat commented Feb 12, 2025

Configuration

resource "databricks_mws_workspaces" "this" {
  provider       = databricks.mws
  account_id     = var.databricks_account_id
  aws_region     = var.aws_region
  workspace_name = local.workspace_name_with_env

  credentials_id                           = databricks_mws_credentials.this.credentials_id
  storage_configuration_id                 = databricks_mws_storage_configurations.this.storage_configuration_id
  network_id                               = databricks_mws_networks.privatelink_enabled.network_id
  private_access_settings_id               = databricks_mws_private_access_settings.pas.private_access_settings_id
  managed_services_customer_managed_key_id = databricks_mws_customer_managed_keys.dbx_cmk.customer_managed_key_id # this field was newly added
  storage_customer_managed_key_id          = databricks_mws_customer_managed_keys.dbx_cmk.customer_managed_key_id # this field was newly added
}

Actual Behavior

Applying this configuration failed with the error below. This failure itself is expected:

* Failed to execute "tofu apply -auto-approve -input=false -auto-approve" in <my_path>
  ╷
  │ Error: cannot update mws workspaces: INVALID_STATE: Please terminate all pool & cluster EC2 instances in your workspace subnets before attempting to update the workspace.
  │ 
  │   with databricks_mws_workspaces.this,
  │   on main.tf line 126, in resource "databricks_mws_workspaces" "this":
  │  126: resource "databricks_mws_workspaces" "this" {
  │ 

After terminating all pool & cluster EC2 instances in the target workspace subnets, I applied again, but Terraform reported "No changes".

So I had to run the following command to apply the change manually:

databricks account workspaces update <workspace-id> --json '{
  "managed_services_customer_managed_key_id": "<databricks-encryption-key-id>",
  "storage-customer-managed-key-id": "<databricks-encryption-key-id>"
}'
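A likely explanation (my assumption, not confirmed in the provider source) is that the provider records the new key IDs in state even though the update API call failed, so a subsequent plan sees no diff. This can be checked, and the state reconciled after the manual CLI update, with standard Terraform/OpenTofu commands:

```shell
# Inspect what Terraform currently believes about the workspace resource.
# If the key IDs already appear here despite the failed apply, the provider
# committed them to state prematurely (my assumption about the root cause).
terraform state show databricks_mws_workspaces.this

# After updating the workspace via the Databricks CLI, pull the real
# remote values back into state so state and reality agree again.
terraform apply -refresh-only
```

The OpenTofu equivalents are `tofu state show` and `tofu apply -refresh-only`.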

Expected Behavior

Even after one failed apply, Terraform should still detect the pending change to databricks_mws_workspaces.

Steps to Reproduce

1. terraform apply the following:

resource "databricks_mws_workspaces" "this" {
  provider       = databricks.mws
  account_id     = var.databricks_account_id
  aws_region     = var.aws_region
  workspace_name = local.workspace_name_with_env

  credentials_id                           = databricks_mws_credentials.this.credentials_id
  storage_configuration_id                 = databricks_mws_storage_configurations.this.storage_configuration_id
  network_id                               = databricks_mws_networks.privatelink_enabled.network_id
  private_access_settings_id               = databricks_mws_private_access_settings.pas.private_access_settings_id
}
2. After the workspace is created, terraform apply the following while compute instances are still running in the subnet, which will fail:
resource "databricks_mws_workspaces" "this" {
  provider       = databricks.mws
  account_id     = var.databricks_account_id
  aws_region     = var.aws_region
  workspace_name = local.workspace_name_with_env

  credentials_id                           = databricks_mws_credentials.this.credentials_id
  storage_configuration_id                 = databricks_mws_storage_configurations.this.storage_configuration_id
  network_id                               = databricks_mws_networks.privatelink_enabled.network_id
  private_access_settings_id               = databricks_mws_private_access_settings.pas.private_access_settings_id
  managed_services_customer_managed_key_id = databricks_mws_customer_managed_keys.dbx_cmk.customer_managed_key_id # this field was newly added
  storage_customer_managed_key_id          = databricks_mws_customer_managed_keys.dbx_cmk.customer_managed_key_id # this field was newly added
}
3. Run terraform apply again; it reports "No changes".
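As a workaround sketch (not from the issue author), Terraform can be forced to re-evaluate the resource against the real remote configuration by dropping it from state and re-importing it; the import ID format shown (`<account_id>/<workspace_id>`) is my reading of the provider docs and should be verified:

```shell
# Forget Terraform's (possibly stale) view of the workspace...
terraform state rm databricks_mws_workspaces.this

# ...then re-import it from the Databricks account, so the next plan
# diffs the configured key IDs against the actual workspace settings.
# Import ID format assumed to be <account_id>/<workspace_id>.
terraform import databricks_mws_workspaces.this "<account_id>/<workspace_id>"
```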

Terraform and provider versions

The latest (v1.65.1).

Is it a regression?

Debug Output

Important Factoids

Would you like to implement a fix?

Yes, if possible.

@LittleWat LittleWat changed the title [ISSUE] Issue with databricks_mws_workspaces resource [ISSUE] Issue with databricks_mws_workspaces resource regarding the key_id Feb 12, 2025