
[Feature] Notification Destination resource #3820

Merged
merged 23 commits into main on Aug 1, 2024

Conversation

@Divyansh-db (Contributor) commented Jul 25, 2024

Changes

Added a new Notification Destination resource (see the usage sketch after the checklist)

  • relevant change in docs/ folder
  • covered with integration tests in internal/acceptance
  • relevant acceptance tests are passing
  • using Go SDK
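
A rough usage sketch with the Go SDK (not part of this PR; the service and type names — `NotificationDestinations`, `CreateNotificationDestinationRequest`, `Config`, `SlackConfig` — are assumed from databricks-sdk-go and may differ slightly):

```go
package main

import (
	"context"
	"log"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/settings"
)

func main() {
	ctx := context.Background()
	w := databricks.Must(databricks.NewWorkspaceClient())

	// Create a Slack notification destination; the backend assigns the ID.
	dest, err := w.NotificationDestinations.Create(ctx, settings.CreateNotificationDestinationRequest{
		DisplayName: "Ops Slack channel",
		Config: &settings.Config{
			Slack: &settings.SlackConfig{Url: "https://hooks.slack.com/services/..."},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("created notification destination %s", dest.Id)
}
```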

@Divyansh-db Divyansh-db changed the title [FEATURE] Notification Destination resource [Feature] Notification Destination resource Jul 25, 2024
@Divyansh-db Divyansh-db requested a review from alexott July 25, 2024 08:53
@Divyansh-db Divyansh-db marked this pull request as ready for review July 25, 2024 12:48
@Divyansh-db Divyansh-db requested review from a team as code owners July 25, 2024 12:48
Comment on lines +42 to +46
s.SchemaPath("config", "slack").SetRequiredWith([]string{"config.0.slack.0.url"})
s.SchemaPath("config", "pagerduty").SetRequiredWith([]string{"config.0.pagerduty.0.integration_key"})
s.SchemaPath("config", "microsoft_teams").SetRequiredWith([]string{"config.0.microsoft_teams.0.url"})
s.SchemaPath("config", "generic_webhook").SetRequiredWith([]string{"config.0.generic_webhook.0.url"})
s.SchemaPath("config", "email").SetRequiredWith([]string{"config.0.email.0.addresses"})
Contributor

should we talk with the corresponding teams to mark URL and some other objects as required (remove omitempty)?

settings/resource_notification_destination.go (review thread resolved)
settings/resource_notification_destination.go (review thread resolved)
settings/resource_notification_destination_test.go (review thread resolved)
@alexott (Contributor) left a comment

The API doesn't return sensitive values, so we need to handle them similarly to other resources (like the MLflow webhook or connections).
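
A minimal sketch of that pattern (illustrative, not the PR's code; the SDK struct names and the helper are assumptions): the GET response comes back with the secret blanked out, so Read re-fills it from prior state before writing state:

```go
package settings

import (
	"github.com/databricks/databricks-sdk-go/service/settings"
	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"
)

// preserveSlackURL is a hypothetical helper: the notification destinations API
// omits secrets, so Read copies the URL already stored in state back into the
// response struct before that struct is flattened into Terraform state.
func preserveSlackURL(d *schema.ResourceData, nd *settings.NotificationDestination) {
	if nd.Config != nil && nd.Config.Slack != nil && nd.Config.Slack.Url == "" {
		// d.GetOk accepts dotted paths for reading nested values from state.
		if old, ok := d.GetOk("config.0.slack.0.url"); ok {
			nd.Config.Slack.Url = old.(string)
		}
	}
}
```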

settings/resource_notification_destination.go (review thread resolved)
@alexott (Contributor) commented Jul 25, 2024

For testing, you just need to provide a base URL; I was just using "https://hooks.slack.com/services/..." and it worked. But you need admin permissions on the workspace.

Divyansh-db and others added 2 commits July 26, 2024 14:01
Co-authored-by: vuong-nguyen <44292934+nkvuong@users.noreply.github.com>
…ub.com:databricks/terraform-provider-databricks into divyansh_notification_destinations
@Divyansh-db (Contributor, Author) commented Jul 29, 2024

I could not figure out how to set the values of nested fields (fields inside the structs) directly through d.Set(...). I could not find similar integration tests that set nested fields (there is no integration test for mlflow_webhook either), so I used a workaround to get things done.
Done now
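
For context, a minimal illustration of the limitation (assuming the standard helper/schema SDK; the helper below is hypothetical): dotted paths work for reading but not for writing, so nested values can only be set by rebuilding the whole top-level block:

```go
package settings

import "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"

// setSlackConfig is an illustrative helper: d.Set rejects dotted paths such as
// "config.0.slack.0.url", so the entire "config" list is rebuilt and written
// through its top-level key instead.
func setSlackConfig(d *schema.ResourceData, slackURL string) error {
	return d.Set("config", []any{
		map[string]any{
			"slack": []any{
				map[string]any{"url": slackURL},
			},
		},
	})
}
```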

@Divyansh-db Divyansh-db requested a review from alexott July 29, 2024 11:29
@Divyansh-db Divyansh-db requested a review from nkvuong July 29, 2024 15:44
internal/acceptance/notification_destination_test.go (review thread resolved)
settings/resource_notification_destination.go (review thread resolved)
settings/resource_notification_destination.go (review thread resolved)
settings/resource_notification_destination.go (review thread resolved)
Comment on lines 78 to 84
if detectConfigTypeChange(d) {
err := Delete(ctx, d, w)
if err != nil {
return err
}
return Create(ctx, d, w)
}
Contributor

According to the API, it should be possible to update it inline: https://docs.databricks.com/api/workspace/notificationdestinations/update - the main problem is that if we do delete/create, the ID of the destination will change, and this may break other integrations.

Contributor (Author)

That is true, but the update API does not allow changing the type of a notification destination. I created an integration test, TestAccConfigTypeChange, for this exact purpose. If we do not delete and create a new resource but try to update it instead, the update API responds with Bad Request and states that you can't change config types using a PATCH call.

Contributor

We need to raise an internal ticket to update the OpenAPI spec to reflect this - right now it's not documented. But it's better to first confirm that this is intended behavior.

Instead of implicitly recreating the resource, we may need to mark these blocks as force_new so that Terraform will delete/create automatically. We need this because Terraform has to know that the id is changing, so the change gets propagated to dependent resources. Otherwise, they could be broken after such an update.

We also need to document that behavior.
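
A possible shape for that customization (hypothetical; it assumes a SetForceNew customizer analogous to the SetRequiredWith calls quoted earlier in this thread):

```go
// Marking each config type block ForceNew lets Terraform itself plan the
// destroy/create, so the changing ID is propagated to dependent resources.
// SetForceNew is assumed to exist alongside the other schema customizers.
for _, block := range []string{"slack", "pagerduty", "microsoft_teams", "generic_webhook", "email"} {
	s.SchemaPath("config", block).SetForceNew()
}
```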

@Divyansh-db (Contributor, Author) Jul 31, 2024

  1. Which team should I contact for this?
  2. I will try to mark the fields as force-new rather than recreating the resource explicitly.

Contributor (Author)

Also, is raising the ticket a blocker for this PR?

@Divyansh-db Divyansh-db requested a review from alexott July 31, 2024 12:38
@alexott (Contributor) left a comment
Code looks good; I also tested it myself. Just a few small comments.

internal/acceptance/notification_destination_test.go (review thread resolved)
docs/resources/notification_destination.md (review thread resolved)
…ub.com:databricks/terraform-provider-databricks into divyansh_notification_destinations
@Divyansh-db Divyansh-db added this pull request to the merge queue Aug 1, 2024
Merged via the queue into main with commit 5258611 Aug 1, 2024
6 checks passed
@Divyansh-db Divyansh-db deleted the Divyansh-db/divyansh_notification_destinations branch August 1, 2024 13:56
mgyucht added a commit that referenced this pull request Aug 14, 2024
### New Features and Improvements

 * Added support for `cloudflare_api_token` in `databricks_storage_credential` resource ([#3835](#3835)).
 * Add `active` attribute to `databricks_user` data source ([#3733](#3733)).
 * Add `workspace_path` attribute to `databricks_notebook` resource and data source ([#3885](#3885)).
 * Mark attributes as sensitive in `databricks_mlflow_webhook` ([#3825](#3825)).
 * Added notification destination resource ([#3820](#3820)).

### Bug Fixes

 * Automatically assign `IS_OWNER` permission to sql warehouse if not specified ([#3829](#3829)).
 * Corrected kms arn format in `data_aws_unity_catalog_policy` ([#3823](#3823)).
 * Fix crash when destroying `databricks_compliance_security_profile_workspace_setting` ([#3883](#3883)).
 * Fixed read method of `databricks_entitlements` resource ([#3858](#3858)).
 * Retry cluster update on "INVALID_STATE" ([#3890](#3890)).
 * Save Pipeline resource to state in addition to spec ([#3869](#3869)).
 * Tolerate `databricks_workspace_conf` deletion failures ([#3737](#3737)).
 * Update Go SDK ([#3826](#3826)).
 * cluster key update for `databricks_sql_table` should not force new ([#3824](#3824)).
 * reading `databricks_metastore_assignment` when importing resource ([#3827](#3827)).

### Documentation

 * Add troubleshooting instructions for `databricks OAuth is not supported for this host` error ([#3815](#3815)).
 * Clarify setting of permissions for workspace objects ([#3884](#3884)).
 * Document missing task attributes in `databricks_job` resource ([#3817](#3817)).
 * Fixed documentation for `databricks_schemas` data source and `databricks_metastore_assignment` resource ([#3851](#3851)).
 * clarified `spot_bid_max_price` option for `databricks_cluster` ([#3830](#3830)).
 * marked `databricks_sql_dashboard` as legacy ([#3836](#3836)).

### Internal Changes

 * Refactor exporter: split huge files into smaller ones ([#3870](#3870)).
 * Refactored `client.ClientForHost` to use Go SDK method ([#3735](#3735)).
 * Revert "Rewriting DLT pipelines using SDK" ([#3838](#3838)).
 * Rewrite DLT pipelines using SDK ([#3839](#3839)).
 * Rewriting DLT pipelines using SDK ([#3792](#3792)).
 * Update Go SDK ([#3808](#3808)).
 * refactored `databricks_mws_permission_assignment` to Go SDK ([#3831](#3831)).

### Dependency Updates

 * Bump databricks-sdk-go to 0.44.0 ([#3896](#3896)).
 * Bump github.com/zclconf/go-cty from 1.14.4 to 1.15.0 ([#3775](#3775)).

### Exporter

 * Add retry on "Operation timed out" error ([#3897](#3897)).
 * Add support for Vector Search assets ([#3828](#3828)).
 * Add support for `databricks_notification_destination` ([#3861](#3861)).
 * Add support for `databricks_online_table` ([#3816](#3816)).
 * Don't export model serving endpoints with foundational models ([#3845](#3845)).
 * Fix generation of `autotermination_minutes = 0` ([#3881](#3881)).
 * Generate `databricks_workspace_binding` instead of legacy `databricks_catalog_workspace_binding` ([#3812](#3812)).
 * Ignore DLT pipelines deployed via DABs ([#3857](#3857)).
 * Improve exporting of `databricks_model_serving` ([#3821](#3821)).
 * Refactoring: remove legacy code ([#3864](#3864)).
mgyucht added a commit that referenced this pull request Aug 14, 2024
github-merge-queue bot pushed a commit that referenced this pull request Aug 15, 2024