[Fix] Fix databricks_cluster_pluginframework data source #4097
Conversation
Why can't we do such a conversion automatically? It makes it harder for people to port existing code - I had the same problem with the registered model data source yesterday...
That's a good question @alexott. Unfortunately we have to do it this way because the issue arises when the plugin framework maps the config/state into the handwritten struct: the error we're seeing is what happens when a nested structure is modeled as a pointer instead of a list. Note that this conversion is handled automatically by the GoToTfSDK/TfSDKToGo converters.
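To make the shape difference concrete, here is a minimal sketch; the type names below are illustrative assumptions, not the provider's actual structs:

```go
// Illustrative only: these types are assumptions, not the provider's actual
// structs. The point is the shape of the nested block in the tfsdk-tagged
// structs that the plugin framework reads from and writes to.
package sketch

// Nested block value.
type autoScaleTfSDK struct {
	MinWorkers int64 `tfsdk:"min_workers"`
	MaxWorkers int64 `tfsdk:"max_workers"`
}

// Before: the handwritten data source modeled the nested block as a pointer.
type clusterInfoPointer struct {
	ClusterName string          `tfsdk:"cluster_name"`
	Autoscale   *autoScaleTfSDK `tfsdk:"autoscale"`
}

// After: the nested block is a slice holding at most one element, matching
// the autogenerated structures used for state/config/plan.
type clusterInfoList struct {
	ClusterName string           `tfsdk:"cluster_name"`
	Autoscale   []autoScaleTfSDK `tfsdk:"autoscale"`
}
```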
### New Features and Improvements
* Add `databricks_registered_model` data source ([#4033](#4033)).
* Add data source `databricks_notification_destinations` ([#4087](#4087)).

### Bug Fixes
* Fix databricks_cluster_pluginframework data source ([#4097](#4097)).
* Mark unity_catalog_provisioning_state as ReadOnly ([#4116](#4116)).
* Tolerate invalid keys in `databricks_workspace_conf` ([#4102](#4102)).
* force send `read_only` in `databricks_external_location` when it's changed ([#4067](#4067)).
* force send `read_only` in `databricks_storage_credential` when it's changed ([#4083](#4083)).

### Documentation
* Document `budget_policy_id` in `databricks_pipeline` and `databricks_job` ([#4110](#4110)).
* Reformat code examples in documentation ([#4081](#4081)).
* Update documentation for `databricks_model_serving` ([#4115](#4115)).
* Updates to resource examples ([#4093](#4093)).

### Internal Changes
* Add maxItem=1 validator for object types in plugin framework schema ([#4094](#4094)).
* Fix acceptance test for `databricks_registered_model` data source ([#4105](#4105)).
* Generate Effective Fields ([#4057](#4057)).
* Generate Effective Fields ([#4112](#4112)).
* Set SDK used in the useragent in context ([#4092](#4092)).
* Support adding context in resources and data sources ([#4085](#4085)).
* Update plugin framework schema to use ListNestedBlocks ([#4040](#4040)).
Changes
After bac842d, the autogenerated structures used for interacting with state/config/plan always use Lists for nested structures, even if there can only ever be at most one element, for compatibility with older versions of the TF provider. The cluster data source on the plugin framework is handwritten, so it also needs to be converted to use lists instead of a pointer.
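As an illustration of the by-hand conversion this implies for a handwritten data source, here is a hedged sketch; `goSDKCluster` and the helper below are hypothetical stand-ins, not the real databricks-sdk-go types or the code in this PR:

```go
// Hypothetical sketch: converting a Go-SDK-style response (nested values as
// pointers) into the list-shaped tfsdk struct before it is written to state.
// All types here are defined locally for illustration.
package main

import "fmt"

type goSDKAutoScale struct {
	MinWorkers int64
	MaxWorkers int64
}

type goSDKCluster struct {
	ClusterName string
	Autoscale   *goSDKAutoScale
}

type autoScaleTfSDK struct {
	MinWorkers int64 `tfsdk:"min_workers"`
	MaxWorkers int64 `tfsdk:"max_workers"`
}

type clusterInfoTfSDK struct {
	ClusterName string           `tfsdk:"cluster_name"`
	Autoscale   []autoScaleTfSDK `tfsdk:"autoscale"`
}

// toTfSDK wraps each optional nested value in a slice of length zero or one,
// which is the shape a Terraform list block with at most one element expects.
func toTfSDK(c goSDKCluster) clusterInfoTfSDK {
	out := clusterInfoTfSDK{ClusterName: c.ClusterName}
	if c.Autoscale != nil {
		out.Autoscale = []autoScaleTfSDK{{
			MinWorkers: c.Autoscale.MinWorkers,
			MaxWorkers: c.Autoscale.MaxWorkers,
		}}
	}
	return out
}

func main() {
	resp := goSDKCluster{
		ClusterName: "demo",
		Autoscale:   &goSDKAutoScale{MinWorkers: 1, MaxWorkers: 4},
	}
	fmt.Printf("%+v\n", toTfSDK(resp))
}
```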
Tests
Ran the `TestAccDataSourceClusterByID` and `TestAccDataSourceClusterByName` integration tests, which now pass.

* `make test` run locally
* relevant change in `docs/` folder
* covered with integration tests in `internal/acceptance`