
Document enable_serverless_compute API changes in databricks_sql_endpoint resource #2137

Merged
28 commits
85b5d8c
fix "sql endpoint" in English descriptions (not the code itself) to s…
jxbell Mar 20, 2023
1ce8e73
change to lowercase serverless, per Marketing team for docs usage in …
jxbell Mar 20, 2023
6d3360d
Remove GlobalConfig configuring of workspace config for serverless.
jxbell Mar 21, 2023
19dc099
Make more accurate enablement info for AWS only (not "enable for work…
jxbell Mar 21, 2023
58f4d7d
Update AWS specific instructions when we enable "auto enablement at a…
jxbell Mar 21, 2023
4768489
update some headings to be consistent with Databricks heading style c…
jxbell Mar 21, 2023
05970e5
For Azure, specify how default value works: For Azure, if serverless …
jxbell Mar 21, 2023
ed46230
In `exporter_test.go` remove EnableServerlessCompute from sql.GlobalC…
jxbell Mar 21, 2023
1e67992
Merge branch 'master' into terraform-aws-serverless-sql-api-change-DO…
jxbell Mar 21, 2023
a397579
fix "sql endpoint" in English descriptions (not the code itself) to s…
jxbell Mar 20, 2023
ba28f68
change to lowercase serverless, per Marketing team for docs usage in …
jxbell Mar 20, 2023
497fbcf
Remove GlobalConfig configuring of workspace config for serverless.
jxbell Mar 21, 2023
a706b27
Make more accurate enablement info for AWS only (not "enable for work…
jxbell Mar 21, 2023
9705cc6
Update AWS specific instructions when we enable "auto enablement at a…
jxbell Mar 21, 2023
9dcbb4f
update some headings to be consistent with Databricks heading style c…
jxbell Mar 21, 2023
f293273
For Azure, specify how default value works: For Azure, if serverless …
jxbell Mar 21, 2023
a6a4f6e
In `exporter_test.go` remove EnableServerlessCompute from sql.GlobalC…
jxbell Mar 21, 2023
2e32b7a
Merge branch 'terraform-aws-serverless-sql-api-change-DOC-6788' of ht…
jxbell Mar 21, 2023
8c4cdbe
REVERT CHANGES TO EXPORTER AND SQL FOLDERS PER REQUEST OF SERGE SMERT…
jxbell Mar 21, 2023
456d456
Update CHANGELOG.md
jxbell Mar 21, 2023
bbda59b
Update CHANGELOG.md
jxbell Mar 21, 2023
f025a39
Update docs/data-sources/sql_warehouse.md
jxbell Mar 21, 2023
c8cbcd4
Update docs/resources/sql_endpoint.md
jxbell Mar 21, 2023
1671596
Update docs/data-sources/sql_warehouse.md
jxbell Mar 21, 2023
c40648c
add warehouse type to one place missing it -- and also fix AWS link i…
jxbell Mar 21, 2023
8f16412
changed April 6 to April 30 per Roopam
jxbell Mar 21, 2023
9480803
[DOC-8484] LEGAL UPDATE TO TERMS OF USE DISCUSSION TO REMOVE OCTOBER …
jxbell Mar 24, 2023
88cf652
Merge branch 'master' into terraform-aws-serverless-sql-api-change-DO…
nfx Mar 24, 2023
17 changes: 11 additions & 6 deletions docs/data-sources/sql_warehouse.md
@@ -7,7 +7,7 @@ subcategory: "Databricks SQL"

Retrieves information about a [databricks_sql_warehouse](../resources/sql_warehouse.md) using its id. This could be retrieved programmatically using [databricks_sql_warehouses](../data-sources/sql_warehouses.md) data source.

## Example Usage
## Example usage

Retrieve attributes of each SQL warehouse in a workspace

@@ -22,11 +22,11 @@ data "databricks_sql_warehouse" "all" {

```

## Argument Reference
## Argument reference

* `id` - (Required) The id of the SQL warehouse
* `id` - (Required) The ID of the SQL warehouse

## Attribute Reference
## Attribute reference

This data source exports the following attributes:

@@ -38,14 +38,19 @@ This data source exports the following attributes:
* `tags` - Databricks tags all warehouse resources with these tags.
* `spot_instance_policy` - The spot policy to use for allocating instances to clusters: `COST_OPTIMIZED` or `RELIABILITY_OPTIMIZED`.
* `enable_photon` - Whether to enable [Photon](https://databricks.com/product/delta-engine).
* `enable_serverless_compute` - Whether this SQL warehouse is a Serverless warehouse. To use a Serverless SQL warehouse, you must enable Serverless SQL warehouses for the workspace.
* `enable_serverless_compute` - Whether this SQL warehouse is a serverless SQL warehouse. If this value is explicitly true or true by default, you **must** also set the `warehouse_type` field to `PRO`.

- **For AWS**: If your account needs updated [terms of use](https://docs.databricks.com/sql/admin/serverless.html#accept-terms), workspace admins are prompted in the Databricks SQL UI. A workspace must meet the [requirements](https://docs.databricks.com/sql/admin/serverless.html#requirements) and might require an update to its instance profile role to [add a trust relationship](https://docs.databricks.com/sql/admin/serverless.html#aws-instance-profile-setup).

- **For Azure**, you must [enable your workspace for serverless SQL warehouse](https://learn.microsoft.com/azure/databricks/sql/admin/serverless).
* `warehouse_type` - SQL warehouse type. See the documentation for [AWS](https://docs.databricks.com/sql/index.html#warehouse-types) or [Azure](https://learn.microsoft.com/azure/databricks/sql/#warehouse-types). Set to `PRO` or `CLASSIC` (default). To use serverless compute, you must set this field to `PRO` and **also** set `enable_serverless_compute` to `true`.
* `channel` block, consisting of following fields:
* `name` - Name of the Databricks SQL release channel. Possible values are: `CHANNEL_NAME_PREVIEW` and `CHANNEL_NAME_CURRENT`. Default is `CHANNEL_NAME_CURRENT`.
* `jdbc_url` - JDBC connection string.
* `odbc_params` - ODBC connection params: `odbc_params.hostname`, `odbc_params.path`, `odbc_params.protocol`, and `odbc_params.port`.
* `data_source_id` - ID of the data source for this warehouse. This is used to bind a Databricks SQL query to a warehouse.
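The attributes above can be inspected from Terraform before wiring a warehouse into other resources. A minimal sketch, assuming an existing warehouse (the data source name and ID value below are placeholders):

```hcl
# Look up an existing SQL warehouse by ID (the ID here is a placeholder).
data "databricks_sql_warehouse" "example" {
  id = "1234567890abcdef"
}

# Surface the serverless-related attributes documented above.
output "is_serverless" {
  value = data.databricks_sql_warehouse.example.enable_serverless_compute
}

output "warehouse_type" {
  value = data.databricks_sql_warehouse.example.warehouse_type
}
```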

## Related Resources
## Related resources

The following resources are often used in the same context:

2 changes: 1 addition & 1 deletion docs/index.md
@@ -49,7 +49,7 @@ Databricks SQL
* Create [databricks_sql_endpoint](resources/sql_endpoint.md) controlled by [databricks_permissions](resources/permissions.md).
* Manage [queries](resources/sql_query.md) and their [visualizations](resources/sql_visualization.md).
* Manage [dashboards](resources/sql_dashboard.md) and their [widgets](resources/sql_widget.md).
* Provide [global configuration for all SQL Endpoints](docs/resources/sql_global_config.md)
* Provide [global configuration for all SQL warehouses](docs/resources/sql_global_config.md)

MLFlow

6 changes: 3 additions & 3 deletions docs/resources/permissions.md
@@ -527,9 +527,9 @@ resource "databricks_permissions" "token_usage" {
}
```

## SQL Endpoint Usage
## SQL warehouse usage

[SQL endpoints](https://docs.databricks.com/sql/user/security/access-control/sql-endpoint-acl.html) have two possible permissions: `CAN_USE` and `CAN_MANAGE`:
[SQL warehouses](https://docs.databricks.com/sql/user/security/access-control/sql-endpoint-acl.html) have two possible permissions: `CAN_USE` and `CAN_MANAGE`:

```hcl
data "databricks_current_user" "me" {}
@@ -693,7 +693,7 @@ Exactly one of the following arguments is required:
- `experiment_id` - [MLflow experiment](mlflow_experiment.md) id
- `registered_model_id` - [MLflow registered model](mlflow_model.md) id
- `authorization` - either [`tokens`](https://docs.databricks.com/administration-guide/access-control/tokens.html) or [`passwords`](https://docs.databricks.com/administration-guide/users-groups/single-sign-on/index.html#configure-password-permission).
- `sql_endpoint_id` - [SQL endpoint](sql_endpoint.md) id
- `sql_endpoint_id` - [SQL warehouse](sql_endpoint.md) id
- `sql_dashboard_id` - [SQL dashboard](sql_dashboard.md) id
- `sql_query_id` - [SQL query](sql_query.md) id
- `sql_alert_id` - [SQL alert](https://docs.databricks.com/sql/user/security/access-control/alert-acl.html) id
32 changes: 18 additions & 14 deletions docs/resources/sql_endpoint.md
@@ -3,7 +3,7 @@ subcategory: "Databricks SQL"
---
# databricks_sql_endpoint Resource

This resource is used to manage [Databricks SQL Endpoints](https://docs.databricks.com/sql/admin/sql-endpoints.html). To create [SQL endpoints](https://docs.databricks.com/sql/get-started/concepts.html) you must have `databricks_sql_access` on your [databricks_group](group.md#databricks_sql_access) or [databricks_user](user.md#databricks_sql_access).
This resource is used to manage [Databricks SQL warehouses](https://docs.databricks.com/sql/admin/sql-endpoints.html). To create [SQL warehouses](https://docs.databricks.com/sql/get-started/concepts.html) you must have `databricks_sql_access` on your [databricks_group](group.md#databricks_sql_access) or [databricks_user](user.md#databricks_sql_access).

## Example usage

@@ -24,39 +24,43 @@ resource "databricks_sql_endpoint" "this" {
}
```

## Argument Reference
## Argument reference

The following arguments are supported:

* `name` - (Required) Name of the SQL endpoint. Must be unique.
* `name` - (Required) Name of the SQL warehouse. Must be unique.
* `cluster_size` - (Required) The size of the clusters allocated to the endpoint: "2X-Small", "X-Small", "Small", "Medium", "Large", "X-Large", "2X-Large", "3X-Large", "4X-Large".
* `min_num_clusters` - Minimum number of clusters available when a SQL endpoint is running. The default is `1`.
* `max_num_clusters` - Maximum number of clusters available when a SQL endpoint is running. This field is required. If multi-cluster load balancing is not enabled, this is default to `1`.
* `auto_stop_mins` - Time in minutes until an idle SQL endpoint terminates all clusters and stops. This field is optional. The default is 120, set to 0 to disable the auto stop.
* `min_num_clusters` - Minimum number of clusters available when a SQL warehouse is running. The default is `1`.
* `max_num_clusters` - Maximum number of clusters available when a SQL warehouse is running. This field is required. If multi-cluster load balancing is not enabled, this defaults to `1`.
* `auto_stop_mins` - Time in minutes until an idle SQL warehouse terminates all clusters and stops. This field is optional. The default is 120; set it to 0 to disable auto stop.
* `tags` - Databricks tags all endpoint resources with these tags.
* `spot_instance_policy` - The spot policy to use for allocating instances to clusters: `COST_OPTIMIZED` or `RELIABILITY_OPTIMIZED`. This field is optional. Default is `COST_OPTIMIZED`.
* `enable_photon` - Whether to enable [Photon](https://databricks.com/product/delta-engine). This field is optional and is enabled by default.
* `enable_serverless_compute` - Whether this SQL endpoint is a Serverless endpoint. To use a Serverless SQL endpoint, you must enable Serverless SQL endpoints for the workspace.
* `enable_serverless_compute` - Whether this SQL warehouse is a serverless SQL warehouse. If this value is explicitly true or true by default, you **must** also set the `warehouse_type` field to `PRO`.

- **For AWS**, Databricks strongly recommends that you always explicitly set this field, especially for organizations with many workspaces. If omitted, the default is false for most workspaces. However, if this workspace used the SQL Warehouses API to create a warehouse between September 1, 2022 and April 30, 2023, the previous default behavior still applies: true if the workspace is enabled for serverless and meets the requirements for serverless SQL warehouses. If your account needs updated [terms of use](https://docs.databricks.com/sql/admin/serverless.html#accept-terms), workspace admins are prompted in the Databricks SQL UI. A workspace must meet the [requirements](https://docs.databricks.com/sql/admin/serverless.html#requirements) and might require an update to its instance profile role to [add a trust relationship](https://docs.databricks.com/sql/admin/serverless.html#aws-instance-profile-setup).

- **For Azure**, you must [enable your workspace for serverless SQL warehouse](https://learn.microsoft.com/azure/databricks/sql/admin/serverless). For Azure, if serverless SQL warehouses are disabled for the workspace, the default is `false`. If serverless SQL warehouses are enabled for the workspace, the default is `true`.
* `channel` block, consisting of following fields:
* `name` - Name of the Databricks SQL release channel. Possible values are: `CHANNEL_NAME_PREVIEW` and `CHANNEL_NAME_CURRENT`. Default is `CHANNEL_NAME_CURRENT`.
* `warehouse_type` - [SQL Warehouse Type](https://docs.databricks.com/sql/admin/sql-endpoints.html#switch-the-sql-warehouse-type-pro-classic-or-serverless): `PRO` or `CLASSIC` (default). If Serverless SQL is enabled, you can only specify `PRO`.
## Attribute Reference
* `warehouse_type` - SQL warehouse type. See the documentation for [AWS](https://docs.databricks.com/sql/admin/sql-endpoints.html#switch-the-sql-warehouse-type-pro-classic-or-serverless) or [Azure](https://docs.databricks.com/sql/admin/sql-endpoints.html#switch-the-sql-warehouse-type-pro-classic-or-serverless). Set to `PRO` or `CLASSIC` (default). To use serverless compute, you must set this field to `PRO` and **also** set `enable_serverless_compute` to `true`.
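As a sketch of the pairing described above (the warehouse name and sizing are illustrative), a serverless warehouse sets both fields together:

```hcl
resource "databricks_sql_endpoint" "serverless" {
  name             = "serverless-example" # illustrative name
  cluster_size     = "Small"
  max_num_clusters = 1

  # Serverless requires BOTH of these: enable_serverless_compute = true
  # is only valid when warehouse_type is "PRO".
  enable_serverless_compute = true
  warehouse_type            = "PRO"
}
```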

## Attribute reference

In addition to all arguments above, the following attributes are exported:

* `jdbc_url` - JDBC connection string.
* `odbc_params` - ODBC connection params: `odbc_params.hostname`, `odbc_params.path`, `odbc_params.protocol`, and `odbc_params.port`.
* `data_source_id` - ID of the data source for this endpoint. This is used to bind a Databricks SQL query to an endpoint.

## Access Control
## Access control

* [databricks_permissions](permissions.md#Job-Endpoint-usage) can control which groups or individual users can *Can Use* or *Can Manage* SQL endpoints.
* [databricks_permissions](permissions.md#Job-Endpoint-usage) can control which groups or individual users have *Can Use* or *Can Manage* permissions on SQL warehouses.
* `databricks_sql_access` on [databricks_group](group.md#databricks_sql_access) or [databricks_user](user.md#databricks_sql_access).

## Timeouts

The `timeouts` block allows you to specify `create` timeouts. It usually takes 10-20 minutes to provision a Databricks SQL endpoint.
The `timeouts` block allows you to specify `create` timeouts. It usually takes 10-20 minutes to provision a Databricks SQL warehouse.

```hcl
timeouts {
@@ -72,7 +76,7 @@ You can import a `databricks_sql_endpoint` resource with ID like the following:
$ terraform import databricks_sql_endpoint.this <endpoint-id>
```

## Related Resources
## Related resources

The following resources are often used in the same context:

1 change: 0 additions & 1 deletion docs/resources/sql_global_config.md
@@ -46,7 +46,6 @@ The following arguments are supported (see [documentation](https://docs.databric

* `security_policy` (Optional, String) - The policy for controlling access to datasets. Default value: `DATA_ACCESS_CONTROL`, consult documentation for list of possible values
* `data_access_config` (Optional, Map) - Data access configuration for [databricks_sql_endpoint](sql_endpoint.md), such as configuration for an external Hive metastore, Hadoop Filesystem configuration, etc. Note that the list of supported configuration properties is limited, so refer to the [documentation](https://docs.databricks.com/sql/admin/data-access-configuration.html#supported-properties) for a full list. Apply will fail if you specify a configuration property that is not permitted.
* `enable_serverless_compute` (optional, Boolean) - Allows the possibility to create Serverless SQL warehouses. Default value: false.
* `instance_profile_arn` (Optional, String) - [databricks_instance_profile](instance_profile.md) used to access storage from [databricks_sql_endpoint](sql_endpoint.md). Please note that this parameter is only for AWS, and will generate an error if used on other clouds.
* `sql_config_params` (Optional, Map) - SQL Configuration Parameters let you override the default behavior for all sessions with all endpoints.
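A minimal sketch combining the arguments above (the instance profile ARN and the session parameter are placeholders); note that, after this change, `enable_serverless_compute` is no longer set here:

```hcl
resource "databricks_sql_global_config" "this" {
  security_policy = "DATA_ACCESS_CONTROL"

  # AWS only; remove on other clouds. The ARN below is a placeholder.
  instance_profile_arn = "arn:aws:iam::123456789012:instance-profile/sql-access"

  # Override default session behavior for all warehouses
  # (parameter name is illustrative).
  sql_config_params = {
    "ANSI_MODE" : "true"
  }
}
```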
