expand parameter documentation for databricks_sql_query resource
crankswagon committed Sep 14, 2022
1 parent af30395 commit 3061cfe
Showing 6 changed files with 57 additions and 21 deletions.
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/feature-request.md
@@ -45,4 +45,4 @@ Are there any other GitHub issues, whether open or closed, that are related to t
- #6017
-->
1 change: 1 addition & 0 deletions docs/guides/unity-catalog-azure.md
@@ -250,6 +250,7 @@ resource "azurerm_role_assignment" "ext_storage" {
  scope                = azurerm_storage_account.ext_storage.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = azapi_resource.ext_access_connector.identity[0].principal_id
}
```

Then create the [databricks_storage_credential](../resources/storage_credential.md) and [databricks_external_location](../resources/external_location.md) in Unity Catalog.
4 changes: 2 additions & 2 deletions docs/resources/group_role.md
@@ -13,8 +13,8 @@ resource "databricks_group" "my_group" {
}
resource "databricks_group_role" "my_group_role" {
  group_id = databricks_group.my_group.id
  role     = "arn:aws:iam::000000000000:role/my-role"
}
```

Expand Down
30 changes: 15 additions & 15 deletions docs/resources/recipient.md
@@ -16,19 +16,19 @@ authenticate to the sharing server to access data. This is for when the recipien

```hcl
resource "random_password" "db2opensharecode" {
  length  = 16
  special = true
}

data "databricks_current_user" "current" {}

resource "databricks_recipient" "db2open" {
  name                = "${data.databricks_current_user.current.alphanumeric}-recipient"
  comment             = "made by terraform"
  authentication_type = "TOKEN"
  sharing_code        = random_password.db2opensharecode.result
  ip_access_list {
    allowed_ip_addresses = [] // fill in allowed IPv4 addresses (CIDR notation allowed)
  }
}
```
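As a rough analogue of the `random_password` resource above, a 16-character code with special characters can be generated with Python's standard `secrets` module. This is purely illustrative; the function name and alphabet below are my own, not part of the provider:

```python
import secrets
import string

def make_sharing_code(length: int = 16) -> str:
    # Roughly mirrors random_password with length = 16 and special = true:
    # draw from letters, digits, and punctuation using a CSPRNG.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

code = make_sharing_code()
print(len(code))  # 16
```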
@@ -46,16 +46,16 @@ resource "databricks_metastore" "recipient_metastore" {
  name = "recipient"
  storage_root = format("abfss://%s@%s.dfs.core.windows.net/",
    azurerm_storage_container.unity_catalog.name,
    azurerm_storage_account.unity_catalog.name)
  delta_sharing_scope                               = "INTERNAL"
  delta_sharing_recipient_token_lifetime_in_seconds = "60000000"
  force_destroy                                     = true
}

resource "databricks_recipient" "db2db" {
  name                               = "${data.databricks_current_user.current.alphanumeric}-recipient"
  comment                            = "made by terraform"
  authentication_type                = "DATABRICKS"
  data_recipient_global_metastore_id = databricks_metastore.recipient_metastore.global_metastore_id
}
```
@@ -75,9 +75,9 @@ The following arguments are required:
Only one `ip_access_list` block is allowed in a recipient. It conflicts with the `DATABRICKS` authentication type.

```hcl
ip_access_list {
  allowed_ip_addresses = ["0.0.0.0/0"]
}
```
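`0.0.0.0/0` in the example above matches every IPv4 address. The following sketch of the CIDR matching semantics uses Python's standard `ipaddress` module; it is my own illustration, not the provider's implementation:

```python
import ipaddress

def is_allowed(client_ip: str, allowed: list[str]) -> bool:
    # An entry may be a single address ("1.2.3.4") or a CIDR block ("10.0.0.0/8").
    ip = ipaddress.ip_address(client_ip)
    return any(ip in ipaddress.ip_network(entry, strict=False) for entry in allowed)

print(is_allowed("10.1.2.3", ["10.0.0.0/8"]))    # True
print(is_allowed("192.168.0.1", ["10.0.0.0/8"])) # False
print(is_allowed("192.168.0.1", ["0.0.0.0/0"]))  # True
```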

Arguments for the `ip_access_list` block are:
39 changes: 37 additions & 2 deletions docs/resources/sql_query.md
@@ -15,8 +15,14 @@ A query may have one or more [visualizations](sql_visualization.md).
resource "databricks_sql_query" "q1" {
  data_source_id = databricks_sql_endpoint.example.data_source_id
  name           = "My Query Name"
  query          = <<EOT
    SELECT {{ p1 }} AS p1
    WHERE 1=1
    AND p2 in ({{ p2 }})
    AND event_date > date '{{ p3 }}'
  EOT
  run_as_role = "viewer"
  schedule {
    continuous {
@@ -32,6 +38,31 @@ resource "databricks_sql_query" "q1" {
    }
  }
  parameter {
    name  = "p2"
    title = "Title for p2"
    enum {
      options = ["default", "foo", "bar"]
      value   = "default"
      // passed to the sql query as the string `"foo","bar"` if foo and bar are both selected in the front end
      multiple {
        prefix    = "\""
        suffix    = "\""
        separator = ","
      }
    }
  }

  parameter {
    name  = "p3"
    title = "Title for p3"
    date {
      value = "2022-01-01"
    }
  }
  tags = [
    "t1",
    "t2",
@@ -66,6 +97,10 @@ You can import a `databricks_sql_query` resource with ID like the following:
$ terraform import databricks_sql_query.this <query-id>
```
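The `multiple` block in the example above controls how a multi-select enum parameter is rendered before it replaces its `{{ ... }}` placeholder in the query text. The sketch below is my own illustration (`render_multiple` and `substitute` are hypothetical helpers; the real substitution happens server-side in Databricks SQL):

```python
import re

def render_multiple(values, prefix='"', suffix='"', separator=','):
    # With prefix = "\"", suffix = "\"", separator = ",", selecting
    # foo and bar yields the single string "foo","bar".
    return separator.join(f"{prefix}{v}{suffix}" for v in values)

def substitute(query, params):
    # Replace each {{ name }} placeholder with the rendered parameter value.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", lambda m: str(params[m.group(1)]), query)

sql = "SELECT {{ p1 }} AS p1 WHERE p2 in ({{ p2 }}) AND event_date > date '{{ p3 }}'"
rendered = substitute(sql, {
    "p1": 1,
    "p2": render_multiple(["foo", "bar"]),
    "p3": "2022-01-01",
})
print(rendered)
# SELECT 1 AS p1 WHERE p2 in ("foo","bar") AND event_date > date '2022-01-01'
```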

## Troubleshooting

If you see `Error: cannot create sql query: Internal Server Error` during `terraform apply`, double-check that you are using the correct [`data_source_id`](sql_endpoint.md).

## Related Resources

The following resources are often used in the same context:
2 changes: 1 addition & 1 deletion docs/resources/storage_credential.md
@@ -34,7 +34,7 @@ For Azure

```hcl
data "azurerm_resource_group" "this" {
  name = "example-rg"
}
resource "azapi_resource" "access_connector" {
