---
subcategory: "Unity Catalog"
---
-> **Note** This article refers to the privileges and inheritance model in Privilege Model version 1.0. If you created your metastore during the public preview (before August 25, 2022), you can upgrade to Privilege Model version 1.0 by following Upgrade to privilege inheritance.
-> **Note** Unity Catalog APIs are accessible via workspace-level APIs. This design may change in the future. Because Unity Catalog is decoupled from specific workspaces, account-level principal grants can be assigned with any valid workspace. More information is available in the official documentation.
In Unity Catalog, all users initially have no access to data. Only metastore admins can create objects and can grant or revoke access on individual objects to users and groups. Every securable object in Unity Catalog has an owner. The owner can be any account-level user or group, referred to generally as principals. The principal that creates an object becomes its owner. Owners receive `ALL_PRIVILEGES` on the securable object (e.g., `SELECT` and `MODIFY` on a table), as well as the permission to grant privileges to other principals.
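As an illustrative sketch of ownership in practice, Unity Catalog resources such as databricks_catalog expose an `owner` argument that assigns an owning principal explicitly (the group name below is assumed, not from this article):

```hcl
# Hypothetical example: assign an account-level group as the catalog owner.
# "Data Platform Admins" is an assumed group name.
resource "databricks_catalog" "owned" {
  metastore_id = databricks_metastore.this.id
  name         = "owned_catalog"
  owner        = "Data Platform Admins" # the owner receives ALL_PRIVILEGES
}
```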
Securable objects are hierarchical and privileges are inherited downward. The highest level object that privileges are inherited from is the catalog. This means that granting a privilege on a catalog or schema automatically grants the privilege to all current and future objects within the catalog or schema. Privileges that are granted on a metastore are not inherited.
Every `databricks_grants` resource must have exactly one securable identifier and one or more `grant` blocks with the following arguments:

- `principal` - User name, group name, or service principal application ID.
- `privileges` - One or more privileges that are specific to a securable type.
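For illustration, a minimal skeleton with one securable identifier (here `catalog`, with an assumed name) and two `grant` blocks might look like:

```hcl
# Hypothetical sketch: one securable identifier plus one or more grant blocks.
resource "databricks_grants" "example" {
  catalog = "my_catalog" # exactly one securable identifier per resource

  grant {
    principal  = "Data Engineers" # a group name
    privileges = ["USE_CATALOG"]
  }

  grant {
    principal  = "11111111-2222-3333-4444-555555555555" # an illustrative service principal application ID
    privileges = ["USE_CATALOG"]
  }
}
```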
The securable objects are:

- `METASTORE`: The top-level container for metadata. Each metastore exposes a three-level namespace (`catalog`.`schema`.`table`) that organizes your data.
- `CATALOG`: The first layer of the object hierarchy, used to organize your data assets.
- `SCHEMA`: Also known as databases, schemas are the second layer of the object hierarchy and contain tables and views.
- `TABLE`: The lowest level in the object hierarchy; tables can be external (stored in external locations in your cloud storage of choice) or managed (stored in a storage container in your cloud storage that you create expressly for UC).
- `VIEW`: A read-only object created from one or more tables that is contained within a schema.
- `EXTERNAL LOCATION`: An object that contains a reference to a storage credential and a cloud storage path and is contained within a metastore.
- `STORAGE CREDENTIAL`: An object that encapsulates a long-term cloud credential that provides access to cloud storage and is contained within a metastore.
- `SHARE`: A logical grouping for the tables you intend to share using Delta Sharing. A share is contained within a Unity Catalog metastore.
Terraform will handle any configuration drift on every `terraform apply` run, even when grants are changed outside of the Terraform state. All permissions for a given securable must be defined in a single `databricks_grants` resource; otherwise Terraform cannot guarantee that configuration drift is corrected.
Unlike in the SQL specification, all privileges must be written with an underscore instead of a space, e.g. `CREATE_TABLE` and not `CREATE TABLE`. The sections below summarize which privilege types apply to each securable object in the catalog:
You can grant `CREATE_CATALOG`, `CREATE_EXTERNAL_LOCATION`, `CREATE_SHARE`, `CREATE_RECIPIENT`, and `CREATE_PROVIDER` privileges to the databricks_metastore ID specified in the `metastore` attribute:
```hcl
resource "databricks_grants" "sandbox" {
  metastore = databricks_metastore.this.id

  grant {
    principal  = "Data Engineers"
    privileges = ["CREATE_CATALOG", "CREATE_EXTERNAL_LOCATION"]
  }

  grant {
    principal  = "Data Sharer"
    privileges = ["CREATE_RECIPIENT", "CREATE_SHARE"]
  }
}
```
You can grant `ALL_PRIVILEGES`, `CREATE_SCHEMA`, and `USE_CATALOG` privileges to the databricks_catalog specified in the `catalog` attribute. You can also grant `CREATE_FUNCTION`, `CREATE_TABLE`, `EXECUTE`, `MODIFY`, `SELECT`, and `USE_SCHEMA` at the catalog level to apply them to the pertinent current and future securable objects within the catalog:
```hcl
resource "databricks_catalog" "sandbox" {
  metastore_id = databricks_metastore.this.id
  name         = "sandbox"
  comment      = "this catalog is managed by terraform"
  properties = {
    purpose = "testing"
  }
}

resource "databricks_grants" "sandbox" {
  catalog = databricks_catalog.sandbox.name

  grant {
    principal  = "Data Scientists"
    privileges = ["USE_CATALOG", "USE_SCHEMA", "CREATE_TABLE", "SELECT"]
  }

  grant {
    principal  = "Data Engineers"
    privileges = ["USE_CATALOG", "USE_SCHEMA", "CREATE_SCHEMA", "CREATE_TABLE", "MODIFY"]
  }

  grant {
    principal  = "Data Analyst"
    privileges = ["USE_CATALOG", "USE_SCHEMA", "SELECT"]
  }
}
```
You can grant `ALL_PRIVILEGES`, `CREATE_FUNCTION`, `CREATE_TABLE`, and `USE_SCHEMA` privileges to the `catalog.schema` specified in the `schema` attribute. You can also grant `EXECUTE`, `MODIFY`, and `SELECT` at the schema level to apply them to the pertinent current and future securable objects within the schema:
```hcl
resource "databricks_schema" "things" {
  catalog_name = databricks_catalog.sandbox.id
  name         = "things"
  comment      = "this schema is managed by terraform"
  properties = {
    kind = "various"
  }
}

resource "databricks_grants" "things" {
  schema = databricks_schema.things.id

  grant {
    principal  = "Data Engineers"
    privileges = ["USE_SCHEMA", "MODIFY"]
  }
}
```
You can grant `ALL_PRIVILEGES`, `SELECT`, and `MODIFY` privileges to the `catalog.schema.table` specified in the `table` attribute:
```hcl
resource "databricks_grants" "customers" {
  table = "main.reporting.customers"

  grant {
    principal  = "Data Engineers"
    privileges = ["MODIFY", "SELECT"]
  }

  grant {
    principal  = "Data Analysts"
    privileges = ["SELECT"]
  }
}
```
You can also apply grants dynamically with the databricks_tables data source:
```hcl
data "databricks_tables" "things" {
  catalog_name = "sandbox"
  schema_name  = "things"
}

resource "databricks_grants" "things" {
  for_each = data.databricks_tables.things.ids
  table    = each.value

  grant {
    principal  = "sensitive"
    privileges = ["SELECT", "MODIFY"]
  }
}
```
You can grant `ALL_PRIVILEGES` and `SELECT` privileges to the `catalog.schema.view` specified in the `table` attribute:
```hcl
resource "databricks_grants" "customer360" {
  table = "main.reporting.customer360"

  grant {
    principal  = "Data Analysts"
    privileges = ["SELECT"]
  }
}
```
You can also apply grants dynamically with the databricks_views data source:
```hcl
data "databricks_views" "customers" {
  catalog_name = "main"
  schema_name  = "customers"
}

resource "databricks_grants" "customers" {
  for_each = data.databricks_views.customers.ids
  table    = each.value

  grant {
    principal = "sensitive"
    # views only support ALL_PRIVILEGES and SELECT
    privileges = ["SELECT"]
  }
}
```
You can grant `ALL_PRIVILEGES`, `CREATE_EXTERNAL_TABLE`, `READ_FILES`, and `WRITE_FILES` privileges to the databricks_storage_credential ID specified in the `storage_credential` attribute:
```hcl
resource "databricks_storage_credential" "external" {
  name = aws_iam_role.external_data_access.name
  aws_iam_role {
    role_arn = aws_iam_role.external_data_access.arn
  }
  comment = "Managed by TF"
}

resource "databricks_grants" "external_creds" {
  storage_credential = databricks_storage_credential.external.id

  grant {
    principal  = "Data Engineers"
    privileges = ["CREATE_EXTERNAL_TABLE"]
  }
}
```
You can grant `ALL_PRIVILEGES`, `CREATE_EXTERNAL_TABLE`, `CREATE_MANAGED_STORAGE`, `READ_FILES`, and `WRITE_FILES` privileges to the databricks_external_location ID specified in the `external_location` attribute:
```hcl
resource "databricks_external_location" "some" {
  name            = "external"
  url             = "s3://${aws_s3_bucket.external.id}/some"
  credential_name = databricks_storage_credential.external.id
  comment         = "Managed by TF"
}

resource "databricks_grants" "some" {
  external_location = databricks_external_location.some.id

  grant {
    principal  = "Data Engineers"
    privileges = ["CREATE_EXTERNAL_TABLE", "READ_FILES"]
  }
}
```
You can grant `SELECT` to a databricks_recipient on the databricks_share name specified in the `share` attribute:
```hcl
resource "databricks_share" "some" {
  name = "my_share"
}

resource "databricks_recipient" "some" {
  name                = "my_recipient"
  authentication_type = "TOKEN"
}

resource "databricks_grants" "some" {
  share = databricks_share.some.name

  grant {
    principal  = databricks_recipient.some.name
    privileges = ["SELECT"]
  }
}
```
You can control Databricks general permissions through the databricks_permissions resource.
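As an illustrative sketch of that resource (the `databricks_cluster.shared` reference is an assumed resource, not defined in this article), granting a group the ability to restart a cluster might look like:

```hcl
# Hypothetical example: workspace-level permissions are managed separately
# from Unity Catalog grants. databricks_cluster.shared is an assumed resource.
resource "databricks_permissions" "cluster_usage" {
  cluster_id = databricks_cluster.shared.id

  access_control {
    group_name       = "Data Engineers"
    permission_level = "CAN_RESTART"
  }
}
```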