---
subcategory: "Security"
---
-> **Note** If you have a fully automated setup with workspaces created by `databricks_mws_workspaces` or `azurerm_databricks_workspace`, make sure to add the `depends_on` attribute to prevent `default auth: cannot configure default credentials` errors.
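A minimal sketch of that pattern, assuming a workspace managed elsewhere in the same configuration under the illustrative name `databricks_mws_workspaces.this`:

```hcl
data "databricks_current_user" "me" {
  # Force Terraform to create the workspace first, so that provider
  # authentication is configured before this data source is read.
  depends_on = [databricks_mws_workspaces.this]
}
```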
Retrieves information about the `databricks_user` or `databricks_service_principal` that is calling the Databricks REST API. This can be useful when the same Terraform configuration is applied by different users in a shared workspace for testing purposes.
Create a personalized `databricks_job` and `databricks_notebook`:
```hcl
data "databricks_current_user" "me" {}

data "databricks_spark_version" "latest" {}

data "databricks_node_type" "smallest" {
  local_disk = true
}

resource "databricks_notebook" "this" {
  path           = "${data.databricks_current_user.me.home}/Terraform"
  language       = "PYTHON"
  content_base64 = base64encode(<<-EOT
    # created from ${abspath(path.module)}
    display(spark.range(10))
    EOT
  )
}

resource "databricks_job" "this" {
  name = "Terraform Demo (${data.databricks_current_user.me.alphanumeric})"

  new_cluster {
    num_workers   = 1
    spark_version = data.databricks_spark_version.latest.id
    node_type_id  = data.databricks_node_type.smallest.id
  }

  notebook_task {
    notebook_path = databricks_notebook.this.path
  }
}

output "notebook_url" {
  value = databricks_notebook.this.url
}

output "job_url" {
  value = databricks_job.this.url
}
```
This data source exposes the following attributes:
- `id` - The id of the calling user.
- `external_id` - ID of the user in an external identity provider.
- `user_name` - Name of the user, e.g. `mr.foo@example.com`. If the currently logged-in identity is a service principal, returns the application ID, e.g. `11111111-2222-3333-4444-555666777888`.
- `home` - Home folder of the user, e.g. `/Users/mr.foo@example.com`.
- `repos` - Personal Repos location of the user, e.g. `/Repos/mr.foo@example.com`.
- `alphanumeric` - Alphanumeric representation of user local name, e.g. `mr_foo`.
- `workspace_url` - URL of the current Databricks workspace.
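As an illustration, the `repos` and `alphanumeric` attributes can be combined to give each user their own Repo checkout; the repository URL and the `demo` names below are placeholders:

```hcl
data "databricks_current_user" "me" {}

# Checks out the repository into the calling user's personal Repos folder,
# e.g. /Repos/mr.foo@example.com/demo_mr_foo
resource "databricks_repo" "demo" {
  url  = "https://github.com/user/demo.git" # placeholder URL
  path = "${data.databricks_current_user.me.repos}/demo_${data.databricks_current_user.me.alphanumeric}"
}
```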
The following resources are used in the same context:
- End to end workspace management guide
- `databricks_directory` to manage directories in Databricks Workspace.
- `databricks_notebook` to manage Databricks Notebooks.
- `databricks_repo` to manage Databricks Repos.