---
subcategory: "Workspace"
---
This resource allows you to manage Databricks Notebooks. You can also work with the `databricks_notebook` and `databricks_notebook_paths` data sources.
You can declare a Terraform-managed notebook by specifying the `source` attribute of a corresponding local file. Only the `.scala`, `.py`, `.sql`, `.r`, and `.ipynb` extensions are supported if you would like to omit the `language` attribute.
data "databricks_current_user" "me" {
}
resource "databricks_notebook" "ddl" {
source = "${path.module}/DDLgen.py"
path = "${data.databricks_current_user.me.home}/AA/BB/CC"
}
You can also create a managed notebook with inline source through the `content_base64` and `language` attributes.
resource "databricks_notebook" "notebook" {
content_base64 = base64encode(<<-EOT
# created from ${abspath(path.module)}
display(spark.range(10))
EOT
)
path = "/Shared/Demo"
language = "PYTHON"
}
You can also manage Databricks Archives to import whole folders of notebooks statically. Whenever you update the `.dbc` file, the Terraform-managed notebook folder is removed and replaced with the contents of the new `.dbc` file. You are strongly advised to use the `.dbc` format only with the `source` attribute of the resource:
resource "databricks_notebook" "lesson" {
source = "${path.module}/IntroNotebooks.dbc"
path = "/Shared/Intro"
}
-> **Note** A notebook in the Databricks workspace is only changed if the Terraform state changed. This means that any manual changes to a managed notebook won't be overwritten by Terraform if there is no local change to the notebook sources. Notebooks are identified by their path, so renaming a notebook manually in the workspace and then applying the Terraform state results in the notebook being recreated from the Terraform state.
The size of a notebook's source code must not exceed a few megabytes. The following arguments are supported:
* `path` - (Required) The absolute path of the notebook or directory, beginning with "/", e.g. "/Demo".
* `source` - Path to a notebook in source code format on the local filesystem. Conflicts with `content_base64`.
* `content_base64` - The base64-encoded notebook source code. Conflicts with `source`. Use of `content_base64` is discouraged, as it increases the memory footprint of the Terraform state and should only be used in exceptional circumstances, like creating a notebook with configuration properties for a data pipeline.
* `language` - (Required with `content_base64`) One of `SCALA`, `PYTHON`, `SQL`, `R`.
In addition to all arguments above, the following attributes are exported:

* `id` - Path of the notebook in the workspace.
* `url` - Routable URL of the notebook.
* `object_id` - Unique identifier for a NOTEBOOK.
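These attributes can be referenced like any other, for example as outputs. A minimal sketch, assuming the `databricks_notebook.ddl` resource from the first example:

```hcl
output "notebook_url" {
  # Routable URL of the managed notebook
  value = databricks_notebook.ddl.url
}

output "notebook_object_id" {
  # Workspace-internal identifier of the notebook
  value = databricks_notebook.ddl.object_id
}
```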
- `databricks_permissions` can control which groups or individual users can access notebooks or folders, as shown in the sketch below.
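A minimal sketch of such a grant, assuming the `databricks_notebook.ddl` resource from the first example and a hypothetical `data-engineers` group:

```hcl
resource "databricks_permissions" "notebook_usage" {
  notebook_path = databricks_notebook.ddl.path

  access_control {
    group_name       = "data-engineers" # hypothetical group name
    permission_level = "CAN_READ"
  }
}
```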
The resource notebook can be imported using the notebook path:
```bash
terraform import databricks_notebook.this /path/to/notebook
```
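On Terraform 1.5 and later, an `import` block can be used instead of the CLI command; a sketch assuming the same notebook path:

```hcl
import {
  to = databricks_notebook.this
  id = "/path/to/notebook"
}
```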
The following resources are often used in the same context:
- End to end workspace management guide.
- databricks_cluster to create Databricks Clusters.
- databricks_directory to manage directories in Databricks Workspace.
- databricks_job to manage Databricks Jobs to run non-interactive code in a databricks_cluster.
- databricks_notebook data to export a notebook from Databricks Workspace.
- databricks_notebook_paths data to list notebooks in Databricks Workspace.
- databricks_pipeline to deploy Delta Live Tables.
- databricks_repo to manage Databricks Repos.
- databricks_secret to manage secrets in Databricks workspace.
- databricks_secret_acl to manage access to secrets in Databricks workspace.
- databricks_secret_scope to create secret scopes in Databricks workspace.
- databricks_user to manage users that could be added to databricks_group within the workspace.
- databricks_user data to retrieve information about databricks_user.