
[ISSUE] DATABRICKS_CONFIG_FILE environment variable doesn't get used #420

Closed
thaiphv opened this issue Nov 27, 2020 · 2 comments · Fixed by #421
thaiphv commented Nov 27, 2020

Terraform Version

$ terraform -version
Terraform v0.13.5
+ provider registry.terraform.io/databrickslabs/databricks v0.2.8
+ provider registry.terraform.io/hashicorp/aws v3.16.0
+ provider registry.terraform.io/hashicorp/external v2.0.0

Affected Resource(s)

This affects how the Databricks client gets authenticated.

Environment variable names

$ env | sort | grep -E 'DATABRICKS|AWS|AZURE|ARM|TEST' | awk -F= '{print $1}'
DATABRICKS_CONFIG_FILE
DATABRICKS_CONFIG_PROFILE

Terraform Configuration Files

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 3.0"
    }
    databricks = {
      source  = "databrickslabs/databricks"
      version = "~> 0.2.8"
    }
  }

  required_version = "~> 0.13.5"
}

provider "databricks" {
  profile = "dev"
}

resource "databricks_dbfs_file" "init_script" {
  content              = base64encode("Init Script")
  content_b64_md5      = md5(base64encode("Init Script"))
  path                 = "/databricks/scripts/script.sh"
  overwrite            = true
  mkdirs               = true
  validate_remote_file = true
}

Debug Output

$ TF_LOG=DEBUG terraform plan 2>&1 | grep databricks | sed -E 's/^.* plugin[^:]+: (.*)$/\1/'
2020-11-27T15:33:31.566+1100 [DEBUG] plugin: starting plugin: path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 args=[.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8]
path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 pid=79748
2020-11-27T15:33:31.569+1100 [DEBUG] plugin: waiting for RPC address: path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8
Databricks Terraform Provider (experimental)

Version 0.2.8

https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs

configuring server automatic mTLS: timestamp=2020-11-27T15:33:31.582+1100
address=/var/folders/f6/kfyx6q891cxd067m2pf9d7qm0000gn/T/plugin909495059 network=unix timestamp=2020-11-27T15:33:31.609+1100
path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 pid=79748
2020/11/27 15:33:31 [DEBUG] ProviderTransformer: "databricks_dbfs_file.init_script" (*terraform.NodeValidatableResource) needs provider["registry.terraform.io/databrickslabs/databricks"]
2020/11/27 15:33:31 [DEBUG] ReferenceTransformer: "databricks_dbfs_file.init_script" references: []
2020/11/27 15:33:31 [DEBUG] ReferenceTransformer: "provider[\"registry.terraform.io/databrickslabs/databricks\"]" references: []
2020-11-27T15:33:31.936+1100 [DEBUG] plugin: starting plugin: path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 args=[.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8]
path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 pid=79750
2020-11-27T15:33:31.940+1100 [DEBUG] plugin: waiting for RPC address: path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8
Databricks Terraform Provider (experimental)

Version 0.2.8

https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs

configuring server automatic mTLS: timestamp=2020-11-27T15:33:31.953+1100
address=/var/folders/f6/kfyx6q891cxd067m2pf9d7qm0000gn/T/plugin708641557 network=unix timestamp=2020-11-27T15:33:31.980+1100
path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 pid=79750
2020/11/27 15:33:32 [DEBUG] ProviderTransformer: "databricks_dbfs_file.init_script (expand)" (*terraform.nodeExpandRefreshableManagedResource) needs provider["registry.terraform.io/databrickslabs/databricks"]
2020/11/27 15:33:32 [DEBUG] ReferenceTransformer: "provider[\"registry.terraform.io/databrickslabs/databricks\"]" references: []
2020/11/27 15:33:32 [DEBUG] ReferenceTransformer: "databricks_dbfs_file.init_script (expand)" references: []
2020-11-27T15:33:32.072+1100 [DEBUG] plugin: starting plugin: path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 args=[.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8]
path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 pid=79751
2020-11-27T15:33:32.076+1100 [DEBUG] plugin: waiting for RPC address: path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8
Databricks Terraform Provider (experimental)

Version 0.2.8

https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs

configuring server automatic mTLS: timestamp=2020-11-27T15:33:32.089+1100
address=/var/folders/f6/kfyx6q891cxd067m2pf9d7qm0000gn/T/plugin880391322 network=unix timestamp=2020-11-27T15:33:32.116+1100
databricks_dbfs_file.init_script: Refreshing state... [id=/databricks/scripts/script.sh]
[INFO] ~/.databrickscfg not found on current host
3. azure_databricks_workspace_id + AZ CLI authentication.
4. azure_databricks_workspace_id + azure_client_id + azure_client_secret + azure_tenant_id for Azure Service Principal authentication.
5. Run `databricks configure --token` that will create ~/.databrickscfg file.
Please check https://github.com/databrickslabs/terraform-provider-databricks/blob/master/docs/index.md#authentication for details
path=.terraform/plugins/registry.terraform.io/databrickslabs/databricks/0.2.8/darwin_amd64/terraform-provider-databricks_v0.2.8 pid=79751

Panic Output

N/A

Expected Behavior

Terraform should have been able to generate a plan:

Terraform will perform the following actions:

  # databricks_dbfs_file.init_script must be replaced
-/+ resource "databricks_dbfs_file" "init_script" {
      ~ content              = "SGVsbG8gV29ybGQ=" -> "SW5pdCBTY3JpcHQ=" # forces replacement
      ~ content_b64_md5      = "e87940b6df45d47268e367509f466873" -> "f2a8db2419723f6a64787a5a63554d77" # forces replacement
      ~ file_size            = 11 -> (known after apply)
      ~ id                   = "/databricks/scripts/script.sh" -> (known after apply)
        mkdirs               = true
        overwrite            = true
        path                 = "/databricks/scripts/script.sh"
        validate_remote_file = true
    }

Plan: 1 to add, 0 to change, 1 to destroy.

This plan was generated successfully when I placed .databrickscfg in my home directory.

Actual Behavior

The provider failed with the error:

Error: Authentication is not configured for provider. Please configure it
through one of the following options:
1. DATABRICKS_HOST + DATABRICKS_TOKEN environment variables.
2. host + token provider arguments.
3. azure_databricks_workspace_id + AZ CLI authentication.
4. azure_databricks_workspace_id + azure_client_id + azure_client_secret + azure_tenant_id for Azure Service Principal authentication.
5. Run `databricks configure --token` that will create ~/.databrickscfg file.

Please check https://github.com/databrickslabs/terraform-provider-databricks/blob/master/docs/index.md#authentication for details
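A minimal sketch of the config-file lookup order the reporter expects, written as a shell function (illustrative only; this is not the provider's actual code, and the Jenkins-style path is made up):

```shell
# Prefer DATABRICKS_CONFIG_FILE if set, otherwise fall back to
# ~/.databrickscfg -- the behavior the reporter expects from the provider.
resolve_config_file() {
  if [ -n "${DATABRICKS_CONFIG_FILE}" ]; then
    echo "${DATABRICKS_CONFIG_FILE}"
  else
    echo "${HOME}/.databrickscfg"
  fi
}

DATABRICKS_CONFIG_FILE=/var/jenkins/secrets/databrickscfg
resolve_config_file   # prints /var/jenkins/secrets/databrickscfg
```

The debug output above suggests the provider skips the first step and only checks `~/.databrickscfg` directly.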

Steps to Reproduce

  1. Export DATABRICKS_CONFIG_FILE (and DATABRICKS_CONFIG_PROFILE) pointing at a config file outside the home directory, with no ~/.databrickscfg present.
  2. terraform apply

Important Factoids

N/A

nfx commented Nov 27, 2020

@thaiphv Can you use the ~/.databrickscfg file in your home directory?

I'm wondering if we should remove the DATABRICKS_CONFIG_FILE env variable.

Pull Requests welcome, as always.

thaiphv commented Nov 28, 2020

Hi @nfx, as I mentioned, I was able to run terraform plan when I placed .databrickscfg in my home directory. However, we need to point the provider at a configuration file in a custom location: we store the file as a credential in Jenkins and pass its path to our pipelines via the DATABRICKS_CONFIG_FILE environment variable.
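The Jenkins use case can be sketched as follows; the path, host, and token are placeholder values, and the `[dev]` section mirrors the `profile = "dev"` provider argument from the configuration above:

```shell
# Write a minimal Databricks CLI config file at a custom path
# (host and token are placeholders):
cat > /tmp/databrickscfg <<'EOF'
[dev]
host  = https://example.cloud.databricks.com
token = dapi0000000000000000
EOF

# Point the provider at it instead of ~/.databrickscfg:
export DATABRICKS_CONFIG_FILE=/tmp/databrickscfg
export DATABRICKS_CONFIG_PROFILE=dev

# terraform plan   # the provider should now read /tmp/databrickscfg
```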
