
[ISSUE] Issue with databricks provider m2m authentication #2042

Closed
BostjanBozic opened this issue Feb 24, 2023 · 9 comments

@BostjanBozic

Configuration

provider "databricks" {
  alias          = "mws"
  host           = "https://accounts.cloud.databricks.com"
  account_id     = var.databricks_account.account_id
  client_id      = var.databricks_account.client_id
  client_secret  = var.databricks_account.client_secret
  token_endpoint = "https://accounts.cloud.databricks.com/oidc/accounts/${var.databricks_account.account_id}/v1/token"
}

Expected Behavior

Authentication works without any issues.

Actual Behavior

Authentication fails with the following error:

╷
│ Error: Unsupported argument
│ 
│   on databricks/provider.tf line 7, in provider "databricks":
│    7:   token_endpoint = "https://accounts.cloud.databricks.com/oidc/accounts/${var.databricks_account.account_id}/v1/token"
│ 
│ An argument named "token_endpoint" is not expected here.

Steps to Reproduce

  1. terraform apply

Terraform and provider versions

Terraform v1.3.9
on darwin_arm64
+ provider registry.terraform.io/databricks/databricks v1.10.0

Debug Output

Important Factoids

Yes, the important thing is that we have the "Service Principal OAuth token on Databricks account level" private preview enabled, which is why we are using the token_endpoint parameter in the provider configuration. It worked with provider version v1.9.2, but it no longer works with v1.10.0.

I would expect the problem is the migration to the Go SDK for configuration and the HTTP client (#1848). I understand this is in private preview, but I am posting this here so that once the feature is released, it will most likely have to be supported in the Go SDK.

@alexott
Contributor

alexott commented Feb 24, 2023

You don't need this parameter anymore; it's automatically handled by the Go SDK: https://github.com/databricks/databricks-sdk-go/blob/main/config/auth_m2m.go#L45
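
For reference, this is roughly what the account-level provider block from this issue looks like with token_endpoint removed (a sketch based on the configuration above; per the comments here, with provider >= 1.10 the Go SDK derives the OIDC token endpoint from host and account_id):

# Sketch: the same account-level provider as above, minus token_endpoint.
# The Go SDK builds the token endpoint itself from host and account_id.
provider "databricks" {
  alias         = "mws"
  host          = "https://accounts.cloud.databricks.com"
  account_id    = var.databricks_account.account_id
  client_id     = var.databricks_account.client_id
  client_secret = var.databricks_account.client_secret
}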

@BostjanBozic
Author

@alexott Thank you for pointing this out; sometimes it makes sense to just try it without the parameter :)

I tried it without the parameter, but in that case all resources that use the account-level provider fail:

╷
│ Error: cannot read mws vpc endpoint: Cannot complete request; user is unauthenticated
│ 
│   with module.databricks.databricks_mws_vpc_endpoint.api,
│   on databricks/network.tf line 1, in resource "databricks_mws_vpc_endpoint" "api":
│    1: resource "databricks_mws_vpc_endpoint" "api" {
│ 
╵
╷
│ Error: cannot read mws vpc endpoint: Cannot complete request; user is unauthenticated
│ 
│   with module.databricks.databricks_mws_vpc_endpoint.relay,
│   on databricks/network.tf line 9, in resource "databricks_mws_vpc_endpoint" "relay":
│    9: resource "databricks_mws_vpc_endpoint" "relay" {
│ 
╵

As mentioned, if I use provider v1.9.2, everything works without any issues. Is there also a difference in parameters when using service principal authentication?

@alexott
Contributor

alexott commented Feb 24, 2023

I'm not aware of it. @nfx - do you remember what has changed there?

@nkvuong nkvuong changed the title [ISSUE] Issue with databricks provider authentication [ISSUE] Issue with databricks provider m2m authentication Feb 24, 2023
@nfx
Contributor

nfx commented Feb 24, 2023

@BostjanBozic token_endpoint can be removed - it's determined automatically since 1.10 and the migration to the Go SDK. Let me know if it doesn't work - happy to jump on a call. Please ping me on my Databricks email address and CC your Solutions Architect and Account Executive.

@BostjanBozic
Author

@nfx Thank you for the feedback. I already tried removing it, and the error I get is the one above (user is unauthenticated).

Perfect, I will send you an email regarding this and we can schedule a short call.

@BostjanBozic
Author

Thanks @nfx for the call today. This issue was resolved by renaming the DEFAULT profile in ~/.databrickscfg to something else. The problem was (to my understanding) that the provider was taking credentials from the config file even though credentials were configured within the provider specification.
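
For anyone else hitting this, the workaround amounts to making sure the provider does not implicitly pick up a DEFAULT profile from ~/.databrickscfg. A sketch (the profile name and contents below are placeholders, not the actual file from this issue):

# ~/.databrickscfg (placeholder values)
# Renaming [DEFAULT] means this profile is only used when explicitly
# referenced, so the credentials declared in the provider block apply.
[no-longer-default]
host = https://accounts.cloud.databricks.com
# ... credentials that previously lived under [DEFAULT] ...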

@nfx
Contributor

nfx commented Feb 24, 2023

databricks/databricks-sdk-go#315 preparing the fix

@vmazobhc

vmazobhc commented Mar 2, 2023

Just switched the Databricks provider from 1.9.2 to 1.10 and I'm seeing lots of auth errors in Terraform with Azure Databricks.

What is going on with the "not authorized" errors and what is triggering them? With 1.9.2 there is no issue. What do I need to adjust?

@nfx

╷
│ Error: cannot read global init script: User not authorized
│ 
│   with module.engineering_workspace.module.metastore_init.databricks_global_init_script.query,
│   on ../../../../../modules/azure/databricks/external-metastore/main.tf line 1, in resource "databricks_global_init_script" "query":
│    1: resource "databricks_global_init_script" "query" {
│ 
╵
╷
│ Error: cannot read global init script: User not authorized
│ 
│   with module.integration_workspace.module.metastore_init.databricks_global_init_script.query,
│   on ../../../../../modules/azure/databricks/external-metastore/main.tf line 1, in resource "databricks_global_init_script" "query":
│    1: resource "databricks_global_init_script" "query" {
│ 
╵
╷
│ Error: User not authorized
│ 
│   with module.engineering_workspace.data.databricks_spark_version.bhg,
│   on ../../../../../modules/azure/databricks/workspace/clusters.tf line 1, in data "databricks_spark_version" "bhg":
│    1: data "databricks_spark_version" "bhg" {
│ 
╵
╷
│ Error: inner token: token error: {"error":"invalid_request","error_description":"Temporarily throttled, too many requests"}
│ 
│   with module.integration_workspace.data.databricks_spark_version.bhg,
│   on ../../../../../modules/azure/databricks/workspace/clusters.tf line 1, in data "databricks_spark_version" "bhg":
│    1: data "databricks_spark_version" "bhg" {
│ 
╵

Contributor

nfx commented Mar 2, 2023

I see a "temporarily throttled" error message in the stack trace. Please open a new issue with details on how authentication is configured.
