
[ISSUE] Error deploying workspace to https://accounts-dod.cloud.databricks.us/ #998

Open
airizarryDB opened this issue Jul 24, 2024 · 0 comments


Description
Received the following error while deploying a workspace to https://accounts-dod.cloud.databricks.us/ via Terraform.

Reproduction
I used https://github.com/databricks/terraform-databricks-sra. I've added a branch for GovCloud support, which should be merged shortly.

Expected behavior
I should be able to register my network object via the API and then eventually have a functional workspace.

Is it a regression?
This never worked, to my knowledge.

Debug Logs
│ Error: cannot create mws networks: unexpected error handling request: invalid character 'M' looking for beginning of value. This is likely a bug in the Databricks SDK for Go or the underlying REST API. Please report this issue with the following debugging information to the SDK issue tracker at https://github.com/databricks/databricks-sdk-go/issues. Request log:
│ ```
│ POST /api/2.0/accounts/b5d6deaa-82e8-4527-9ec5-51f9f305e6b3/networks
│ > * Host:
│ > * Accept: application/json
│ > * Authorization: REDACTED
│ > * Content-Type: application/json
│ > * Traceparent: 00-cb34323e01b10698fd81a0d1ebb66320-08782eb9d77bb045-01
│ > * User-Agent: databricks-tf-provider/1.46.0 databricks-sdk-go/0.41.0 go/1.21.10 os/darwin terraform/1.9.2 resource/mws_networks auth/oauth-m2m
│ > {
│ > "account_id": "b5d6deaa-82e8-4527-9ec5-51f9f305e6b3",
│ > "network_name": "govsratest-network",
│ > "security_group_ids": [
│ > "sg-0198d0c8f444c1a67"
│ > ],
│ > "subnet_ids": [
│ > "subnet-0893f61154c9d5d41",
│ > "subnet-04f95decf18ef5ba0",
│ > "subnet-0780582ec842b2467"
│ > ],
│ > "vpc_endpoints": {
│ > "dataplane_relay": [
│ > "81a167ad-ebed-4fb0-8a1a-aaa0f3fb5a54"
│ > ],
│ > "rest_api": [
│ > "3b7d15eb-4b21-4487-8d04-d7e540efdb1f"
│ > ]
│ > },
│ > "vpc_id": "vpc-075f51cd4addb76b4"
│ > }
│ < HTTP/2.0 400 Bad Request
│ < * Content-Type: text/plain; charset=utf-8
│ < * Date: Wed, 24 Jul 2024 21:19:40 GMT
│ < * Server: databricks
│ < * Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
│ < * Vary: Accept-Encoding
│ < * X-Content-Type-Options: nosniff
│ < MALFORMED_REQUEST: vpc_endpoints malformed parameters: VPC Endpoint 3b7d15eb-4b21-4487-8d04-d7e540efdb1f with use_case DATAPLANE_RELAY_ACCESS cannot be attached in rest_api list, VPC Endpoint 81a167ad-ebed-4fb0-8a1a-aaa0f3fb5a54 with use_case WORKSPACE_ACCESS cannot be attached in dataplane_relay list

Other Information

  • OS: macOS (tried from the terminal)
  • Version: 14.4.1

From Terraform init
Initializing provider plugins...

  • Finding hashicorp/aws versions matching ">= 3.28.0, >= 5.0.0"...
  • Finding latest version of hashicorp/external...
  • Finding latest version of hashicorp/null...
  • Finding latest version of hashicorp/time...
  • Finding databricks/databricks versions matching "~> 1.46.0"...
  • Installing hashicorp/time v0.12.0...
  • Installed hashicorp/time v0.12.0 (signed by HashiCorp)
  • Installing databricks/databricks v1.46.0...
  • Installed databricks/databricks v1.46.0 (self-signed, key ID 92A95A66446BCE3F)
  • Installing hashicorp/aws v5.59.0...
  • Installed hashicorp/aws v5.59.0 (signed by HashiCorp)
  • Installing hashicorp/external v2.3.3...
  • Installed hashicorp/external v2.3.3 (signed by HashiCorp)
  • Installing hashicorp/null v3.2.2...
  • Installed hashicorp/null v3.2.2 (signed by HashiCorp)


github-merge-queue bot pushed a commit that referenced this issue Aug 29, 2024
## Changes
Some errors returned by the platform are not serialized using JSON (see
#998 for an
example). They are instead serialized in the form "<ERROR_CODE>:
<MESSAGE>". Today, the SDK cannot parse these error messages well,
resulting in a poor user experience.

This PR adds support for parsing these error messages from the platform
to the SDK. This should reduce bug reports for the SDK with respect to
unexpected response parsing. This PR also refactors the error
deserialization logic somewhat to make it more extensible in the future
for other potential error formats that are not currently handled.
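
The parsing this adds can be sketched roughly as follows; the names and the regular expression here are illustrative assumptions, not the SDK's actual implementation:

```go
package main

import (
	"fmt"
	"regexp"
)

// plainTextError matches bodies of the form "<ERROR_CODE>: <MESSAGE>".
// Hypothetical helper for illustration only.
var plainTextError = regexp.MustCompile(`(?s)^([A-Z_]+): (.+)$`)

// parsePlainTextError extracts the error code and message from a
// non-JSON error body, reporting ok=false if the body does not match.
func parsePlainTextError(body string) (code, message string, ok bool) {
	m := plainTextError.FindStringSubmatch(body)
	if m == nil {
		return "", "", false
	}
	return m[1], m[2], true
}

func main() {
	code, msg, ok := parsePlainTextError("MALFORMED_REQUEST: vpc_endpoints malformed parameters")
	fmt.Println(ok, code, msg)
}
```

Treating this format as one parser among several is what makes the refactored deserialization logic extensible: unrecognized bodies simply fall through to the next parser, or to the "unexpected response" error.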

## Breaking Changes
This PR renames MakeUnexpectedError() to MakeUnexpectedResponse() in the
`apierr` package. It also changes the return type to string. This makes
the message easier to incorporate into error responses that depend only
on the string representation of the error, and it allows the message to
start with a capital letter, as it is a complete sentence.

The error message for failed deserialization of valid responses has
changed slightly, from `unexpected error handling request` to `failed to
unmarshal response body`. The rest of the message is identical.

## Tests
Refactored unit tests to a table-driven test case, and added four new
cases: one for error details (not previously covered), one for the
regular happy path, one for unexpected responses, and one for the new
error message format.

- [ ] `make test` passing
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

---------

Co-authored-by: Renaud Hartert <renaud.hartert@databricks.com>
github-merge-queue bot pushed a commit to databricks/databricks-sdk-py that referenced this issue Aug 30, 2024
## Changes
Some errors returned by the platform are not serialized using JSON (see
databricks/databricks-sdk-go#998 for an
example). They are instead serialized in the form "<ERROR_CODE>:
<MESSAGE>". Today, the SDK cannot parse these error messages well,
resulting in a poor user experience.

This PR adds support for parsing these error messages from the platform
to the SDK. This should reduce bug reports for the SDK with respect to
unexpected response parsing. This PR also refactors the error
deserialization logic somewhat to make it more extensible in the future
for other potential error formats that are not currently handled.

As a side effect of this change, I've refactored the structure of the
error handling in the Python SDK to more closely reflect how errors are
handled in the Go SDK. This should make maintenance more straightforward
in the future. It also introduces a new error message to the Python SDK
to refer users to our issue tracker when the SDK receives an error
response that it cannot parse, like what we do in the Go SDK.

Ports databricks/databricks-sdk-go#1031 to the
Python SDK.

## Deprecations
This PR deprecates several fields in the constructor for
DatabricksError. Going forward, SCIM-specific and API 1.2-specific
parameters should not be specified in the constructor; instead, they
will be handled in error parsers.

## Breaking Changes
The introduction of a different message for non-JSON responses may be a
breaking change if users matched on the message structure used before.

## Tests
Existing tests still pass; tests will be added before merging this.

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied