Better error message when private link enabled workspaces reject requests #290
Merged
Conversation
hectorcast-db approved these changes on May 17, 2024.
We should add some unit tests here too.
Added unit tests.
hectorcast-db added a commit that referenced this pull request on May 22, 2024:
### Improvements

* Better error message when private link enabled workspaces reject requests ([#290](#290)).

### API Changes

* Changed `list()` method for `workspaceClient.connections()` service to require request of `com.databricks.sdk.service.catalog.ListConnectionsRequest` class.
* Renamed `workspaceClient.lakehouseMonitors()` service to `workspaceClient.qualityMonitors()`.
* Renamed `com.databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest` class to `com.databricks.sdk.service.catalog.DeleteQualityMonitorRequest`.
* Changed `schemaName` field for `com.databricks.sdk.service.catalog.DisableRequest` to `String` class.
* Removed `com.databricks.sdk.service.catalog.DisableSchemaName` class.
* Changed `schemaName` field for `com.databricks.sdk.service.catalog.EnableRequest` to `String` class.
* Removed `com.databricks.sdk.service.catalog.EnableSchemaName` class.
* Renamed `com.databricks.sdk.service.catalog.GetLakehouseMonitorRequest` class to `com.databricks.sdk.service.catalog.GetQualityMonitorRequest`.
* Added `nextPageToken` field for `com.databricks.sdk.service.catalog.ListConnectionsResponse`.
* Added `dashboardId` field for `com.databricks.sdk.service.catalog.UpdateMonitor`.
* Added `com.databricks.sdk.service.catalog.ListConnectionsRequest` class.
* Added `com.databricks.sdk.service.catalog.MonitorRefreshListResponse` class.
* Changed `clusterStatus()` method for `workspaceClient.libraries()` service to return `com.databricks.sdk.service.compute.ClusterLibraryStatuses` class.
* Removed `clusterSource` field for `com.databricks.sdk.service.compute.ClusterAttributes`.
* Changed `spec` field for `com.databricks.sdk.service.compute.ClusterDetails` to `com.databricks.sdk.service.compute.ClusterSpec` class.
* Removed `cloneFrom` and `clusterSource` fields for `com.databricks.sdk.service.compute.ClusterSpec`.
* Removed `com.databricks.sdk.service.compute.ClusterStatusResponse` class.
* Removed `clusterSource` field for `com.databricks.sdk.service.compute.CreateCluster`.
* Removed `cloneFrom` and `clusterSource` fields for `com.databricks.sdk.service.compute.EditCluster`.
* Removed `sortBySpec` field for `com.databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `isAscending` field for `com.databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `sortBy` field for `com.databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `isAscending` field for `com.databricks.sdk.service.marketplace.SearchListingsRequest`.
* Removed `com.databricks.sdk.service.marketplace.SortBySpec` and `com.databricks.sdk.service.marketplace.SortOrder` classes.
* Added `gatewayDefinition` field for `com.databricks.sdk.service.pipelines.CreatePipeline`.
* Added `gatewayDefinition` field for `com.databricks.sdk.service.pipelines.EditPipeline`.
* Added `tableConfiguration` field for `com.databricks.sdk.service.pipelines.ManagedIngestionPipelineDefinition`.
* Added `gatewayDefinition` field for `com.databricks.sdk.service.pipelines.PipelineSpec`.
* Added `tableConfiguration` field for `com.databricks.sdk.service.pipelines.SchemaSpec`.
* Added `tableConfiguration` field for `com.databricks.sdk.service.pipelines.TableSpec`.
* Added `com.databricks.sdk.service.pipelines.IngestionGatewayPipelineDefinition` class.
* Added `com.databricks.sdk.service.pipelines.TableSpecificConfig` class.
* Added `com.databricks.sdk.service.pipelines.TableSpecificConfigScdType` class.
* Added `deploymentArtifacts` field for `com.databricks.sdk.service.serving.AppDeployment`.
* Added `contents` field for `com.databricks.sdk.service.serving.ExportMetricsResponse`.
* Changed `openaiApiKey` field for `com.databricks.sdk.service.serving.OpenAiConfig` to no longer be required.
* Added `microsoftEntraClientId`, `microsoftEntraClientSecret` and `microsoftEntraTenantId` fields for `com.databricks.sdk.service.serving.OpenAiConfig`.
* Added `com.databricks.sdk.service.serving.AppDeploymentArtifacts` class.
* Added `storageRoot` field for `com.databricks.sdk.service.sharing.CreateShare`.
* Added `storageLocation` and `storageRoot` fields for `com.databricks.sdk.service.sharing.ShareInfo`.
* Added `storageRoot` field for `com.databricks.sdk.service.sharing.UpdateShare`.
* Added `scanIndex()` method for `workspaceClient.vectorSearchIndexes()` service.
* Added `embeddingWritebackTable` field for `com.databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecRequest`.
* Added `embeddingWritebackTable` field for `com.databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecResponse`.
* Added `com.databricks.sdk.service.vectorsearch.ListValue`, `com.databricks.sdk.service.vectorsearch.MapStringValueEntry`, `com.databricks.sdk.service.vectorsearch.ScanVectorIndexRequest`, `com.databricks.sdk.service.vectorsearch.ScanVectorIndexResponse`, `com.databricks.sdk.service.vectorsearch.Struct` and `com.databricks.sdk.service.vectorsearch.Value` classes.

OpenAPI SHA: 7eb5ad9a2ed3e3f1055968a2d1014ac92c06fe92, Date: 2024-05-21
github-merge-queue bot pushed a commit that referenced this pull request on May 23, 2024.
github-merge-queue bot pushed a commit that referenced this pull request on Jul 9, 2024:
…ient` (#305)

## Changes

This PR addresses a regression reportedly introduced in PR #290 when connecting via a proxy that requires authentication (see context below). The problem comes from a casting exception in `CommonHttpClient`, which attempts to cast a `BasicHttpRequest` (representing the follow-up request sent after receiving the authentication-required response) to `HttpUriRequest`. This PR fixes the casting issue by processing the common ancestor of both classes, `HttpRequest`.

### Context

The regression was detected when authenticating via a proxy that requires authentication. In particular, the request that responds to the proxy's authentication challenge is created automatically as a `BasicHttpRequest` by the Apache `HttpClient`.

## Tests

Unit tests and integration tests are passing. The fix was also verified by the user who uncovered the issue.

Note: we should add a regression test that simulates the proxy setting. However, I'd like to take the time to experiment with different configurations. Given that the change is relatively simple, I'd recommend proceeding with this PR and adding the test in a follow-up PR.
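The type relationship behind the fix can be illustrated with a small self-contained sketch. The interfaces below are simplified stand-ins for the Apache HttpClient types of the same names, not the real classes: `HttpUriRequest` extends `HttpRequest`, while `BasicHttpRequest` implements only `HttpRequest`, so a downcast to `HttpUriRequest` fails at runtime, whereas a method that accepts the common supertype handles both.

```java
// Simplified stand-ins for the Apache HttpClient type hierarchy
// (illustrative only, not the real library classes).
interface HttpRequest {
    String getMethod();
}

interface HttpUriRequest extends HttpRequest {
    java.net.URI getUri();
}

// Proxy-authentication retries are built as BasicHttpRequest, which
// implements HttpRequest but NOT HttpUriRequest.
class BasicHttpRequest implements HttpRequest {
    public String getMethod() { return "GET"; }
}

public class CastSketch {
    // Old approach: assumes every request is an HttpUriRequest.
    static String describeByCast(Object request) {
        try {
            return "uri request: " + ((HttpUriRequest) request).getMethod();
        } catch (ClassCastException e) {
            return "ClassCastException";
        }
    }

    // Fixed approach: only requires the common supertype HttpRequest.
    static String describeBySupertype(HttpRequest request) {
        return "request: " + request.getMethod();
    }

    public static void main(String[] args) {
        HttpRequest proxyRetry = new BasicHttpRequest();
        System.out.println(describeByCast(proxyRetry));      // ClassCastException
        System.out.println(describeBySupertype(proxyRetry)); // request: GET
    }
}
```

Accepting `HttpRequest` rather than casting is the same design choice the PR makes: depend on the narrowest interface that provides what the code actually needs.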
Changes
Port of databricks/databricks-sdk-go#924 to the Java SDK.
When a user tries to access a Private Link-enabled workspace configured with no public internet access from a network other than the one the VPC endpoint belongs to, the Private Link backend redirects the user to the login page rather than outright rejecting the request. The login page, however, is not a JSON document and cannot be parsed by the SDK, so the user sees an opaque parsing error instead of a useful message.
To address this, I add an additional check in the error-mapper logic to detect whether the user was redirected to the login page with the private link validation error response code. If so, we return a synthetic error with error code `PRIVATE_LINK_VALIDATION_ERROR` that inherits from `PermissionDenied` and carries a synthetic 403 status code. After this change, users see a descriptive error message instead. The message is tuned to the specific cloud, inferred from the request URI, so that we can point users to the appropriate documentation.
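A minimal sketch of the kind of check described above, using only `java.net.URI`. The method names, the `/login.html` path, and the `error=private-link-validation-error` query marker are assumptions for illustration and may differ from the SDK's actual implementation; the host-suffix cloud inference mirrors how Azure and GCP workspace hosts are conventionally distinguished from AWS ones.

```java
import java.net.URI;

public class PrivateLinkCheckSketch {

    // Hypothetical check: was the request redirected to the login page
    // with the private link validation error marker in the query string?
    static boolean isPrivateLinkRedirect(URI location) {
        String query = location.getQuery();
        return "/login.html".equals(location.getPath())
                && query != null
                && query.contains("error=private-link-validation-error");
    }

    // Infer the cloud from the workspace host so the error message can
    // link to cloud-specific Private Link documentation.
    static String inferCloud(URI requestUri) {
        String host = requestUri.getHost();
        if (host.endsWith(".azuredatabricks.net")) {
            return "Azure";
        } else if (host.endsWith(".gcp.databricks.com")) {
            return "GCP";
        }
        return "AWS";
    }

    public static void main(String[] args) {
        URI redirect = URI.create(
            "https://adb-123.11.azuredatabricks.net/login.html?error=private-link-validation-error");
        if (isPrivateLinkRedirect(redirect)) {
            // In the SDK this would become a synthetic PermissionDenied error
            // with error code PRIVATE_LINK_VALIDATION_ERROR and status 403.
            System.out.println("PRIVATE_LINK_VALIDATION_ERROR on " + inferCloud(redirect));
        }
    }
}
```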
Tests
Unit tests cover the private link error message mapping. To manually test this, I created a Private Link workspace in Azure, created an access token, restricted access to the workspace, and then ran the `ListClustersExample` example using that host and token.