[PECO-1542] Add support for proxy #242

Closed · wants to merge 37 commits

Conversation

@vikrantpuppala (Contributor) commented Feb 27, 2024

Changes

This PR adds a way to provide proxy configuration in the SDK directly, the ability to pick up proxy settings from system properties, and proxy authentication via the basic or negotiate (Kerberos) schemes.
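For illustration, a rough sketch of how these options might be wired together. The proxy-related setter names below are assumptions based on the fields introduced in this PR (host, port, username, password, auth type, `useSystemProperties`), not the final public API:

```java
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.core.DatabricksConfig;

// Hypothetical configuration; every setProxy* call below is illustrative only.
DatabricksConfig config = new DatabricksConfig()
    .setHost("https://my-workspace.cloud.databricks.com")
    .setProxyHost("proxy.example.com")       // assumed setter
    .setProxyPort(3128)                      // assumed setter
    .setProxyUsername("proxy-user")          // assumed setter
    .setProxyPassword("proxy-pass")          // assumed setter
    .setProxyAuthType(ProxyAuthType.BASIC)   // assumed enum; negotiate (Kerberos) is the other scheme
    .setUseSystemPropertiesHttp(false);      // assumed flag mirroring the useSystemProperties field

WorkspaceClient w = new WorkspaceClient(config);
```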

Tests

These changes were tested in an environment consisting of a proxy server with Kerberos authentication hosted on Ubuntu and Windows VMs. The testing details are documented here: https://docs.google.com/document/d/1Zxlfx-R_JytFqMZfQMBfSQ8mj4W-aHa60vw4KdjUkEc/edit

@vikrantpuppala (Contributor, Author) commented:

@gopalldb it seems like there is some parallel work on proxy support: 7788864

@mgyucht (Contributor) commented Feb 27, 2024

@vikrantpuppala @gopalldb can you see if the linked commit works for you? We may want to consider a standard set of environment variables as well, but different languages seem to have different standards for propagating proxy information. If system properties are sufficient, I would like to stick with that for now.

@vikrantpuppala (Contributor, Author) commented:

@mgyucht I think system properties should be sufficient whenever we want to inherit the system proxy. But we also need a way to provide proxy properties directly (I added those via DatabricksConfig for now in this PR to quickly test a POC).

@mgyucht (Contributor) commented Feb 27, 2024

Which properties do you need to set? I think you can set proxy username/password with https.proxyUser/https.proxyPassword, according to SystemDefaultCredentialsProvider. See https://stackoverflow.com/questions/1626549/authenticated-http-proxy-with-java.
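For reference, a minimal sketch of the system-property route discussed here, assuming the SDK builds its Apache HttpClient with `useSystemProperties()`; the property names come from the Oracle networking docs and `SystemDefaultCredentialsProvider`:

```java
import com.databricks.sdk.WorkspaceClient;

// JVM-wide proxy settings; these are only honored when the underlying
// Apache HttpClient opts in via HttpClientBuilder.useSystemProperties().
System.setProperty("https.proxyHost", "proxy.example.com");
System.setProperty("https.proxyPort", "3128");
System.setProperty("https.proxyUser", "proxy-user");      // read by SystemDefaultCredentialsProvider
System.setProperty("https.proxyPassword", "proxy-pass");

WorkspaceClient w = new WorkspaceClient(); // auth still resolved from the environment as usual
```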

@vikrantpuppala (Contributor, Author) commented:

In certain cases, customers would not have a system-wide proxy but would have a proxy only for the JDBC driver (or different proxies across different applications). Hence, we support providing the proxy host/port/credentials directly to the driver, which we then pass on to the HTTP client.

@gopalldb (Contributor) commented:

@mgyucht to add some context, we have customers who would like to have different sets of proxy configurations for the API endpoint and cloud fetch URLs while using the same JDBC driver. Thus a global configuration may not work for all, and we may need to support custom configuration (maybe through DatabricksConfig).

@vikrantpuppala changed the title from "[WIP] Add support for proxy" to "[PECO-1542] Add support for proxy" on Mar 26, 2024
@gopalldb (Contributor) left a comment

LGTM, can we add tests?

private String username;
private String password;
private ProxyAuthType proxyAuthType;
private boolean useSystemProperties;
Review comment (Contributor):

Should this default to true? If users want to use system properties, it seems strange that they would need to set it explicitly.

If we're introducing this, we should also use it to control whether .useSystemProperties() is called on line 75.
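A minimal sketch of that wiring under the assumption that the flag is plumbed through to the Commons HTTP client builder (illustrative, not the PR's exact code):

```java
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;

HttpClientBuilder builder = HttpClientBuilder.create();
if (useSystemProperties) {        // the new boolean flag under discussion
  // Honors https.proxyHost, https.proxyPort, https.proxyUser, https.proxyPassword, etc.
  builder.useSystemProperties();
}
CloseableHttpClient client = builder.build();
```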

The PR author (@vikrantpuppala) replied:

Not sure about this one. I felt my app should be independent of system proxy settings unless I choose that explicitly, but I understand the other default could also feel natural.

@mgyucht (Contributor) left a comment

Mostly LGTM, a few smaller remarks.

vikrantpuppala and others added 11 commits April 23, 2024 13:37
## Changes
This PR sets `useSystemProperties()` when building the Commons HTTP
client. This allows users to configure an HTTP proxy by setting the
`https.proxyHost` and `https.proxyPort` parameters (see
https://docs.oracle.com/javase/6/docs/technotes/guides/net/proxies.html).

Closes databricks#111.

## Tests
Added an example. Started the HTTP proxy from
databricks/databricks-sdk-go#825, created a run
configuration for the new example that sets `https.proxyHost` to
`localhost` and `https.proxyPort` to `8443`, and ran it. The proxy server handled the request and the example completed successfully.
## Changes
Regenerating the SDK against the latest OpenAPI specification

## Tests
Unit tests, nightly tests will run before release
## 0.20.0

### Features and Improvements
* Added basic support for HTTP proxies
([databricks#241](databricks#241)).
* Fixed getWorkspaceClient() for GCP
([databricks#224](databricks#224)).
* Note: Backwards-incompatible change: Settings are now nested; please
see the API changes below.

### Internal Changes
* Reading headers should be done in a case-insensitive manner
([databricks#235](databricks#235)).
* Added integration tests for the Files API
([databricks#236](databricks#236)).
* Supported subservices
([databricks#237](databricks#237)).
* Handled empty types in the Java SDK
([databricks#239](databricks#239)).
* Added tokei.rs lines of code badge
([databricks#243](databricks#243)).
* Updated SDK to latest OpenAPI spec
([databricks#245](databricks#245)).

### API Changes:
 * Added the following services:
    - `workspaceClient.permissionMigration()`
    - `workspaceClient.automaticClusterUpdate()`
    - `workspaceClient.cspEnablement()`
    - `accountClient.cspEnablementAccount()`
    - `workspaceClient.defaultNamespace()`
    - `workspaceClient.esmEnablement()`
    - `accountClient.esmEnablementAccount()`
    - `accountClient.personalCompute()`
    - `workspaceClient.restrictWorkspaceAdmins()`
 * Added the following classes:
    - `com.databricks.sdk.service.iam.PermissionMigrationRequest`
    - `com.databricks.sdk.service.iam.PermissionMigrationResponse`
    - `com.databricks.sdk.service.settings.AutomaticClusterUpdateSetting`
    - `com.databricks.sdk.service.settings.ClusterAutoRestartMessage`
    - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageEnablementDetails`
    - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindow`
    - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowDayOfWeek`
    - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWeekDayBasedSchedule`
    - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWeekDayFrequency`
    - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWindowStartTime`
    - `com.databricks.sdk.service.settings.ComplianceStandard`
    - `com.databricks.sdk.service.settings.CspEnablement`
    - `com.databricks.sdk.service.settings.CspEnablementAccount`
    - `com.databricks.sdk.service.settings.CspEnablementAccountSetting`
    - `com.databricks.sdk.service.settings.CspEnablementSetting`
    - `com.databricks.sdk.service.settings.DeleteDefaultNamespaceRequest`
    - `com.databricks.sdk.service.settings.DeletePersonalComputeRequest`
    - `com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminRequest`
    - `com.databricks.sdk.service.settings.EsmEnablement`
    - `com.databricks.sdk.service.settings.EsmEnablementAccount`
    - `com.databricks.sdk.service.settings.EsmEnablementAccountSetting`
    - `com.databricks.sdk.service.settings.EsmEnablementSetting`
    - `com.databricks.sdk.service.settings.GetAutomaticClusterUpdateRequest`
    - `com.databricks.sdk.service.settings.GetCspEnablementAccountRequest`
    - `com.databricks.sdk.service.settings.GetCspEnablementRequest`
    - `com.databricks.sdk.service.settings.GetDefaultNamespaceRequest`
    - `com.databricks.sdk.service.settings.GetEsmEnablementAccountRequest`
    - `com.databricks.sdk.service.settings.GetEsmEnablementRequest`
    - `com.databricks.sdk.service.settings.GetPersonalComputeRequest`
    - `com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminRequest`
    - `com.databricks.sdk.service.settings.NccAwsStableIpRule`
    - `com.databricks.sdk.service.settings.UpdateAutomaticClusterUpdateSettingRequest`
    - `com.databricks.sdk.service.settings.UpdateCspEnablementAccountSettingRequest`
    - `com.databricks.sdk.service.settings.UpdateCspEnablementSettingRequest`
    - `com.databricks.sdk.service.settings.UpdateEsmEnablementAccountSettingRequest`
    - `com.databricks.sdk.service.settings.UpdateEsmEnablementSettingRequest`
 * Removed the following classes:
    - `com.databricks.sdk.service.settings.DeleteDefaultNamespaceSettingRequest`
    - `com.databricks.sdk.service.settings.DeletePersonalComputeSettingRequest`
    - `com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminsSettingRequest`
    - `com.databricks.sdk.service.settings.GetDefaultNamespaceSettingRequest`
    - `com.databricks.sdk.service.settings.GetPersonalComputeSettingRequest`
    - `com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminsSettingRequest`
* Changed `version` field for
`com.databricks.sdk.service.serving.AppManifest` to
`com.databricks.sdk.service.serving.AnyValue` class.
* Removed `deletePersonalComputeSetting()`,
`getPersonalComputeSetting()` and `updatePersonalComputeSetting()`
method for `accountClient.settings()` service.
* Removed `deleteDefaultNamespaceSetting()`,
`deleteRestrictWorkspaceAdminsSetting()`,
`getDefaultNamespaceSetting()`, `getRestrictWorkspaceAdminsSetting()`,
`updateDefaultNamespaceSetting()` and
`updateRestrictWorkspaceAdminsSetting()` method for
`workspaceClient.settings()` service.
* Added `awsStableIpRule` field for
`com.databricks.sdk.service.settings.NccEgressDefaultRules`.
* Added `indexName` field for
`com.databricks.sdk.service.vectorsearch.DeleteDataVectorIndexRequest`.
* Added `embeddingModelEndpointName` field for
`com.databricks.sdk.service.vectorsearch.EmbeddingSourceColumn`.
* Added `indexName` field for
`com.databricks.sdk.service.vectorsearch.UpsertDataVectorIndexRequest`.
* Added `deltaSyncIndexSpec` field for
`com.databricks.sdk.service.vectorsearch.VectorIndex`.
* Added `directAccessIndexSpec` field for
`com.databricks.sdk.service.vectorsearch.VectorIndex`.
* Changed `deleteEndpoint()`, `createIndex()`, `deleteDataVectorIndex()`
and `upsertDataVectorIndex()` method for
`workspaceClient.vectorSearchEndpoints()` service with new required
argument order.
* Changed `endpointName` field for
`com.databricks.sdk.service.vectorsearch.CreateVectorIndexRequest` to be
required.
* Removed `planningPhases` field for
`com.databricks.sdk.service.sql.QueryMetrics`.
* Removed `name` field for
`com.databricks.sdk.service.vectorsearch.DeleteDataVectorIndexRequest`.
* Removed `name` field for
`com.databricks.sdk.service.vectorsearch.DeleteEndpointRequest`.
* Removed `com.databricks.sdk.service.vectorsearch.EmbeddingConfig`
class.
* Removed `embeddingConfig` field for
`com.databricks.sdk.service.vectorsearch.EmbeddingSourceColumn`.
* Removed `name` field for
`com.databricks.sdk.service.vectorsearch.UpsertDataVectorIndexRequest`.
* Removed `deltaSyncVectorIndexSpec` field for
`com.databricks.sdk.service.vectorsearch.VectorIndex`.
* Removed `directAccessVectorIndexSpec` field for
`com.databricks.sdk.service.vectorsearch.VectorIndex`.

OpenAPI SHA: d855b30f25a06fe84f25214efa20e7f1fffcdf9e, Date: 2024-03-04
API Changes:

* Changed `list()` method for `workspaceClient.catalogs()` service to
require request of
`com.databricks.sdk.service.catalog.ListCatalogsRequest` class.
* Changed `create()` method for `workspaceClient.onlineTables()` service. New request type is `com.databricks.sdk.service.catalog.CreateOnlineTableRequest` class.
 * Removed `com.databricks.sdk.service.catalog.AwsIamRole` class.
* Changed `notifications` field for
`com.databricks.sdk.service.catalog.CreateMonitor` to
`com.databricks.sdk.service.catalog.MonitorNotificationsConfig` class.
* Changed `awsIamRole` field for
`com.databricks.sdk.service.catalog.CreateStorageCredential` to
`com.databricks.sdk.service.catalog.AwsIamRoleRequest` class.
* Added `browseOnly` field for
`com.databricks.sdk.service.catalog.ExternalLocationInfo`.
* Added `browseOnly` field for
`com.databricks.sdk.service.catalog.FunctionInfo`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.GetCatalogRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.GetExternalLocationRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.GetFunctionRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.GetModelVersionRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.GetRegisteredModelRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.GetSchemaRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.GetTableRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.ListExternalLocationsRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.ListFunctionsRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.ListModelVersionsRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.ListRegisteredModelsRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.ListSchemasRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.ListTablesRequest`.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.ListVolumesRequest`.
* Added `browseOnly` field for
`com.databricks.sdk.service.catalog.ModelVersionInfo`.
* Changed `notifications` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to
`com.databricks.sdk.service.catalog.MonitorNotificationsConfig` class.
* Added `includeBrowse` field for
`com.databricks.sdk.service.catalog.ReadVolumeRequest`.
* Added `browseOnly` field for
`com.databricks.sdk.service.catalog.RegisteredModelInfo`.
* Added `browseOnly` field for
`com.databricks.sdk.service.catalog.SchemaInfo`.
* Changed `awsIamRole` field for
`com.databricks.sdk.service.catalog.StorageCredentialInfo` to
`com.databricks.sdk.service.catalog.AwsIamRoleResponse` class.
* Added `browseOnly` field for
`com.databricks.sdk.service.catalog.TableInfo`.
* Changed `notifications` field for
`com.databricks.sdk.service.catalog.UpdateMonitor` to
`com.databricks.sdk.service.catalog.MonitorNotificationsConfig` class.
* Changed `awsIamRole` field for
`com.databricks.sdk.service.catalog.UpdateStorageCredential` to
`com.databricks.sdk.service.catalog.AwsIamRoleRequest` class.
* Changed `awsIamRole` field for
`com.databricks.sdk.service.catalog.ValidateStorageCredential` to
`com.databricks.sdk.service.catalog.AwsIamRoleRequest` class.
 * Removed `com.databricks.sdk.service.catalog.ViewData` class.
* Added `browseOnly` field for
`com.databricks.sdk.service.catalog.VolumeInfo`.
 * Added `com.databricks.sdk.service.catalog.AwsIamRoleRequest` class.
 * Added `com.databricks.sdk.service.catalog.AwsIamRoleResponse` class.
* Added `com.databricks.sdk.service.catalog.CreateOnlineTableRequest`
class.
 * Added `com.databricks.sdk.service.catalog.ListCatalogsRequest` class.
* Changed `publish()` method for `workspaceClient.lakeview()` service to
return `com.databricks.sdk.service.dashboards.PublishedDashboard` class.
 * Added `create()` method for `workspaceClient.lakeview()` service.
 * Added `get()` method for `workspaceClient.lakeview()` service.
* Added `getPublished()` method for `workspaceClient.lakeview()`
service.
 * Added `trash()` method for `workspaceClient.lakeview()` service.
 * Added `update()` method for `workspaceClient.lakeview()` service.
 * Removed `Object` class.
* Added `com.databricks.sdk.service.dashboards.CreateDashboardRequest`
class.
 * Added `com.databricks.sdk.service.dashboards.Dashboard` class.
* Added `com.databricks.sdk.service.dashboards.GetDashboardRequest`
class.
* Added
`com.databricks.sdk.service.dashboards.GetPublishedDashboardRequest`
class.
 * Added `com.databricks.sdk.service.dashboards.LifecycleState` class.
* Added `com.databricks.sdk.service.dashboards.PublishedDashboard`
class.
* Added `com.databricks.sdk.service.dashboards.TrashDashboardRequest`
class.
 * Added `Object` class.
* Added `com.databricks.sdk.service.dashboards.UpdateDashboardRequest`
class.
* Added `autoCaptureConfig` field for
`com.databricks.sdk.service.serving.EndpointPendingConfig`.
* Changed `get()` method for `workspaceClient.automaticClusterUpdate()` service. New request type is `com.databricks.sdk.service.settings.GetAutomaticClusterUpdateSettingRequest` class.
* Changed `get()` method for `workspaceClient.cspEnablement()` service. New request type is `com.databricks.sdk.service.settings.GetCspEnablementSettingRequest` class.
* Changed `get()` method for `accountClient.cspEnablementAccount()` service. New request type is `com.databricks.sdk.service.settings.GetCspEnablementAccountSettingRequest` class.
* Changed `delete()` method for `workspaceClient.defaultNamespace()` service. New request type is `com.databricks.sdk.service.settings.DeleteDefaultNamespaceSettingRequest` class.
* Changed `get()` method for `workspaceClient.defaultNamespace()` service. New request type is `com.databricks.sdk.service.settings.GetDefaultNamespaceSettingRequest` class.
* Changed `get()` method for `workspaceClient.esmEnablement()` service. New request type is `com.databricks.sdk.service.settings.GetEsmEnablementSettingRequest` class.
* Changed `get()` method for `accountClient.esmEnablementAccount()` service. New request type is `com.databricks.sdk.service.settings.GetEsmEnablementAccountSettingRequest` class.
* Changed `get()` method for `workspaceClient.ipAccessLists()` service. New request type is `com.databricks.sdk.service.settings.GetIpAccessList` class.
* Changed `delete()` method for `accountClient.personalCompute()` service. New request type is `com.databricks.sdk.service.settings.DeletePersonalComputeSettingRequest` class.
* Changed `get()` method for `accountClient.personalCompute()` service. New request type is `com.databricks.sdk.service.settings.GetPersonalComputeSettingRequest` class.
* Changed `delete()` method for `workspaceClient.restrictWorkspaceAdmins()` service. New request type is `com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminsSettingRequest` class.
* Changed `get()` method for `workspaceClient.restrictWorkspaceAdmins()` service. New request type is `com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminsSettingRequest` class.
* Removed
`com.databricks.sdk.service.settings.DeleteDefaultNamespaceRequest`
class.
* Removed
`com.databricks.sdk.service.settings.DeletePersonalComputeRequest`
class.
* Removed
`com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminRequest`
class.
* Removed
`com.databricks.sdk.service.settings.GetAutomaticClusterUpdateRequest`
class.
* Removed
`com.databricks.sdk.service.settings.GetCspEnablementAccountRequest`
class.
* Removed `com.databricks.sdk.service.settings.GetCspEnablementRequest`
class.
* Removed
`com.databricks.sdk.service.settings.GetDefaultNamespaceRequest` class.
* Removed
`com.databricks.sdk.service.settings.GetEsmEnablementAccountRequest`
class.
* Removed `com.databricks.sdk.service.settings.GetEsmEnablementRequest`
class.
* Removed `com.databricks.sdk.service.settings.GetIpAccessListRequest`
class.
* Removed
`com.databricks.sdk.service.settings.GetPersonalComputeRequest` class.
* Removed
`com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminRequest`
class.
* Added
`com.databricks.sdk.service.settings.DeleteDefaultNamespaceSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.DeletePersonalComputeSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminsSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.GetAutomaticClusterUpdateSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.GetCspEnablementAccountSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.GetCspEnablementSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.GetDefaultNamespaceSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.GetEsmEnablementAccountSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.GetEsmEnablementSettingRequest`
class.
 * Added `com.databricks.sdk.service.settings.GetIpAccessList` class.
* Added
`com.databricks.sdk.service.settings.GetPersonalComputeSettingRequest`
class.
* Added
`com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminsSettingRequest`
class.
* Changed `dataObjectType` field for
`com.databricks.sdk.service.sharing.SharedDataObject` to
`com.databricks.sdk.service.sharing.SharedDataObjectDataObjectType`
class.
* Added `content` field for
`com.databricks.sdk.service.sharing.SharedDataObject`.
* Added
`com.databricks.sdk.service.sharing.SharedDataObjectDataObjectType`
class.
* Added `embeddingSourceColumns` field for
`com.databricks.sdk.service.vectorsearch.DirectAccessVectorIndexSpec`.
* Added `scoreThreshold` field for
`com.databricks.sdk.service.vectorsearch.QueryVectorIndexRequest`.

OpenAPI SHA: 93763b0d7ae908520c229c786fff28b8fd623261, Date: 2024-03-20
vikrantpuppala and others added 26 commits April 23, 2024 13:38
## Changes
Ports databricks/databricks-sdk-go#869 to the Java SDK.

Currently, path parameters are directly interpolated into the request
URL without escaping. This means that characters like `/`, `?` and `#`
will not be percent-encoded and will affect the semantics of the URL,
starting a new path segment, query parameters, or fragment,
respectively. This means that it is impossible for users of the Files
API to upload/download objects that contain `?` or `#` in their name.
`/` is allowed in the path of the Files API, so it does not need to be
escaped.

The Files API is currently marked with `x-databricks-multi-segment`,
indicating that it should be permitted to have `/` characters but other
characters need to be percent-encoded. This PR implements this.
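For illustration, a minimal sketch of the escaping rule described above (not the SDK's actual helper): percent-encode each segment of a multi-segment path parameter while keeping the `/` separators intact.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.stream.Collectors;

// Hypothetical helper: encode every path segment, but keep '/' as a separator.
static String encodeMultiSegmentPath(String path) {
  return Arrays.stream(path.split("/", -1))
      .map(seg -> URLEncoder.encode(seg, StandardCharsets.UTF_8).replace("+", "%20"))
      .collect(Collectors.joining("/"));
}

// encodeMultiSegmentPath("/Volumes/main/default/vol/report #1?.csv")
// -> "/Volumes/main/default/vol/report%20%231%3F.csv"
```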

## Tests
- [x] Unit test for multi-segment path escaping behavior.
- [x] Updated integration test to use # and ? symbols in the file name.
Improvements and Bug Fixes
* Properly escape multi-segment path parameters
([databricks#252](databricks#252)).

API Changes:

* Added `Migrate` and `Unpublish` methods for
[w.Lakeview](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#LakeviewAPI)
workspace-level service.
* Added
[dashboards.MigrateDashboardRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#MigrateDashboardRequest).
* Added
[dashboards.UnpublishDashboardRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#UnpublishDashboardRequest).
* Added `Description`, `QueueDuration` and `RepairHistory` fields for
[jobs.BaseRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun).
* Added `ComputeKey` and `JobClusterKey` fields for
[jobs.ClusterSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ClusterSpec).
* Changed `Left`, `Op` and `Right` fields for
[jobs.ConditionTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ConditionTask)
to be required.
* Changed `EditMode` field for
[jobs.CreateJob](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJob)
to
[jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Replaced
[jobs.CreateJobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJobEditMode)
to
[jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Changed `Url` field for
[jobs.FileArrivalTriggerConfiguration](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#FileArrivalTriggerConfiguration)
to be required.
* Changed `ErrorMessageStats` field for
[jobs.ForEachStats](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ForEachStats)
to
[jobs.ForEachTaskErrorMessageStatsList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ForEachTaskErrorMessageStatsList).
* Changed `NewCluster` field for
[jobs.JobCluster](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobCluster)
to be required.
* Changed `EditMode` field for
[jobs.JobSettings](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettings)
to
[jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Replaced
[jobs.JobSettingsEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettingsEditMode)
by
[jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Changed `Metric`, `Op` and `Value` fields for
[jobs.JobsHealthRule](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobsHealthRule)
to be required.
* Changed `RunType` field for
[jobs.ListRunsRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListRunsRequest)
to
[jobs.RunType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunType).
* Removed
[jobs.ListRunsRunType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListRunsRunType).
* Removed
[jobs.ParamPairs](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ParamPairs).
* Changed `PipelineId` field for
[jobs.PipelineTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PipelineTask)
to be required.
* Changed `EntryPoint` and `PackageName` fields for
[jobs.PythonWheelTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PythonWheelTask)
to be required.
* Changed `JobParameters` field for
[jobs.RepairRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RepairRun)
to map[string]`string`.
* Changed `BaseParameters` field for
[jobs.ResolvedNotebookTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedNotebookTaskValues)
to map[string]`string`.
* Changed `Parameters` field for
[jobs.ResolvedParamPairValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedParamPairValues)
to map[string]`string`.
* Changed `NamedParameters` field for
[jobs.ResolvedPythonWheelTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedPythonWheelTaskValues)
to map[string]`string`.
* Removed `NamedParameters` field for
[jobs.ResolvedRunJobTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedRunJobTaskValues).
* Changed `Parameters` field for
[jobs.ResolvedRunJobTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedRunJobTaskValues)
to map[string]`string`.
* Added `JobParameters` field for
[jobs.ResolvedRunJobTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedRunJobTaskValues).
* Added `Description` field for
[jobs.Run](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run).
* Added `QueueDuration` field for
[jobs.Run](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run).
* Changed `Op` field for
[jobs.RunConditionTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunConditionTask)
to
[jobs.ConditionTaskOp](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ConditionTaskOp).
* Removed
[jobs.RunConditionTaskOp](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunConditionTaskOp).
* Changed `Inputs` and `Task` fields for
[jobs.RunForEachTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunForEachTask)
to be required.
* Changed `JobParameters` field for
[jobs.RunJobTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunJobTask)
to map[string]`string`.
* Added `DbtCommands`, `JarParams`, `NotebookParams`, `PipelineParams`,
`PythonNamedParams`, `PythonParams`, `SparkSubmitParams` and `SqlParams`
fields for
[jobs.RunJobTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunJobTask).
* Changed `JobParameters` field for
[jobs.RunNow](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunNow)
to map[string]`string`.
* Added `Info` field for
[jobs.RunOutput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunOutput).
* Removed `JobParameters` field for
[jobs.RunParameters](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunParameters).
* Changed `TaskKey` field for
[jobs.RunTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask)
to be required.
* Added `ComputeKey`, `EmailNotifications`, `JobClusterKey`,
`NotificationSettings`, `RunDuration`, `RunPageUrl`, `TimeoutSeconds` and
`WebhookNotifications` fields for
[jobs.RunTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask).
* Added `EndpointId` field for
[jobs.SqlQueryOutput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SqlQueryOutput).
* Added `ConditionTask` field for
[jobs.SubmitRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun).
* Added `DbtCommands`, `JarParams`, `NotebookParams`, `PipelineParams`,
`PythonNamedParams`, `PythonParams`, `SparkSubmitParams` and `SqlParams`
field for
[jobs.SubmitRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun).
* Added `Description` field for
[jobs.SubmitTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitTask).
* Added `DisableAutoOptimization` field for
[jobs.Task](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Task).
* Added `NoAlertForSkippedRuns` field for
[jobs.TaskEmailNotifications](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TaskEmailNotifications).
* Added `TableUpdate` field for
[jobs.TriggerSettings](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TriggerSettings).
* Changed `Id` field for
[jobs.Webhook](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Webhook)
to be required.
* Changed `OnDurationWarningThresholdExceeded` field for
[jobs.WebhookNotifications](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookNotifications)
to
[jobs.WebhookList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookList).
* Removed
[jobs.WebhookNotificationsOnDurationWarningThresholdExceededItem](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookNotificationsOnDurationWarningThresholdExceededItem).
* Added
[jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Removed
[serving.AwsBedrockConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AwsBedrockConfig).
* Removed
[serving.AwsBedrockConfigBedrockProvider](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AwsBedrockConfigBedrockProvider).
* Removed `AwsBedrockConfig` field for
[serving.ExternalModel](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel).
* Added `AmazonBedrockConfig` field for
[serving.ExternalModel](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel).
* Added
[serving.AmazonBedrockConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AmazonBedrockConfig).
* Added
[serving.AmazonBedrockConfigBedrockProvider](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AmazonBedrockConfigBedrockProvider).
* Changed `Get` method for
[w.IpAccessLists](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#IpAccessListsAPI)
workspace-level service. New request type is
[settings.GetIpAccessListRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#GetIpAccessListRequest).
* Renamed
[settings.GetIpAccessList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#GetIpAccessList)
to
[settings.GetIpAccessListRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#GetIpAccessListRequest).

OpenAPI SHA: d38528c3e47dd81c9bdbd918272a3e49d36e09ce, Date: 2024-03-27
Adjusts the field accessibility changes so they are protected behind a lock. Epic was running into this when we concurrently created WorkspaceClients, and they would often read properties that another thread had reset to inaccessible.

## Changes
Synchronize on the `field` object before we mutate and read it, to ensure accessibility is maintained throughout the code block.
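A simplified sketch of that idea, assuming the same cached `Field` instance is shared across threads (illustrative, not the PR's exact code):

```java
import java.lang.reflect.Field;

// Use the Field itself as the monitor so one thread cannot reset accessibility
// while another is in the middle of mutate -> read -> restore.
static Object readField(Field field, Object target) throws IllegalAccessException {
  synchronized (field) {
    boolean wasAccessible = field.isAccessible();
    field.setAccessible(true);
    try {
      return field.get(target);
    } finally {
      field.setAccessible(wasAccessible);
    }
  }
}
```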

## Tests
No new tests. Nothing new added so existing tests should ensure logic is
correct for config reading. Testing multithreaded races in this case
would prove difficult.
… `BadRequest`, `PermissionDenied`, `InternalError`, and others (databricks#185)

See implementations in other SDKs:

- Go: databricks/databricks-sdk-go#682
- Python: databricks/databricks-sdk-py#376

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
Co-authored-by: Tanmay Rustagi <tanmay.rustagi@databricks.com>
## Changes
Fix Changelog

## Tests
N/A
## Changes
The upcoming Marketplace API introduces an API to create an analytics
dashboard. This creation request is kind of a singleton, as it has no
parameters. To support this, we introduce a `POST()` method with no body
specified so that this resource can be created.

## Tests
…databricks#257)

## Changes
Ports databricks/databricks-sdk-go#864 to the
Java SDK.
Most services use `RESOURCE_DOES_NOT_EXIST` error code with 404 status
code to indicate that a resource doesn't exist. However, for legacy
reasons, Jobs and Clusters services use `INVALID_PARAMETER_VALUE` error
code with 400 status code instead. This makes tools like Terraform and
UCX difficult to maintain, as these services need different error
handling logic. However, we can't change these behaviors as customers
already depend on the raw HTTP response status & contents.

This PR corrects these errors in the SDK itself. SDK users can then do
```java
try {
  Job job = w.jobs().get(123);
} catch (ResourceDoesNotExist e) {
  ...
}
```
just as they would expect from other resources.

Updated the README with more information about this as well.

## Tests
Added unit tests for error overrides.
Added/updated the integration tests for Clusters and Jobs.

- [x] `make test` passing
- [x] `make fmt` applied
- [x] relevant integration tests applied
### Improvements and Bug Fixes
* Introduce more specific exceptions, like `NotFound`, `AlreadyExists`,
`BadRequest`, `PermissionDenied`, `InternalError`, and others
([databricks#185](databricks#185),
[databricks#257](databricks#257)).
* Lock around field accessibility changes
([databricks#247](databricks#247)).
* Fix Changelog
([databricks#258](databricks#258)).
* Support post with no body for APIs
([databricks#262](databricks#262)).

API Changes:

* Changed `cancelRefresh()` method for
`workspaceClient.lakehouseMonitors()` service with new required argument
order.
* Changed `create()` method for `workspaceClient.lakehouseMonitors()`
service with new required argument order.
* Changed `delete()` method for `workspaceClient.lakehouseMonitors()`
service with new required argument order.
* Changed `get()` method for `workspaceClient.lakehouseMonitors()`
service with new required argument order.
* Changed `getRefresh()` method for
`workspaceClient.lakehouseMonitors()` service with new required argument
order.
* Changed `listRefreshes()` method for
`workspaceClient.lakehouseMonitors()` service with new required argument
order.
* Changed `runRefresh()` method for
`workspaceClient.lakehouseMonitors()` service with new required argument
order.
* Changed `update()` method for `workspaceClient.lakehouseMonitors()`
service with new required argument order.
* Removed `com.databricks.sdk.service.catalog.AzureManagedIdentity`
class.
* Removed `fullName` field for
`com.databricks.sdk.service.catalog.CancelRefreshRequest`.
* Added `tableName` field for
`com.databricks.sdk.service.catalog.CancelRefreshRequest`.
* Changed `customMetrics` field for
`com.databricks.sdk.service.catalog.CreateMonitor` to
`com.databricks.sdk.service.catalog.MonitorMetricList` class.
* Removed `fullName` field for
`com.databricks.sdk.service.catalog.CreateMonitor`.
* Changed `inferenceLog` field for
`com.databricks.sdk.service.catalog.CreateMonitor` to
`com.databricks.sdk.service.catalog.MonitorInferenceLog` class.
* Changed `notifications` field for
`com.databricks.sdk.service.catalog.CreateMonitor` to
`com.databricks.sdk.service.catalog.MonitorNotifications` class.
* Changed `snapshot` field for
`com.databricks.sdk.service.catalog.CreateMonitor` to `Object` class.
* Changed `timeSeries` field for
`com.databricks.sdk.service.catalog.CreateMonitor` to
`com.databricks.sdk.service.catalog.MonitorTimeSeries` class.
* Added `tableName` field for
`com.databricks.sdk.service.catalog.CreateMonitor`.
* Changed `azureManagedIdentity` field for
`com.databricks.sdk.service.catalog.CreateStorageCredential` to
`com.databricks.sdk.service.catalog.AzureManagedIdentityRequest` class.
* Removed `fullName` field for
`com.databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest`.
* Added `tableName` field for
`com.databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest`.
* Removed `fullName` field for
`com.databricks.sdk.service.catalog.GetLakehouseMonitorRequest`.
* Added `tableName` field for
`com.databricks.sdk.service.catalog.GetLakehouseMonitorRequest`.
* Removed `fullName` field for
`com.databricks.sdk.service.catalog.GetRefreshRequest`.
* Added `tableName` field for
`com.databricks.sdk.service.catalog.GetRefreshRequest`.
* Removed `fullName` field for
`com.databricks.sdk.service.catalog.ListRefreshesRequest`.
* Added `tableName` field for
`com.databricks.sdk.service.catalog.ListRefreshesRequest`.
* Changed `quartzCronExpression` field for
`com.databricks.sdk.service.catalog.MonitorCronSchedule` to be required.
* Changed `timezoneId` field for
`com.databricks.sdk.service.catalog.MonitorCronSchedule` to be required.
* Removed `com.databricks.sdk.service.catalog.MonitorCustomMetric`
class.
* Removed `com.databricks.sdk.service.catalog.MonitorCustomMetricType`
class.
* Removed `com.databricks.sdk.service.catalog.MonitorDestinations`
class.
* Removed
`com.databricks.sdk.service.catalog.MonitorInferenceLogProfileType`
class.
* Removed
`com.databricks.sdk.service.catalog.MonitorInferenceLogProfileTypeProblemType`
class.
* Changed `customMetrics` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to
`com.databricks.sdk.service.catalog.MonitorMetricList` class.
* Changed `driftMetricsTableName` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `inferenceLog` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to
`com.databricks.sdk.service.catalog.MonitorInferenceLog` class.
* Changed `monitorVersion` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `notifications` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to
`com.databricks.sdk.service.catalog.MonitorNotifications` class.
* Changed `profileMetricsTableName` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `snapshot` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to `Object` class.
* Changed `status` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `tableName` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `timeSeries` field for
`com.databricks.sdk.service.catalog.MonitorInfo` to
`com.databricks.sdk.service.catalog.MonitorTimeSeries` class.
* Removed
`com.databricks.sdk.service.catalog.MonitorNotificationsConfig` class.
* Changed `refreshId` field for
`com.databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
* Changed `startTimeMs` field for
`com.databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
* Changed `state` field for
`com.databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
* Added `trigger` field for
`com.databricks.sdk.service.catalog.MonitorRefreshInfo`.
 * Removed `Object` class.
* Removed
`com.databricks.sdk.service.catalog.MonitorTimeSeriesProfileType` class.
* Removed `fullName` field for
`com.databricks.sdk.service.catalog.RunRefreshRequest`.
* Added `tableName` field for
`com.databricks.sdk.service.catalog.RunRefreshRequest`.
* Changed `azureManagedIdentity` field for
`com.databricks.sdk.service.catalog.StorageCredentialInfo` to
`com.databricks.sdk.service.catalog.AzureManagedIdentityResponse` class.
* Removed `name` field for
`com.databricks.sdk.service.catalog.TableRowFilter`.
* Added `functionName` field for
`com.databricks.sdk.service.catalog.TableRowFilter`.
* Changed `customMetrics` field for
`com.databricks.sdk.service.catalog.UpdateMonitor` to
`com.databricks.sdk.service.catalog.MonitorMetricList` class.
* Removed `fullName` field for
`com.databricks.sdk.service.catalog.UpdateMonitor`.
* Changed `inferenceLog` field for
`com.databricks.sdk.service.catalog.UpdateMonitor` to
`com.databricks.sdk.service.catalog.MonitorInferenceLog` class.
* Changed `notifications` field for
`com.databricks.sdk.service.catalog.UpdateMonitor` to
`com.databricks.sdk.service.catalog.MonitorNotifications` class.
* Changed `snapshot` field for
`com.databricks.sdk.service.catalog.UpdateMonitor` to `Object` class.
* Changed `timeSeries` field for
`com.databricks.sdk.service.catalog.UpdateMonitor` to
`com.databricks.sdk.service.catalog.MonitorTimeSeries` class.
* Added `tableName` field for
`com.databricks.sdk.service.catalog.UpdateMonitor`.
* Changed `azureManagedIdentity` field for
`com.databricks.sdk.service.catalog.UpdateStorageCredential` to
`com.databricks.sdk.service.catalog.AzureManagedIdentityResponse` class.
* Changed `azureManagedIdentity` field for
`com.databricks.sdk.service.catalog.ValidateStorageCredential` to
`com.databricks.sdk.service.catalog.AzureManagedIdentityRequest` class.
* Removed `operation` field for
`com.databricks.sdk.service.catalog.ValidationResult`.
* Added `awsOperation` field for
`com.databricks.sdk.service.catalog.ValidationResult`.
* Added `azureOperation` field for
`com.databricks.sdk.service.catalog.ValidationResult`.
* Added `gcpOperation` field for
`com.databricks.sdk.service.catalog.ValidationResult`.
* Removed `com.databricks.sdk.service.catalog.ValidationResultOperation`
class.
* Added `com.databricks.sdk.service.catalog.AzureManagedIdentityRequest`
class.
* Added
`com.databricks.sdk.service.catalog.AzureManagedIdentityResponse` class.
 * Added `com.databricks.sdk.service.catalog.MonitorDestination` class.
 * Added `com.databricks.sdk.service.catalog.MonitorInferenceLog` class.
* Added
`com.databricks.sdk.service.catalog.MonitorInferenceLogProblemType`
class.
 * Added `com.databricks.sdk.service.catalog.MonitorMetric` class.
 * Added `com.databricks.sdk.service.catalog.MonitorMetricType` class.
* Added `com.databricks.sdk.service.catalog.MonitorNotifications` class.
* Added `com.databricks.sdk.service.catalog.MonitorRefreshInfoTrigger`
class.
 * Added `Object` class.
 * Added `com.databricks.sdk.service.catalog.MonitorTimeSeries` class.
* Added
`com.databricks.sdk.service.catalog.ValidationResultAwsOperation` class.
* Added
`com.databricks.sdk.service.catalog.ValidationResultAzureOperation`
class.
* Added
`com.databricks.sdk.service.catalog.ValidationResultGcpOperation` class.
* Added `cloneFrom` field for
`com.databricks.sdk.service.compute.ClusterSpec`.
 * Removed `com.databricks.sdk.service.compute.ComputeSpec` class.
 * Removed `com.databricks.sdk.service.compute.ComputeSpecKind` class.
* Added `cloneFrom` field for
`com.databricks.sdk.service.compute.CreateCluster`.
* Added `cloneFrom` field for
`com.databricks.sdk.service.compute.EditCluster`.
 * Added `com.databricks.sdk.service.compute.CloneCluster` class.
 * Added `com.databricks.sdk.service.compute.Environment` class.
* Changed `update()` method for `accountClient.workspaceAssignment()`
service to return `com.databricks.sdk.service.iam.PermissionAssignment`
class.
 * Removed `Object` class.
* Removed `computeKey` field for
`com.databricks.sdk.service.jobs.ClusterSpec`.
* Removed `compute` field for
`com.databricks.sdk.service.jobs.CreateJob`.
* Added `environments` field for
`com.databricks.sdk.service.jobs.CreateJob`.
 * Removed `com.databricks.sdk.service.jobs.JobCompute` class.
* Removed `compute` field for
`com.databricks.sdk.service.jobs.JobSettings`.
* Added `environments` field for
`com.databricks.sdk.service.jobs.JobSettings`.
* Removed `computeKey` field for
`com.databricks.sdk.service.jobs.RunTask`.
* Removed `com.databricks.sdk.service.jobs.TableTriggerConfiguration`
class.
* Removed `computeKey` field for `com.databricks.sdk.service.jobs.Task`.
* Added `environmentKey` field for
`com.databricks.sdk.service.jobs.Task`.
* Changed `table` field for
`com.databricks.sdk.service.jobs.TriggerSettings` to
`com.databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` class.
* Changed `tableUpdate` field for
`com.databricks.sdk.service.jobs.TriggerSettings` to
`com.databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` class.
 * Added `com.databricks.sdk.service.jobs.JobEnvironment` class.
* Added
`com.databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` class.
 * Added `com.databricks.sdk.service.marketplace` package.

OpenAPI SHA: 94684175b8bd65f8701f89729351f8069e8309c9, Date: 2024-04-11
…bricks#264)

## Changes
This is a duplicate of databricks#249. Had to recreate because of commit signing
issues.
## Changes
After migrating GCP to a new E2 account, several private preview APIs
need to be enabled before we can run integration tests on them. This PR
disables those tests until then (PrivateAccessIT, VPCEndpointsIT).

Also, I noticed that QueriesIT was very slow due to using a very small
page size (2), resulting in it making 1000s of requests serially. This
caused integration tests to take over 2 hours to complete. I increased
this to 1000 which should cause the test to finish a bit more quickly
(on the order of minutes).

## Tests
## Changes
Currently, non-paginated list APIs in the Java SDK simply return the field in the response that contains the listed items. However, when there are no items in the collection being listed, that field can be `null`. This PR changes all list APIs to return a Paginator so that the API response is never `null`.

Resolves databricks#197.
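For example (the compute types shown are assumptions that may differ slightly by SDK version), iterating an empty collection is now safe without a null check:

```java
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.service.compute.ClusterDetails;
import com.databricks.sdk.service.compute.ListClustersRequest;

WorkspaceClient w = new WorkspaceClient();
// list() now returns an iterable Paginator; if the workspace has no clusters,
// the loop simply runs zero times instead of dereferencing a null response field.
for (ClusterDetails c : w.clusters().list(new ListClustersRequest())) {
  System.out.println(c.getClusterName());
}
```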

## Tests
Tested listing clusters using an SP with no access to any clusters,
which passed:
```
10:12 [DEBUG] > GET /api/2.0/clusters/list
< 200 OK
< { }
```
## Changes
A secret was committed to the repo in an example. This PR removes it.
The underlying secret has been revoked so it is not usable any longer.

## Tests
## Changes
databricks#266 unintentionally broke one-shot list APIs because the single page
would never be fetched. This PR forces that first page to be fetched,
restoring the original behavior.

## Tests

- [x] Ran SecretsIT locally, and it worked.
## Changes
Updating SDK to latest OpenAPI specification.

## Tests
Unit tests, will run nightly on the release PR.
auto-merge was automatically disabled April 23, 2024 08:11

Head branch was pushed to by a user without write access

github-merge-queue bot pushed a commit that referenced this pull request Apr 23, 2024
## Changes
Duplicate of #242. Re-created to sign commits. (This PR adds a way to provide proxy configuration in the SDK directly, the ability to pick up proxy settings from system properties, and proxy authentication via the basic or negotiate-kerberos schemes.)

## Tests
Documented in #242. (These changes were tested in an environment consisting of a proxy server with Kerberos authentication hosted on Ubuntu and Windows VMs. The testing details are documented here: https://docs.google.com/document/d/1Zxlfx-R_JytFqMZfQMBfSQ8mj4W-aHa60vw4KdjUkEc/edit)