[PECO-1542] Add support for proxy #242
Commits on Apr 23, 2024
* Commit `452aaba`
* Commit `f08250a`
Add basic support for HTTP proxies (databricks#241)
## Changes
This PR sets `useSystemProperties()` when building the Commons HTTP client. This allows users to configure an HTTP proxy by setting the `https.proxyHost` and `https.proxyPort` parameters (see https://docs.oracle.com/javase/6/docs/technotes/guides/net/proxies.html). Closes databricks#111.

## Tests
Added an example. Started the HTTP proxy from databricks/databricks-sdk-go#825, created a run configuration for the new example that sets `https.proxyHost` to `localhost` and `https.proxyPort` to `8443`, and ran it. I saw the proxy server handle the request, and the example completed successfully.
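The proxy setup above can be sketched with plain JDK calls. This is a minimal illustration with no SDK or Apache HttpClient dependency; the `localhost`/`8443` values simply mirror the run configuration described in the test notes.

```java
public class ProxyConfigExample {
    // Sets the standard JVM proxy properties that an HTTP client built with
    // useSystemProperties() will honor. Equivalent to passing
    // -Dhttps.proxyHost=<host> -Dhttps.proxyPort=<port> on the command line.
    public static void configureProxy(String host, int port) {
        System.setProperty("https.proxyHost", host);
        System.setProperty("https.proxyPort", Integer.toString(port));
    }

    public static void main(String[] args) {
        configureProxy("localhost", 8443);
        System.out.println(System.getProperty("https.proxyHost") + ":"
                + System.getProperty("https.proxyPort"));
    }
}
```

In practice these properties are usually passed as `-D` flags on the JVM command line; an Apache HttpClient built with `useSystemProperties()` picks them up either way.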
* Commit `05c2638`
* Commit `bc4f65e`
Update SDK to latest OpenAPI spec (databricks#245)
## Changes
Regenerate the SDK from the latest OpenAPI specification.

## Tests
Unit tests; nightly tests will run before release.
* Commit `e40a1de`
Release v0.20.0 (databricks#246)
## 0.20.0

### Features and Improvements
* Added basic support for HTTP proxies ([databricks#241](databricks#241)).
* Fixed getWorkspaceClient() for GCP ([databricks#224](databricks#224)).
* Note: Backwards incompatible changes - Settings are now nested, please see the API changes below.

### Internal Changes
* Reading headers should be done in a case-insensitive manner ([databricks#235](databricks#235)).
* Added integration tests for the Files API ([databricks#236](databricks#236)).
* Supported subservices ([databricks#237](databricks#237)).
* Handled empty types in the Java SDK ([databricks#239](databricks#239)).
* Added tokei.rs lines of code badge ([databricks#243](databricks#243)).
* Updated SDK to latest OpenAPI spec ([databricks#245](databricks#245)).

### API Changes
* Added the following services:
  - `workspaceClient.permissionMigration()`
  - `workspaceClient.automaticClusterUpdate()`
  - `workspaceClient.cspEnablement()`
  - `accountClient.cspEnablementAccount()`
  - `workspaceClient.defaultNamespace()`
  - `workspaceClient.esmEnablement()`
  - `accountClient.esmEnablementAccount()`
  - `accountClient.personalCompute()`
  - `workspaceClient.restrictWorkspaceAdmins()`
* Added the following classes:
  - `com.databricks.sdk.service.iam.PermissionMigrationRequest`
  - `com.databricks.sdk.service.iam.PermissionMigrationResponse`
  - `com.databricks.sdk.service.settings.AutomaticClusterUpdateSetting`
  - `com.databricks.sdk.service.settings.ClusterAutoRestartMessage`
  - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageEnablementDetails`
  - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindow`
  - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowDayOfWeek`
  - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWeekDayBasedSchedule`
  - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWeekDayFrequency`
  - `com.databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWindowStartTime`
  - `com.databricks.sdk.service.settings.ComplianceStandard`
  - `com.databricks.sdk.service.settings.CspEnablement`
  - `com.databricks.sdk.service.settings.CspEnablementAccount`
  - `com.databricks.sdk.service.settings.CspEnablementAccountSetting`
  - `com.databricks.sdk.service.settings.CspEnablementSetting`
  - `com.databricks.sdk.service.settings.DeleteDefaultNamespaceRequest`
  - `com.databricks.sdk.service.settings.DeletePersonalComputeRequest`
  - `com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminRequest`
  - `com.databricks.sdk.service.settings.EsmEnablement`
  - `com.databricks.sdk.service.settings.EsmEnablementAccount`
  - `com.databricks.sdk.service.settings.EsmEnablementAccountSetting`
  - `com.databricks.sdk.service.settings.EsmEnablementSetting`
  - `com.databricks.sdk.service.settings.GetAutomaticClusterUpdateRequest`
  - `com.databricks.sdk.service.settings.GetCspEnablementAccountRequest`
  - `com.databricks.sdk.service.settings.GetCspEnablementRequest`
  - `com.databricks.sdk.service.settings.GetDefaultNamespaceRequest`
  - `com.databricks.sdk.service.settings.GetEsmEnablementAccountRequest`
  - `com.databricks.sdk.service.settings.GetEsmEnablementRequest`
  - `com.databricks.sdk.service.settings.GetPersonalComputeRequest`
  - `com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminRequest`
  - `com.databricks.sdk.service.settings.NccAwsStableIpRule`
  - `com.databricks.sdk.service.settings.UpdateAutomaticClusterUpdateSettingRequest`
  - `com.databricks.sdk.service.settings.UpdateCspEnablementAccountSettingRequest`
  - `com.databricks.sdk.service.settings.UpdateCspEnablementSettingRequest`
  - `com.databricks.sdk.service.settings.UpdateEsmEnablementAccountSettingRequest`
  - `com.databricks.sdk.service.settings.UpdateEsmEnablementSettingRequest`
* Removed the following classes:
  - `com.databricks.sdk.service.settings.DeleteDefaultNamespaceSettingRequest`
  - `com.databricks.sdk.service.settings.DeletePersonalComputeSettingRequest`
  - `com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminsSettingRequest`
  - `com.databricks.sdk.service.settings.GetDefaultNamespaceSettingRequest`
  - `com.databricks.sdk.service.settings.GetPersonalComputeSettingRequest`
  - `com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminsSettingRequest`
* Changed `version` field for `com.databricks.sdk.service.serving.AppManifest` to `com.databricks.sdk.service.serving.AnyValue` class.
* Removed `deletePersonalComputeSetting()`, `getPersonalComputeSetting()` and `updatePersonalComputeSetting()` methods for `accountClient.settings()` service.
* Removed `deleteDefaultNamespaceSetting()`, `deleteRestrictWorkspaceAdminsSetting()`, `getDefaultNamespaceSetting()`, `getRestrictWorkspaceAdminsSetting()`, `updateDefaultNamespaceSetting()` and `updateRestrictWorkspaceAdminsSetting()` methods for `workspaceClient.settings()` service.
* Added `awsStableIpRule` field for `com.databricks.sdk.service.settings.NccEgressDefaultRules`.
* Added `indexName` field for `com.databricks.sdk.service.vectorsearch.DeleteDataVectorIndexRequest`.
* Added `embeddingModelEndpointName` field for `com.databricks.sdk.service.vectorsearch.EmbeddingSourceColumn`.
* Added `indexName` field for `com.databricks.sdk.service.vectorsearch.UpsertDataVectorIndexRequest`.
* Added `deltaSyncIndexSpec` field for `com.databricks.sdk.service.vectorsearch.VectorIndex`.
* Added `directAccessIndexSpec` field for `com.databricks.sdk.service.vectorsearch.VectorIndex`.
* Changed `deleteEndpoint()`, `createIndex()`, `deleteDataVectorIndex()` and `upsertDataVectorIndex()` methods for `workspaceClient.vectorSearchEndpoints()` service with new required argument order.
* Changed `endpointName` field for `com.databricks.sdk.service.vectorsearch.CreateVectorIndexRequest` to be required.
* Removed `planningPhases` field for `com.databricks.sdk.service.sql.QueryMetrics`.
* Removed `name` field for `com.databricks.sdk.service.vectorsearch.DeleteDataVectorIndexRequest`.
* Removed `name` field for `com.databricks.sdk.service.vectorsearch.DeleteEndpointRequest`.
* Removed `com.databricks.sdk.service.vectorsearch.EmbeddingConfig` class.
* Removed `embeddingConfig` field for `com.databricks.sdk.service.vectorsearch.EmbeddingSourceColumn`.
* Removed `name` field for `com.databricks.sdk.service.vectorsearch.UpsertDataVectorIndexRequest`.
* Removed `deltaSyncVectorIndexSpec` field for `com.databricks.sdk.service.vectorsearch.VectorIndex`.
* Removed `directAccessVectorIndexSpec` field for `com.databricks.sdk.service.vectorsearch.VectorIndex`.

OpenAPI SHA: d855b30f25a06fe84f25214efa20e7f1fffcdf9e, Date: 2024-03-04
* Commit `bfbc121`
Release v0.21.0 (databricks#250)
### API Changes
* Changed `list()` method for `workspaceClient.catalogs()` service to require request of `com.databricks.sdk.service.catalog.ListCatalogsRequest` class.
* Changed `create()` method for `workspaceClient.onlineTables()` service. New request type is `com.databricks.sdk.service.catalog.CreateOnlineTableRequest` class.
* Removed `com.databricks.sdk.service.catalog.AwsIamRole` class.
* Changed `notifications` field for `com.databricks.sdk.service.catalog.CreateMonitor` to `com.databricks.sdk.service.catalog.MonitorNotificationsConfig` class.
* Changed `awsIamRole` field for `com.databricks.sdk.service.catalog.CreateStorageCredential` to `com.databricks.sdk.service.catalog.AwsIamRoleRequest` class.
* Added `browseOnly` field for `com.databricks.sdk.service.catalog.ExternalLocationInfo`.
* Added `browseOnly` field for `com.databricks.sdk.service.catalog.FunctionInfo`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.GetCatalogRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.GetExternalLocationRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.GetFunctionRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.GetModelVersionRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.GetRegisteredModelRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.GetSchemaRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.GetTableRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.ListExternalLocationsRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.ListFunctionsRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.ListModelVersionsRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.ListRegisteredModelsRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.ListSchemasRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.ListTablesRequest`.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.ListVolumesRequest`.
* Added `browseOnly` field for `com.databricks.sdk.service.catalog.ModelVersionInfo`.
* Changed `notifications` field for `com.databricks.sdk.service.catalog.MonitorInfo` to `com.databricks.sdk.service.catalog.MonitorNotificationsConfig` class.
* Added `includeBrowse` field for `com.databricks.sdk.service.catalog.ReadVolumeRequest`.
* Added `browseOnly` field for `com.databricks.sdk.service.catalog.RegisteredModelInfo`.
* Added `browseOnly` field for `com.databricks.sdk.service.catalog.SchemaInfo`.
* Changed `awsIamRole` field for `com.databricks.sdk.service.catalog.StorageCredentialInfo` to `com.databricks.sdk.service.catalog.AwsIamRoleResponse` class.
* Added `browseOnly` field for `com.databricks.sdk.service.catalog.TableInfo`.
* Changed `notifications` field for `com.databricks.sdk.service.catalog.UpdateMonitor` to `com.databricks.sdk.service.catalog.MonitorNotificationsConfig` class.
* Changed `awsIamRole` field for `com.databricks.sdk.service.catalog.UpdateStorageCredential` to `com.databricks.sdk.service.catalog.AwsIamRoleRequest` class.
* Changed `awsIamRole` field for `com.databricks.sdk.service.catalog.ValidateStorageCredential` to `com.databricks.sdk.service.catalog.AwsIamRoleRequest` class.
* Removed `com.databricks.sdk.service.catalog.ViewData` class.
* Added `browseOnly` field for `com.databricks.sdk.service.catalog.VolumeInfo`.
* Added `com.databricks.sdk.service.catalog.AwsIamRoleRequest` class.
* Added `com.databricks.sdk.service.catalog.AwsIamRoleResponse` class.
* Added `com.databricks.sdk.service.catalog.CreateOnlineTableRequest` class.
* Added `com.databricks.sdk.service.catalog.ListCatalogsRequest` class.
* Changed `publish()` method for `workspaceClient.lakeview()` service to return `com.databricks.sdk.service.dashboards.PublishedDashboard` class.
* Added `create()` method for `workspaceClient.lakeview()` service.
* Added `get()` method for `workspaceClient.lakeview()` service.
* Added `getPublished()` method for `workspaceClient.lakeview()` service.
* Added `trash()` method for `workspaceClient.lakeview()` service.
* Added `update()` method for `workspaceClient.lakeview()` service.
* Removed `Object` class.
* Added `com.databricks.sdk.service.dashboards.CreateDashboardRequest` class.
* Added `com.databricks.sdk.service.dashboards.Dashboard` class.
* Added `com.databricks.sdk.service.dashboards.GetDashboardRequest` class.
* Added `com.databricks.sdk.service.dashboards.GetPublishedDashboardRequest` class.
* Added `com.databricks.sdk.service.dashboards.LifecycleState` class.
* Added `com.databricks.sdk.service.dashboards.PublishedDashboard` class.
* Added `com.databricks.sdk.service.dashboards.TrashDashboardRequest` class.
* Added `Object` class.
* Added `com.databricks.sdk.service.dashboards.UpdateDashboardRequest` class.
* Added `autoCaptureConfig` field for `com.databricks.sdk.service.serving.EndpointPendingConfig`.
* Changed `get()` method for `workspaceClient.automaticClusterUpdate()` service. New request type is `com.databricks.sdk.service.settings.GetAutomaticClusterUpdateSettingRequest` class.
* Changed `get()` method for `workspaceClient.cspEnablement()` service. New request type is `com.databricks.sdk.service.settings.GetCspEnablementSettingRequest` class.
* Changed `get()` method for `accountClient.cspEnablementAccount()` service. New request type is `com.databricks.sdk.service.settings.GetCspEnablementAccountSettingRequest` class.
* Changed `delete()` method for `workspaceClient.defaultNamespace()` service. New request type is `com.databricks.sdk.service.settings.DeleteDefaultNamespaceSettingRequest` class.
* Changed `get()` method for `workspaceClient.defaultNamespace()` service. New request type is `com.databricks.sdk.service.settings.GetDefaultNamespaceSettingRequest` class.
* Changed `get()` method for `workspaceClient.esmEnablement()` service. New request type is `com.databricks.sdk.service.settings.GetEsmEnablementSettingRequest` class.
* Changed `get()` method for `accountClient.esmEnablementAccount()` service. New request type is `com.databricks.sdk.service.settings.GetEsmEnablementAccountSettingRequest` class.
* Changed `get()` method for `workspaceClient.ipAccessLists()` service. New request type is `com.databricks.sdk.service.settings.GetIpAccessList` class.
* Changed `delete()` method for `accountClient.personalCompute()` service. New request type is `com.databricks.sdk.service.settings.DeletePersonalComputeSettingRequest` class.
* Changed `get()` method for `accountClient.personalCompute()` service. New request type is `com.databricks.sdk.service.settings.GetPersonalComputeSettingRequest` class.
* Changed `delete()` method for `workspaceClient.restrictWorkspaceAdmins()` service. New request type is `com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminsSettingRequest` class.
* Changed `get()` method for `workspaceClient.restrictWorkspaceAdmins()` service. New request type is `com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminsSettingRequest` class.
* Removed `com.databricks.sdk.service.settings.DeleteDefaultNamespaceRequest` class.
* Removed `com.databricks.sdk.service.settings.DeletePersonalComputeRequest` class.
* Removed `com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminRequest` class.
* Removed `com.databricks.sdk.service.settings.GetAutomaticClusterUpdateRequest` class.
* Removed `com.databricks.sdk.service.settings.GetCspEnablementAccountRequest` class.
* Removed `com.databricks.sdk.service.settings.GetCspEnablementRequest` class.
* Removed `com.databricks.sdk.service.settings.GetDefaultNamespaceRequest` class.
* Removed `com.databricks.sdk.service.settings.GetEsmEnablementAccountRequest` class.
* Removed `com.databricks.sdk.service.settings.GetEsmEnablementRequest` class.
* Removed `com.databricks.sdk.service.settings.GetIpAccessListRequest` class.
* Removed `com.databricks.sdk.service.settings.GetPersonalComputeRequest` class.
* Removed `com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminRequest` class.
* Added `com.databricks.sdk.service.settings.DeleteDefaultNamespaceSettingRequest` class.
* Added `com.databricks.sdk.service.settings.DeletePersonalComputeSettingRequest` class.
* Added `com.databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminsSettingRequest` class.
* Added `com.databricks.sdk.service.settings.GetAutomaticClusterUpdateSettingRequest` class.
* Added `com.databricks.sdk.service.settings.GetCspEnablementAccountSettingRequest` class.
* Added `com.databricks.sdk.service.settings.GetCspEnablementSettingRequest` class.
* Added `com.databricks.sdk.service.settings.GetDefaultNamespaceSettingRequest` class.
* Added `com.databricks.sdk.service.settings.GetEsmEnablementAccountSettingRequest` class.
* Added `com.databricks.sdk.service.settings.GetEsmEnablementSettingRequest` class.
* Added `com.databricks.sdk.service.settings.GetIpAccessList` class.
* Added `com.databricks.sdk.service.settings.GetPersonalComputeSettingRequest` class.
* Added `com.databricks.sdk.service.settings.GetRestrictWorkspaceAdminsSettingRequest` class.
* Changed `dataObjectType` field for `com.databricks.sdk.service.sharing.SharedDataObject` to `com.databricks.sdk.service.sharing.SharedDataObjectDataObjectType` class.
* Added `content` field for `com.databricks.sdk.service.sharing.SharedDataObject`.
* Added `com.databricks.sdk.service.sharing.SharedDataObjectDataObjectType` class.
* Added `embeddingSourceColumns` field for `com.databricks.sdk.service.vectorsearch.DirectAccessVectorIndexSpec`.
* Added `scoreThreshold` field for `com.databricks.sdk.service.vectorsearch.QueryVectorIndexRequest`.

OpenAPI SHA: 93763b0d7ae908520c229c786fff28b8fd623261, Date: 2024-03-20
* Commit `6447fc7`
* Commit `6cea2eb`
* Commit `bbc41ae`
* Commit `0a4f113`
* Commit `126bf73`
* Commit `be52b65`
* Commit `3b79ab4`
* Commit `480e55e`
Properly escape multi-segment path parameters (databricks#252)
## Changes
Ports databricks/databricks-sdk-go#869 to the Java SDK.

Currently, path parameters are directly interpolated into the request URL without escaping. This means that characters like `/`, `?` and `#` are not percent-encoded and will affect the semantics of the URL, starting a new path segment, the query parameters, or the fragment, respectively. As a result, users of the Files API cannot upload or download objects whose names contain `?` or `#`. `/` is allowed in the path of the Files API, so it does not need to be escaped. The Files API is marked with `x-databricks-multi-segment`, indicating that its path parameter may contain `/` characters while all other characters must be percent-encoded. This PR implements that behavior.

## Tests
- [x] Unit test for multi-segment path escaping behavior.
- [x] Updated integration test to use `#` and `?` symbols in the file name.
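The escaping rule described above can be sketched as follows. This is a simplified stand-in for illustration, not the SDK's actual implementation, and the class and method names are hypothetical.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class MultiSegmentEncoder {
    // Percent-encodes every segment of a multi-segment path parameter while
    // preserving '/' as a segment separator, so characters like '#' and '?'
    // cannot change the URL's path/query/fragment structure.
    public static String encode(String path) {
        String[] segments = path.split("/", -1); // -1 keeps trailing empty segments
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < segments.length; i++) {
            if (i > 0) {
                sb.append('/');
            }
            // URLEncoder targets form encoding, which maps ' ' to '+';
            // rewrite that to '%20' for use inside a URL path.
            sb.append(URLEncoder.encode(segments[i], StandardCharsets.UTF_8)
                    .replace("+", "%20"));
        }
        return sb.toString();
    }
}
```

Splitting on `/` before encoding keeps the multi-segment separator intact while `URLEncoder` percent-encodes everything else within each segment.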
* Commit `243392f`
Release v0.22.0 (databricks#253)
### Improvements and Bug Fixes
* Properly escape multi-segment path parameters ([databricks#252](databricks#252)).

### API Changes
* Added `Migrate` and `Unpublish` methods for [w.Lakeview](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#LakeviewAPI) workspace-level service.
* Added [dashboards.MigrateDashboardRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#MigrateDashboardRequest).
* Added [dashboards.UnpublishDashboardRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#UnpublishDashboardRequest).
* Added `Description`, `QueueDuration` and `RepairHistory` fields for [jobs.BaseRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun).
* Added `ComputeKey` and `JobClusterKey` fields for [jobs.ClusterSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ClusterSpec).
* Changed `Left`, `Op` and `Right` fields for [jobs.ConditionTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ConditionTask) to be required.
* Changed `EditMode` field for [jobs.CreateJob](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJob) to [jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Replaced [jobs.CreateJobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJobEditMode) with [jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Changed `Url` field for [jobs.FileArrivalTriggerConfiguration](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#FileArrivalTriggerConfiguration) to be required.
* Changed `ErrorMessageStats` field for [jobs.ForEachStats](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ForEachStats) to [jobs.ForEachTaskErrorMessageStatsList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ForEachTaskErrorMessageStatsList).
* Changed `NewCluster` field for [jobs.JobCluster](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobCluster) to be required.
* Changed `EditMode` field for [jobs.JobSettings](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettings) to [jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Replaced [jobs.JobSettingsEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettingsEditMode) with [jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Changed `Metric`, `Op` and `Value` fields for [jobs.JobsHealthRule](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobsHealthRule) to be required.
* Changed `RunType` field for [jobs.ListRunsRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListRunsRequest) to [jobs.RunType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunType).
* Removed [jobs.ListRunsRunType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListRunsRunType).
* Removed [jobs.ParamPairs](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ParamPairs).
* Changed `PipelineId` field for [jobs.PipelineTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PipelineTask) to be required.
* Changed `EntryPoint` and `PackageName` fields for [jobs.PythonWheelTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PythonWheelTask) to be required.
* Changed `JobParameters` field for [jobs.RepairRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RepairRun) to map[string]`string`.
* Changed `BaseParameters` field for [jobs.ResolvedNotebookTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedNotebookTaskValues) to map[string]`string`.
* Changed `Parameters` field for [jobs.ResolvedParamPairValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedParamPairValues) to map[string]`string`.
* Changed `NamedParameters` field for [jobs.ResolvedPythonWheelTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedPythonWheelTaskValues) to map[string]`string`.
* Removed `NamedParameters` field for [jobs.ResolvedRunJobTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedRunJobTaskValues).
* Changed `Parameters` field for [jobs.ResolvedRunJobTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedRunJobTaskValues) to map[string]`string`.
* Added `JobParameters` field for [jobs.ResolvedRunJobTaskValues](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ResolvedRunJobTaskValues).
* Added `Description` field for [jobs.Run](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run).
* Added `QueueDuration` field for [jobs.Run](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run).
* Changed `Op` field for [jobs.RunConditionTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunConditionTask) to [jobs.ConditionTaskOp](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ConditionTaskOp).
* Removed [jobs.RunConditionTaskOp](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunConditionTaskOp).
* Changed `Inputs` and `Task` fields for [jobs.RunForEachTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunForEachTask) to be required.
* Changed `JobParameters` field for [jobs.RunJobTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunJobTask) to map[string]`string`.
* Added `DbtCommands`, `JarParams`, `NotebookParams`, `PipelineParams`, `PythonNamedParams`, `PythonParams`, `SparkSubmitParams` and `SqlParams` fields for [jobs.RunJobTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunJobTask).
* Changed `JobParameters` field for [jobs.RunNow](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunNow) to map[string]`string`.
* Added `Info` field for [jobs.RunOutput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunOutput).
* Removed `JobParameters` field for [jobs.RunParameters](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunParameters).
* Changed `TaskKey` field for [jobs.RunTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask) to be required.
* Added `ComputeKey`, `EmailNotifications`, `JobClusterKey`, `NotificationSettings`, `RunDuration`, `RunPageUrl`, `TimeoutSeconds` and `WebhookNotifications` fields for [jobs.RunTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask).
* Added `EndpointId` field for [jobs.SqlQueryOutput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SqlQueryOutput).
* Added `ConditionTask` field for [jobs.SubmitRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun).
* Added `DbtCommands`, `JarParams`, `NotebookParams`, `PipelineParams`, `PythonNamedParams`, `PythonParams`, `SparkSubmitParams` and `SqlParams` fields for [jobs.SubmitRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun).
* Added `Description` field for [jobs.SubmitTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitTask).
* Added `DisableAutoOptimization` field for [jobs.Task](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Task).
* Added `NoAlertForSkippedRuns` field for [jobs.TaskEmailNotifications](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TaskEmailNotifications).
* Added `TableUpdate` field for [jobs.TriggerSettings](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TriggerSettings).
* Changed `Id` field for [jobs.Webhook](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Webhook) to be required.
* Changed `OnDurationWarningThresholdExceeded` field for [jobs.WebhookNotifications](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookNotifications) to [jobs.WebhookList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookList).
* Removed [jobs.WebhookNotificationsOnDurationWarningThresholdExceededItem](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookNotificationsOnDurationWarningThresholdExceededItem).
* Added [jobs.JobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEditMode).
* Removed [serving.AwsBedrockConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AwsBedrockConfig).
* Removed [serving.AwsBedrockConfigBedrockProvider](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AwsBedrockConfigBedrockProvider).
* Removed `AwsBedrockConfig` field for [serving.ExternalModel](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel).
* Added `AmazonBedrockConfig` field for [serving.ExternalModel](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel).
* Added [serving.AmazonBedrockConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AmazonBedrockConfig).
* Added [serving.AmazonBedrockConfigBedrockProvider](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AmazonBedrockConfigBedrockProvider).
* Changed `Get` method for [w.IpAccessLists](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#IpAccessListsAPI) workspace-level service. New request type is [settings.GetIpAccessListRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#GetIpAccessListRequest).
* Renamed [settings.GetIpAccessList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#GetIpAccessList) to [settings.GetIpAccessListRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#GetIpAccessListRequest).

OpenAPI SHA: d38528c3e47dd81c9bdbd918272a3e49d36e09ce, Date: 2024-03-27
* Commit `bb0adcc`
Lock around field accessibility changes (databricks#247)
Adjusts the field accessibility changes so they are protected behind a lock. Epic ran into this when concurrently creating WorkspaceClients: threads would often read properties that another thread had reset to inaccessible.

## Changes
Synchronize on the `field` object before mutating and reading it, so that accessibility is maintained throughout the code block.

## Tests
No new tests. Nothing new was added, so the existing tests should confirm the config-reading logic is correct. Testing multithreaded races in this case would prove difficult.
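The locking idea can be sketched like this. The class, the demo `Config` target, and the method name are hypothetical illustrations, not the SDK's actual code.

```java
import java.lang.reflect.Field;

public class LockedFieldReader {
    /** Small demo target with a private field. */
    public static class Config {
        private final String host = "example.com";
    }

    // Reads a field's value while holding a lock on the shared Field instance,
    // so another thread cannot flip the accessibility flag back mid-read. Note
    // this only helps if every thread synchronizes on the SAME Field object:
    // Class#getDeclaredField returns a fresh copy on each call.
    public static Object read(Field field, Object target) throws IllegalAccessException {
        synchronized (field) {
            boolean wasAccessible = field.canAccess(target);
            try {
                field.setAccessible(true);
                return field.get(target);
            } finally {
                field.setAccessible(wasAccessible);
            }
        }
    }
}
```

Restoring the previous accessibility in the `finally` block keeps the flag consistent even when `field.get()` throws.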
Commit 70847f8
Introduce more specific exceptions, like `NotFound`, `AlreadyExists`, `BadRequest`, `PermissionDenied`, `InternalError`, and others (databricks#185)

See implementations in other SDKs:
- Go: databricks/databricks-sdk-go#682
- Python: databricks/databricks-sdk-py#376

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
Co-authored-by: Tanmay Rustagi <tanmay.rustagi@databricks.com>
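The pattern can be illustrated with a minimal, self-contained sketch. The class names mirror the ones the PR introduces, but the mapping logic below is hypothetical, not the SDK's implementation:

```java
// Hypothetical sketch of mapping HTTP status codes to a typed exception
// hierarchy, so callers can catch specific failure modes instead of
// inspecting status codes themselves.
public class ErrorMapper {
    public static class DatabricksError extends RuntimeException {
        public DatabricksError(String msg) { super(msg); }
    }
    public static class NotFound extends DatabricksError {
        public NotFound(String msg) { super(msg); }
    }
    public static class PermissionDenied extends DatabricksError {
        public PermissionDenied(String msg) { super(msg); }
    }

    // Translate a response status into the most specific exception available.
    static DatabricksError fromStatus(int status, String message) {
        switch (status) {
            case 403: return new PermissionDenied(message);
            case 404: return new NotFound(message);
            default:  return new DatabricksError(message);
        }
    }

    public static void main(String[] args) {
        try {
            throw fromStatus(404, "cluster does not exist");
        } catch (NotFound e) {
            System.out.println("caught NotFound: " + e.getMessage());
        }
    }
}
```

Because the specific types extend a common base, existing `catch (DatabricksError e)` handlers keep working while new code can catch `NotFound` directly.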
Commit a21882f
Fix Changelog (databricks#258)
## Changes
Fix Changelog

## Tests
N/A
Commit 84b9fa2
Support post with no body for APIs (databricks#262)
## Changes
The upcoming Marketplace API introduces an API to create an analytics dashboard. This creation request is effectively a singleton, as it has no parameters. To support this, we introduce a `POST()` method with no body specified so that this resource can be created.
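At the HTTP level, "POST with no body" is an empty-payload request. The sketch below shows the concept with the JDK's built-in client rather than the SDK's internal `ApiClient`; the URL is a placeholder, and the request is built but never sent:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Illustrative only: a POST request carrying no payload, built with the
// JDK 11+ java.net.http API. Not the SDK's code.
public class EmptyPost {
    static long emptyPostLength(String url) {
        HttpRequest req = HttpRequest.newBuilder()
                .uri(URI.create(url))
                // noBody() sends no payload (Content-Length: 0).
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();
        return req.bodyPublisher().get().contentLength();
    }

    public static void main(String[] args) {
        // A placeholder endpoint; the request is only constructed, not sent.
        System.out.println("POST body length: " + emptyPostLength("https://example.com/api/2.0/resource"));
    }
}
```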
Commit 972b304
Override INVALID_PARAMETER_VALUE on fetching non-existent job/cluster (databricks#257)

## Changes
Ports databricks/databricks-sdk-go#864 to the Java SDK. Most services use the `RESOURCE_DOES_NOT_EXIST` error code with a 404 status code to indicate that a resource doesn't exist. However, for legacy reasons, the Jobs and Clusters services use the `INVALID_PARAMETER_VALUE` error code with a 400 status code instead. This makes tools like Terraform and UCX difficult to maintain, as these services need different error-handling logic. We can't change these behaviors on the server, as customers already depend on the raw HTTP response status and contents, so this PR corrects these errors in the SDK itself. SDK users can then write

```java
try {
    BaseJob job = w.jobs().get("123");
} catch (ResourceDoesNotExist e) {
    ...
}
```

just as you would expect from other resources. Updated the README with more information about this as well.

## Tests
Added unit tests for error overrides. Added/updated the integration tests for Clusters and Jobs.

- [x] `make test` passing
- [x] `make fmt` applied
- [x] relevant integration tests applied
Commit 4ab4fd0
Release v0.23.0 (databricks#263)
### Improvements and Bug Fixes
* Introduce more specific exceptions, like `NotFound`, `AlreadyExists`, `BadRequest`, `PermissionDenied`, `InternalError`, and others ([databricks#185](databricks#185), [databricks#257](databricks#257)).
* Lock around field accessibility changes ([databricks#247](databricks#247)).
* Fix Changelog ([databricks#258](databricks#258)).
* Support post with no body for APIs ([databricks#262](databricks#262)).

API Changes:
* Changed `cancelRefresh()` method for `workspaceClient.lakehouseMonitors()` service with new required argument order.
* Changed `create()` method for `workspaceClient.lakehouseMonitors()` service with new required argument order.
* Changed `delete()` method for `workspaceClient.lakehouseMonitors()` service with new required argument order.
* Changed `get()` method for `workspaceClient.lakehouseMonitors()` service with new required argument order.
* Changed `getRefresh()` method for `workspaceClient.lakehouseMonitors()` service with new required argument order.
* Changed `listRefreshes()` method for `workspaceClient.lakehouseMonitors()` service with new required argument order.
* Changed `runRefresh()` method for `workspaceClient.lakehouseMonitors()` service with new required argument order.
* Changed `update()` method for `workspaceClient.lakehouseMonitors()` service with new required argument order.
* Removed `com.databricks.sdk.service.catalog.AzureManagedIdentity` class.
* Removed `fullName` field for `com.databricks.sdk.service.catalog.CancelRefreshRequest`.
* Added `tableName` field for `com.databricks.sdk.service.catalog.CancelRefreshRequest`.
* Changed `customMetrics` field for `com.databricks.sdk.service.catalog.CreateMonitor` to `com.databricks.sdk.service.catalog.MonitorMetricList` class.
* Removed `fullName` field for `com.databricks.sdk.service.catalog.CreateMonitor`.
* Changed `inferenceLog` field for `com.databricks.sdk.service.catalog.CreateMonitor` to `com.databricks.sdk.service.catalog.MonitorInferenceLog` class.
* Changed `notifications` field for `com.databricks.sdk.service.catalog.CreateMonitor` to `com.databricks.sdk.service.catalog.MonitorNotifications` class.
* Changed `snapshot` field for `com.databricks.sdk.service.catalog.CreateMonitor` to `Object` class.
* Changed `timeSeries` field for `com.databricks.sdk.service.catalog.CreateMonitor` to `com.databricks.sdk.service.catalog.MonitorTimeSeries` class.
* Added `tableName` field for `com.databricks.sdk.service.catalog.CreateMonitor`.
* Changed `azureManagedIdentity` field for `com.databricks.sdk.service.catalog.CreateStorageCredential` to `com.databricks.sdk.service.catalog.AzureManagedIdentityRequest` class.
* Removed `fullName` field for `com.databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest`.
* Added `tableName` field for `com.databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest`.
* Removed `fullName` field for `com.databricks.sdk.service.catalog.GetLakehouseMonitorRequest`.
* Added `tableName` field for `com.databricks.sdk.service.catalog.GetLakehouseMonitorRequest`.
* Removed `fullName` field for `com.databricks.sdk.service.catalog.GetRefreshRequest`.
* Added `tableName` field for `com.databricks.sdk.service.catalog.GetRefreshRequest`.
* Removed `fullName` field for `com.databricks.sdk.service.catalog.ListRefreshesRequest`.
* Added `tableName` field for `com.databricks.sdk.service.catalog.ListRefreshesRequest`.
* Changed `quartzCronExpression` field for `com.databricks.sdk.service.catalog.MonitorCronSchedule` to be required.
* Changed `timezoneId` field for `com.databricks.sdk.service.catalog.MonitorCronSchedule` to be required.
* Removed `com.databricks.sdk.service.catalog.MonitorCustomMetric` class.
* Removed `com.databricks.sdk.service.catalog.MonitorCustomMetricType` class.
* Removed `com.databricks.sdk.service.catalog.MonitorDestinations` class.
* Removed `com.databricks.sdk.service.catalog.MonitorInferenceLogProfileType` class.
* Removed `com.databricks.sdk.service.catalog.MonitorInferenceLogProfileTypeProblemType` class.
* Changed `customMetrics` field for `com.databricks.sdk.service.catalog.MonitorInfo` to `com.databricks.sdk.service.catalog.MonitorMetricList` class.
* Changed `driftMetricsTableName` field for `com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `inferenceLog` field for `com.databricks.sdk.service.catalog.MonitorInfo` to `com.databricks.sdk.service.catalog.MonitorInferenceLog` class.
* Changed `monitorVersion` field for `com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `notifications` field for `com.databricks.sdk.service.catalog.MonitorInfo` to `com.databricks.sdk.service.catalog.MonitorNotifications` class.
* Changed `profileMetricsTableName` field for `com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `snapshot` field for `com.databricks.sdk.service.catalog.MonitorInfo` to `Object` class.
* Changed `status` field for `com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `tableName` field for `com.databricks.sdk.service.catalog.MonitorInfo` to be required.
* Changed `timeSeries` field for `com.databricks.sdk.service.catalog.MonitorInfo` to `com.databricks.sdk.service.catalog.MonitorTimeSeries` class.
* Removed `com.databricks.sdk.service.catalog.MonitorNotificationsConfig` class.
* Changed `refreshId` field for `com.databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
* Changed `startTimeMs` field for `com.databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
* Changed `state` field for `com.databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
* Added `trigger` field for `com.databricks.sdk.service.catalog.MonitorRefreshInfo`.
* Removed `Object` class.
* Removed `com.databricks.sdk.service.catalog.MonitorTimeSeriesProfileType` class.
* Removed `fullName` field for `com.databricks.sdk.service.catalog.RunRefreshRequest`.
* Added `tableName` field for `com.databricks.sdk.service.catalog.RunRefreshRequest`.
* Changed `azureManagedIdentity` field for `com.databricks.sdk.service.catalog.StorageCredentialInfo` to `com.databricks.sdk.service.catalog.AzureManagedIdentityResponse` class.
* Removed `name` field for `com.databricks.sdk.service.catalog.TableRowFilter`.
* Added `functionName` field for `com.databricks.sdk.service.catalog.TableRowFilter`.
* Changed `customMetrics` field for `com.databricks.sdk.service.catalog.UpdateMonitor` to `com.databricks.sdk.service.catalog.MonitorMetricList` class.
* Removed `fullName` field for `com.databricks.sdk.service.catalog.UpdateMonitor`.
* Changed `inferenceLog` field for `com.databricks.sdk.service.catalog.UpdateMonitor` to `com.databricks.sdk.service.catalog.MonitorInferenceLog` class.
* Changed `notifications` field for `com.databricks.sdk.service.catalog.UpdateMonitor` to `com.databricks.sdk.service.catalog.MonitorNotifications` class.
* Changed `snapshot` field for `com.databricks.sdk.service.catalog.UpdateMonitor` to `Object` class.
* Changed `timeSeries` field for `com.databricks.sdk.service.catalog.UpdateMonitor` to `com.databricks.sdk.service.catalog.MonitorTimeSeries` class.
* Added `tableName` field for `com.databricks.sdk.service.catalog.UpdateMonitor`.
* Changed `azureManagedIdentity` field for `com.databricks.sdk.service.catalog.UpdateStorageCredential` to `com.databricks.sdk.service.catalog.AzureManagedIdentityResponse` class.
* Changed `azureManagedIdentity` field for `com.databricks.sdk.service.catalog.ValidateStorageCredential` to `com.databricks.sdk.service.catalog.AzureManagedIdentityRequest` class.
* Removed `operation` field for `com.databricks.sdk.service.catalog.ValidationResult`.
* Added `awsOperation` field for `com.databricks.sdk.service.catalog.ValidationResult`.
* Added `azureOperation` field for `com.databricks.sdk.service.catalog.ValidationResult`.
* Added `gcpOperation` field for `com.databricks.sdk.service.catalog.ValidationResult`.
* Removed `com.databricks.sdk.service.catalog.ValidationResultOperation` class.
* Added `com.databricks.sdk.service.catalog.AzureManagedIdentityRequest` class.
* Added `com.databricks.sdk.service.catalog.AzureManagedIdentityResponse` class.
* Added `com.databricks.sdk.service.catalog.MonitorDestination` class.
* Added `com.databricks.sdk.service.catalog.MonitorInferenceLog` class.
* Added `com.databricks.sdk.service.catalog.MonitorInferenceLogProblemType` class.
* Added `com.databricks.sdk.service.catalog.MonitorMetric` class.
* Added `com.databricks.sdk.service.catalog.MonitorMetricType` class.
* Added `com.databricks.sdk.service.catalog.MonitorNotifications` class.
* Added `com.databricks.sdk.service.catalog.MonitorRefreshInfoTrigger` class.
* Added `Object` class.
* Added `com.databricks.sdk.service.catalog.MonitorTimeSeries` class.
* Added `com.databricks.sdk.service.catalog.ValidationResultAwsOperation` class.
* Added `com.databricks.sdk.service.catalog.ValidationResultAzureOperation` class.
* Added `com.databricks.sdk.service.catalog.ValidationResultGcpOperation` class.
* Added `cloneFrom` field for `com.databricks.sdk.service.compute.ClusterSpec`.
* Removed `com.databricks.sdk.service.compute.ComputeSpec` class.
* Removed `com.databricks.sdk.service.compute.ComputeSpecKind` class.
* Added `cloneFrom` field for `com.databricks.sdk.service.compute.CreateCluster`.
* Added `cloneFrom` field for `com.databricks.sdk.service.compute.EditCluster`.
* Added `com.databricks.sdk.service.compute.CloneCluster` class.
* Added `com.databricks.sdk.service.compute.Environment` class.
* Changed `update()` method for `accountClient.workspaceAssignment()` service to return `com.databricks.sdk.service.iam.PermissionAssignment` class.
* Removed `Object` class.
* Removed `computeKey` field for `com.databricks.sdk.service.jobs.ClusterSpec`.
* Removed `compute` field for `com.databricks.sdk.service.jobs.CreateJob`.
* Added `environments` field for `com.databricks.sdk.service.jobs.CreateJob`.
* Removed `com.databricks.sdk.service.jobs.JobCompute` class.
* Removed `compute` field for `com.databricks.sdk.service.jobs.JobSettings`.
* Added `environments` field for `com.databricks.sdk.service.jobs.JobSettings`.
* Removed `computeKey` field for `com.databricks.sdk.service.jobs.RunTask`.
* Removed `com.databricks.sdk.service.jobs.TableTriggerConfiguration` class.
* Removed `computeKey` field for `com.databricks.sdk.service.jobs.Task`.
* Added `environmentKey` field for `com.databricks.sdk.service.jobs.Task`.
* Changed `table` field for `com.databricks.sdk.service.jobs.TriggerSettings` to `com.databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` class.
* Changed `tableUpdate` field for `com.databricks.sdk.service.jobs.TriggerSettings` to `com.databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` class.
* Added `com.databricks.sdk.service.jobs.JobEnvironment` class.
* Added `com.databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` class.
* Added `com.databricks.sdk.service.marketplace` package.

OpenAPI SHA: 94684175b8bd65f8701f89729351f8069e8309c9, Date: 2024-04-11
Commit 2a57a7b
[PECO-1008] Add retry strategy based on idempotency of requests (databricks#264)

## Changes
This is a duplicate of databricks#249. Had to recreate because of commit signing issues.
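The core decision behind idempotency-based retries can be sketched in a few lines. This is an illustrative example, not the SDK's implementation: the method set and the `requestSent` signal are assumptions about how such a policy is typically framed.

```java
import java.util.Set;

// Hypothetical sketch of an idempotency-aware retry check: idempotent HTTP
// methods are safe to retry on transient failures; non-idempotent ones are
// only safe when we know the request never reached the server.
public class RetryPolicy {
    // Methods defined as idempotent by HTTP semantics (RFC 9110).
    private static final Set<String> IDEMPOTENT =
            Set.of("GET", "HEAD", "PUT", "DELETE", "OPTIONS");

    static boolean isRetryable(String method, int status, boolean requestSent) {
        boolean transientFailure = status == 429 || status >= 500;
        if (!transientFailure) return false;
        // A request that provably never reached the server is always safe to retry.
        if (!requestSent) return true;
        // Otherwise, retry only when replaying the request cannot change state.
        return IDEMPOTENT.contains(method);
    }

    public static void main(String[] args) {
        System.out.println(isRetryable("GET", 503, true));   // true
        System.out.println(isRetryable("POST", 503, true));  // false
    }
}
```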
Commit 8bdc601
Fix remaining Java integration tests (databricks#265)
## Changes
After migrating GCP to a new E2 account, several private preview APIs need to be enabled before we can run integration tests on them. This PR disables those tests until then (PrivateAccessIT, VPCEndpointsIT).

Also, I noticed that QueriesIT was very slow due to using a very small page size (2), resulting in it making thousands of requests serially. This caused integration tests to take over 2 hours to complete. I increased the page size to 1000, which should let the test finish in minutes rather than hours.
Commit c6a0ff7
Commit b7fa8df
Commit b41ed29
Commit 610533e
Commit 009a458
Commit e927632
Commit cd86996
Commit 983526f
Commit 3bd1715
Commit 6d6e2ec
Fix one-shot list APIs to not return null (databricks#266)
## Changes
Currently, non-paginated list APIs in the Java SDK simply return the field in the response containing the listed items; when the collection being listed is empty, that field is `null`. This PR changes all list APIs to return a Paginator so that the API response is never `null`. Resolves databricks#197.

## Tests
Tested listing clusters using an SP with no access to any clusters, which passed:

```
10:12 [DEBUG] > GET /api/2.0/clusters/list
< 200 OK
< { }
```
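The null-safety idea can be shown in isolation. This is a hypothetical sketch, not the SDK's Paginator: a JSON response that omits the items field deserializes to `null`, so the wrapper substitutes an empty list before callers iterate.

```java
import java.util.Collections;
import java.util.List;

// Hypothetical sketch: guarantee callers always get an iterable collection,
// even when the server's response omitted the items field entirely.
public class SafeList {
    static <T> List<T> orEmpty(List<T> maybeNull) {
        return maybeNull == null ? Collections.emptyList() : maybeNull;
    }

    public static void main(String[] args) {
        // Simulates a "{ }" API response, where the field deserialized to null.
        List<String> clusters = orEmpty(null);
        for (String c : clusters) {
            System.out.println(c); // loop body never runs; no NullPointerException
        }
        System.out.println("iterated safely over " + clusters.size() + " items");
    }
}
```

Without the wrapper, the for-each loop above would throw a `NullPointerException` on the empty response.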
Commit e2ec646
Remove unnecessary secret from example (databricks#267)
## Changes
A secret was committed to the repo in an example. This PR removes it. The underlying secret has been revoked, so it is no longer usable.
Commit 4677878
Fix one shot pagination (databricks#268)
## Changes
databricks#266 unintentionally broke one-shot list APIs because the single page would never be fetched. This PR forces that first page to be fetched, restoring the original behavior.

## Tests
- [x] Ran SecretsIT locally, and it worked.
Commit 37c161a
Update SDK to OpenAPI spec (databricks#269)
## Changes
Updating the SDK to the latest OpenAPI specification.

## Tests
Unit tests; nightly tests will run on the release PR.
Commit a2549d9