[FEATURE] Recover on download failures #844

Closed
ksafonov-db opened this issue Dec 30, 2024 · 0 comments
Labels
Feature Request: The issue is a request for enhancement or new functionality rather than a bug.

Comments

@ksafonov-db
Contributor

Problem Statement
Currently, a download using the Files API client fails on a network error or server timeout.

Proposed Solution
Make the Files API client automatically reconnect to the server and resume the download after a temporary failure.
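
For context, a download with the Python SDK's Files API client looks roughly like the sketch below (the volume path and destination file are illustrative). Today, a network error or server timeout while reading the response stream surfaces as an exception and the whole download must be restarted from scratch.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Illustrative path to a file in a Unity Catalog volume.
resp = w.files.download("/Volumes/main/default/my_volume/data.csv")

with open("data.csv", "wb") as out:
    # Reading the stream can raise on a network error or server timeout,
    # which currently aborts the entire download.
    for chunk in iter(lambda: resp.contents.read(1024 * 1024), b""):
        out.write(chunk)
```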

ksafonov-db added a commit to ksafonov-db/databricks-sdk-py that referenced this issue Jan 8, 2025
Signed-off-by: Kirill Safonov <kirill.safonov@databricks.com>
github-merge-queue bot pushed a commit that referenced this issue Jan 8, 2025
## What changes are proposed in this pull request?

1. Extends the Files API client to support resuming downloads on failure. The new implementation tracks the current offset in the input stream and, when an error occurs, issues a new download request starting from that point (a sketch follows below).
2. The new code path is enabled by the `DATABRICKS_ENABLE_EXPERIMENTAL_FILES_API_CLIENT` config parameter.
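
A minimal sketch of the resume-on-failure idea described in point 1 above. The helper name, the use of the `requests` library, and the HTTP `Range` header are illustrative assumptions; they are not necessarily how the SDK's experimental client is implemented.

```python
import requests

def download_with_resume(url: str, headers: dict, max_retries: int = 3) -> bytes:
    """Sketch: stream a download and, on a transient failure,
    re-issue the request starting from the current offset."""
    chunks, offset, retries = [], 0, 0
    while True:
        try:
            attempt_headers = dict(headers)
            if offset > 0:
                # Resume from where the previous attempt stopped
                # (assumes the server honors Range requests).
                attempt_headers["Range"] = f"bytes={offset}-"
            with requests.get(url, headers=attempt_headers, stream=True, timeout=60) as resp:
                resp.raise_for_status()
                for chunk in resp.iter_content(chunk_size=1024 * 1024):
                    chunks.append(chunk)
                    offset += len(chunk)  # track the current offset in the stream
            return b"".join(chunks)
        except (requests.ConnectionError, requests.Timeout):
            retries += 1
            if retries > max_retries:
                raise  # give up after repeated failures
```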

## How is this tested?

Added unit tests for the new code path:
`% python3 -m pytest tests/test_files.py`

---------

Signed-off-by: Kirill Safonov <kirill.safonov@databricks.com>
renaudhartert-db added a commit that referenced this issue Jan 20, 2025
### New Features and Improvements

 * Add `serving.http_request` to call external functions. ([#857](#857)).
 * Files API client: recover on download failures ([#844](#844)) ([#845](#845)).

### Bug Fixes

 * Properly pass query parameters in apps and oauth2 ([#862](#862)).

### Internal Changes

 * Add unit tests for external-browser authentication ([#863](#863)).
 * Decouple oauth2 and serving  ([#855](#855)).
 * Migrate workflows that need write access to use hosted runners ([#850](#850)).
 * Stop testing Python 3.7 on Ubuntu ([#858](#858)).

### API Changes:

 * Added [w.access_control](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/access_control.html) workspace-level service.
 * Added `http_request()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service.
 * Added `no_compute` field for `databricks.sdk.service.apps.CreateAppRequest`.
 * Added `has_more` field for `databricks.sdk.service.jobs.BaseJob`.
 * Added `has_more` field for `databricks.sdk.service.jobs.BaseRun`.
 * Added `page_token` field for `databricks.sdk.service.jobs.GetJobRequest`.
 * Added `has_more` and `next_page_token` fields for `databricks.sdk.service.jobs.Job`.
 * Added `has_more` field for `databricks.sdk.service.jobs.Run`.
 * Added `clean_rooms_notebook_output` field for `databricks.sdk.service.jobs.RunOutput`.
 * Added `scopes` field for `databricks.sdk.service.oauth2.UpdateCustomAppIntegration`.
 * Added `run_as` field for `databricks.sdk.service.pipelines.CreatePipeline`.
 * Added `run_as` field for `databricks.sdk.service.pipelines.EditPipeline`.
 * Added `authorization_details` and `endpoint_url` fields for `databricks.sdk.service.serving.DataPlaneInfo`.
 * Added `contents` field for `databricks.sdk.service.serving.GetOpenApiResponse`.
 * Added `activated`, `activation_url`, `authentication_type`, `cloud`, `comment`, `created_at`, `created_by`, `data_recipient_global_metastore_id`, `ip_access_list`, `metastore_id`, `name`, `owner`, `properties_kvpairs`, `region`, `sharing_code`, `tokens`, `updated_at` and `updated_by` fields for `databricks.sdk.service.sharing.RecipientInfo`.
 * Added `expiration_time` field for `databricks.sdk.service.sharing.RecipientInfo`.
 * Changed `update()` method for [a.account_federation_policy](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_federation_policy.html) account-level service with new required argument order.
 * Changed `update()` method for [a.service_principal_federation_policy](https://databricks-sdk-py.readthedocs.io/en/latest/account/service_principal_federation_policy.html) account-level service with new required argument order.
 * Changed `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service to return `databricks.sdk.service.sharing.RecipientInfo` dataclass.
 * Changed `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service return type to become non-empty.
 * Changed `get_open_api()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service return type to become non-empty.
 * Changed `patch()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service to return `databricks.sdk.service.serving.EndpointTags` dataclass.
 * Changed `databricks.sdk.service.serving.EndpointTagList` dataclass to.
 * Changed `collaborator_alias` field for `databricks.sdk.service.cleanrooms.CleanRoomCollaborator` to be required.
 * Changed `update_mask` field for `databricks.sdk.service.oauth2.UpdateAccountFederationPolicyRequest` to no longer be required.
 * Changed `update_mask` field for `databricks.sdk.service.oauth2.UpdateServicePrincipalFederationPolicyRequest` to no longer be required.
 * Changed `days_of_week` field for `databricks.sdk.service.pipelines.RestartWindow` to type `databricks.sdk.service.pipelines.DayOfWeekList` dataclass.
 * Changed `behavior` field for `databricks.sdk.service.serving.AiGatewayGuardrailPiiBehavior` to no longer be required.
 * Changed `project_id` and `region` fields for `databricks.sdk.service.serving.GoogleCloudVertexAiConfig` to be required.
 * Changed `workload_type` field for `databricks.sdk.service.serving.ServedEntityInput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass.
 * Changed `workload_type` field for `databricks.sdk.service.serving.ServedEntityOutput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass.
 * Changed `workload_type` field for `databricks.sdk.service.serving.ServedModelOutput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass.

OpenAPI SHA: 58905570a9928fc9ed31fba14a2edaf9a7c55b08, Date: 2025-01-20
github-merge-queue bot pushed a commit that referenced this issue Jan 20, 2025
@parthban-db added the Feature Request label Jan 31, 2025