Release v0.1.10 #181

Merged · 3 commits · Jun 15, 2023
11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,16 @@
# Version changelog

## 0.1.10

* Regenerate from OpenAPI spec ([#176](https://github.com/databricks/databricks-sdk-py/pull/176)).
* Added improved notebook-native authentication ([#152](https://github.com/databricks/databricks-sdk-py/pull/152)).
* Added methods to provide extra user agent and upstream user agent to SDK config ([#163](https://github.com/databricks/databricks-sdk-py/pull/163)).
* Added more missing `Optional` type hints ([#171](https://github.com/databricks/databricks-sdk-py/pull/171), [#177](https://github.com/databricks/databricks-sdk-py/pull/177)).
* Correctly serialize external entities ([#178](https://github.com/databricks/databricks-sdk-py/pull/178)).
* Correctly serialize external enum values in paths ([#179](https://github.com/databricks/databricks-sdk-py/pull/179)).
* Mark non-required fields as `Optional` ([#170](https://github.com/databricks/databricks-sdk-py/pull/170)).
* Synchronize auth permutation tests with Go SDK ([#165](https://github.com/databricks/databricks-sdk-py/pull/165)).
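The changelog doesn't show how the serialization fixes in #178/#179 were implemented, but the idea behind "correctly serialize external enum values in paths" can be sketched with a small stand-alone helper. `ObjectType` and `path_segment` below are illustrative stand-ins, not the SDK's actual code:

```python
from enum import Enum


class ObjectType(Enum):
    # Illustrative members; the real enum lives in databricks.sdk.service.workspace.
    NOTEBOOK = 'NOTEBOOK'
    DIRECTORY = 'DIRECTORY'


def path_segment(value) -> str:
    # Serialize enum members by their underlying value, not their repr, so a
    # request path contains 'NOTEBOOK' rather than 'ObjectType.NOTEBOOK'.
    if isinstance(value, Enum):
        return str(value.value)
    return str(value)
```

Without a check like this, naive string interpolation of an enum member would produce a URL segment such as `ObjectType.NOTEBOOK`, which the server would reject.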

## 0.1.9

* Added new services from OpenAPI spec ([#145](https://github.com/databricks/databricks-sdk-py/pull/145), [#159](https://github.com/databricks/databricks-sdk-py/pull/159)).
14 changes: 7 additions & 7 deletions databricks/sdk/mixins/workspace.py
@@ -1,8 +1,8 @@
from typing import BinaryIO, Iterator, Optional

from ..core import DatabricksError
-from ..service.workspace import (ExportFormat, Language, ObjectInfo,
-                                 ObjectType, WorkspaceAPI)
+from ..service.workspace import (ExportFormat, ImportFormat, Language,
+                                 ObjectInfo, ObjectType, WorkspaceAPI)


def _fqcn(x: any) -> str:
@@ -31,7 +31,7 @@ def upload(self,
path: str,
content: BinaryIO,
*,
-               format: Optional[ExportFormat] = None,
+               format: Optional[ImportFormat] = None,
language: Optional[Language] = None,
overwrite: Optional[bool] = False) -> None:
"""
@@ -44,17 +44,17 @@ def upload(self,

:param path: target location of the file on workspace.
:param content: file-like `io.BinaryIO` of the `path` contents.
-        :param format: By default, `ExportFormat.SOURCE`. If using `ExportFormat.AUTO` the `path`
+        :param format: By default, `ImportFormat.SOURCE`. If using `ImportFormat.AUTO` the `path`
is imported or exported as either a workspace file or a notebook, depending
on an analysis of the `item`’s extension and the header content provided in
the request. In addition, if the `path` is imported as a notebook, then
the `item`’s extension is automatically removed.
:param language: Only required if using `ExportFormat.SOURCE`.
"""
-        if format is not None and not isinstance(format, ExportFormat):
+        if format is not None and not isinstance(format, ImportFormat):
raise ValueError(
-                f'format is expected to be {_fqcn(ExportFormat)}, but got {_fqcn(format.__class__)}')
-        if (not format or format == ExportFormat.SOURCE) and not language:
+                f'format is expected to be {_fqcn(ImportFormat)}, but got {_fqcn(format.__class__)}')
+        if (not format or format == ImportFormat.SOURCE) and not language:
suffixes = {
'.py': Language.PYTHON,
'.sql': Language.SQL,
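The hunk above cuts off after the first two entries of the suffix table that `upload()` uses to infer a notebook's language when `format` is `SOURCE` and no `language` was passed. A self-contained sketch of that inference, reproducing only the `'.py'` and `'.sql'` entries visible in the diff (the real table may contain more):

```python
from enum import Enum
from pathlib import PurePosixPath
from typing import Optional


class Language(Enum):
    # Stand-in for databricks.sdk.service.workspace.Language.
    PYTHON = 'PYTHON'
    SQL = 'SQL'


# Suffix table as shown in the diff; further entries are cut off in the hunk.
_SUFFIXES = {
    '.py': Language.PYTHON,
    '.sql': Language.SQL,
}


def infer_language(path: str) -> Optional[Language]:
    # Map the workspace path's extension to a Language, as upload() does
    # before raising if no language can be determined.
    return _SUFFIXES.get(PurePosixPath(path).suffix.lower())
```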
2 changes: 1 addition & 1 deletion databricks/sdk/version.py
@@ -1 +1 @@
-__version__ = '0.1.9'
+__version__ = '0.1.10'
8 changes: 4 additions & 4 deletions tests/integration/test_workspace.py
@@ -1,6 +1,6 @@
import io

-from databricks.sdk.service.workspace import ExportFormat, Language
+from databricks.sdk.service.workspace import ImportFormat, Language


def test_workspace_recursive_list(w, random):
@@ -24,7 +24,7 @@ def test_workspace_upload_download_notebooks(w, random):
def test_workspace_upload_download_files(w, random):
py_file = f'/Users/{w.current_user.me().user_name}/file-{random(12)}.py'

-    w.workspace.upload(py_file, io.BytesIO(b'print(1)'), format=ExportFormat.AUTO)
+    w.workspace.upload(py_file, io.BytesIO(b'print(1)'), format=ImportFormat.AUTO)
with w.workspace.download(py_file) as f:
content = f.read()
assert content == b'print(1)'
@@ -35,7 +35,7 @@ def test_workspace_upload_download_files(w, random):
def test_workspace_upload_download_txt_files(w, random):
txt_file = f'/Users/{w.current_user.me().user_name}/txt-{random(12)}.txt'

-    w.workspace.upload(txt_file, io.BytesIO(b'print(1)'), format=ExportFormat.AUTO)
+    w.workspace.upload(txt_file, io.BytesIO(b'print(1)'), format=ImportFormat.AUTO)
with w.workspace.download(txt_file) as f:
content = f.read()
assert content == b'print(1)'
@@ -46,7 +46,7 @@ def test_workspace_upload_download_txt_files(w, random):
def test_workspace_upload_download_notebooks_no_extension(w, random):
nb = f'/Users/{w.current_user.me().user_name}/notebook-{random(12)}'

-    w.workspace.upload(nb, io.BytesIO(b'print(1)'), format=ExportFormat.SOURCE, language=Language.PYTHON)
+    w.workspace.upload(nb, io.BytesIO(b'print(1)'), format=ImportFormat.SOURCE, language=Language.PYTHON)
with w.workspace.download(nb) as f:
content = f.read()
assert content == b'# Databricks notebook source\nprint(1)'
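The behavioral change these tests exercise is the type guard in `upload()`: after this PR, passing an `ExportFormat` where an `ImportFormat` is expected raises `ValueError`. A stand-alone sketch of that guard, with local enum stand-ins for the real `databricks.sdk.service.workspace` classes:

```python
from enum import Enum


class ExportFormat(Enum):
    # Stand-in for the export-side enum (used by download).
    SOURCE = 'SOURCE'
    AUTO = 'AUTO'


class ImportFormat(Enum):
    # Stand-in for the import-side enum that upload() now expects.
    SOURCE = 'SOURCE'
    AUTO = 'AUTO'


def _fqcn(x) -> str:
    # Fully qualified class name, as in the SDK's error message.
    return f'{x.__module__}.{x.__name__}'


def check_format(format) -> None:
    # Mirrors the guard in the upload() hunk above: the format argument must
    # be an ImportFormat member, so the old, wrong ExportFormat is rejected.
    if format is not None and not isinstance(format, ImportFormat):
        raise ValueError(
            f'format is expected to be {_fqcn(ImportFormat)}, but got {_fqcn(format.__class__)}')
```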