Add support for UC Volumes to the `databricks fs` commands #1209
Usage:
  databricks fs [command]

Available Commands:
  cat         Show file content
  cp          Copy files and directories to and from DBFS.
  ls          Lists files
  mkdir       Make directories
  rm          Remove files and directories from dbfs.

Flags:
  -h, --help   help for fs

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)

Use "databricks fs [command] --help" for more information about a command.
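With this change the `fs` commands accept UC Volumes paths in addition to DBFS paths. A usage sketch is below; the catalog, schema, and volume names are placeholders, not names from this PR:

```shell
# List files in a UC Volume (path shape: dbfs:/Volumes/<catalog>/<schema>/<volume>/...).
databricks fs ls dbfs:/Volumes/main/my_schema/my_volume/

# Copy a local file into the volume.
databricks fs cp ./report.csv dbfs:/Volumes/main/my_schema/my_volume/report.csv

# The same commands keep working against plain DBFS paths.
databricks fs ls dbfs:/tmp/
```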
	return FileDoesNotExistError{absPath}
}

// This API returns 409 if the underlying path is a directory.
if aerr.StatusCode == http.StatusConflict {
The API now allows deleting empty directories
Codecov Report (Attention):

@@            Coverage Diff             @@
##             main    #1209      +/-   ##
==========================================
- Coverage   52.60%   52.05%   -0.55%
==========================================
  Files         308      308
  Lines       17223    17470     +247
==========================================
+ Hits         9060     9094      +34
- Misses       7495     7706     +211
- Partials      668      670       +2
@@ -152,9 +152,6 @@ func newCpCommand() *cobra.Command {
	cmd.RunE = func(cmd *cobra.Command, args []string) error {
		ctx := cmd.Context()

// TODO: Error if a user uses '\' as path separator on windows when "file"
The underlying issue is closed now.
@@ -65,6 +94,3 @@ func TestAccFsCatDoesNotSupportOutputModeJson(t *testing.T) {
	_, _, err = RequireErrorRun(t, "fs", "cat", "dbfs:"+path.Join(tmpDir, "hello.txt"), "--output=json")
	assert.ErrorContains(t, err, "json output not supported")
}

// TODO: Add test asserting an error when cat is called on an directory. Need this to be
Added TestAccFsCatOnADir for this TODO.
}

func (w *FilesClient) Mkdir(ctx context.Context, name string) error {
	// Directories are created implicitly.
This change allows empty directories to be created, something that was not possible before.
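The "implicit directories" model behind that comment can be sketched with a toy blob store: a directory exists only because some file path passes through it, so without an explicit create-directory call, `Mkdir` on its own could not produce an *empty* directory. All names here are hypothetical, written for illustration only:

```go
package main

import (
	"fmt"
	"strings"
)

// blobFS sketches a blob store with implicit directories. Before the
// Files API gained an explicit create-directory call, Mkdir could only
// be a no-op, so empty directories could not be created.
type blobFS struct {
	files map[string]bool // full file paths
	dirs  map[string]bool // explicitly created (possibly empty) directories
}

func newBlobFS() *blobFS {
	return &blobFS{files: map[string]bool{}, dirs: map[string]bool{}}
}

// Write records a file; every ancestor directory exists implicitly.
func (fs *blobFS) Write(path string) { fs.files[path] = true }

// Mkdir records the directory explicitly, so it exists even when empty.
func (fs *blobFS) Mkdir(path string) { fs.dirs[path] = true }

// DirExists reports whether a directory exists, either explicitly or
// because some file lives underneath it.
func (fs *blobFS) DirExists(dir string) bool {
	if fs.dirs[dir] {
		return true
	}
	for f := range fs.files {
		if strings.HasPrefix(f, dir+"/") {
			return true
		}
	}
	return false
}

func main() {
	fs := newBlobFS()
	fs.Mkdir("empty-dir")
	fmt.Println(fs.DirExists("empty-dir")) // prints "true"
}
```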
Overall looks good; see the comments on concurrent execution. Also, could you trigger an integration test run for this PR to make sure all tests pass correctly?
	return err
}

// This API returns a 404 if the file doesn't exist.
Suggested change:
- // This API returns a 404 if the file doesn't exist.
+ // This API returns a 404 if the directory doesn't exist.
LGTM for the errors returned by the Files API.
Leaving the review of the actual code to the team.
CLI:
* Add support for UC Volumes to the `databricks fs` commands ([#1209](#1209)).

Bundles:
* Use dynamic configuration model in bundles ([#1098](#1098)).
* Allow use of variables references in primitive non-string fields ([#1219](#1219)).
* Add an experimental default-sql template ([#1051](#1051)).
* Add an experimental dbt-sql template ([#1059](#1059)).

Internal:
* Add fork-user to winget release workflow ([#1214](#1214)).
* Use `any` as type for data sources and resources in `tf/schema` ([#1216](#1216)).
* Avoid infinite recursion when normalizing a recursive type ([#1213](#1213)).
* Fix issue where interpolating a new ref would rewrite unrelated fields ([#1217](#1217)).
* Use `dyn.Value` as input to generating Terraform JSON ([#1218](#1218)).

API Changes:
* Changed `databricks lakehouse-monitors update` command with new required argument order.
* Added `databricks online-tables` command group.

OpenAPI commit cdd76a98a4fca7008572b3a94427566dd286c63b (2024-02-19)

Dependency updates:
* Bump Terraform provider to v1.36.2 ([#1215](#1215)).
* Bump github.com/databricks/databricks-sdk-go from 0.32.0 to 0.33.0 ([#1222](#1222)).
Changes
This PR adds support for UC Volumes to the `fs` commands. The `fs` commands for UC Volumes work the same as they currently do for DBFS. This is ensured by running the same test matrix across both the DBFS and UC Volumes versions of the `fs` commands.
Tests
Support for UC Volumes is tested by running the same tests as we originally did for the DBFS commands. The tests require a `main` catalog to exist in the workspace, which it does in our test workspace environments that have the `TEST_METASTORE_ID` environment variable set. For the Files API filer, we do the same by running mostly common tests to ensure the filers for "local", "wsfs", "dbfs", and "files API" are consistent.
All tests run in parallel to reduce the total runtime. To keep the tests isolated from each other, each test creates its own UC schema (for UC Volumes tests) or DBFS directory (for DBFS tests).
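The isolation strategy above can be sketched as follows. `randomName` is a hypothetical helper written for illustration; the actual test suite uses its own naming helper:

```go
package main

import (
	"fmt"
	"math/rand"
)

// randomName sketches per-test namespace generation: each parallel test
// derives a unique schema (or directory) name so tests never share state.
// (Hypothetical helper, not the suite's actual implementation.)
func randomName(prefix string) string {
	const letters = "abcdefghijklmnopqrstuvwxyz"
	b := make([]byte, 8)
	for i := range b {
		b[i] = letters[rand.Intn(len(letters))]
	}
	return prefix + string(b)
}

func main() {
	// In a real test: call t.Parallel(), create a schema with this name,
	// and drop it in t.Cleanup so parallel tests never collide.
	fmt.Println(randomName("test-schema-"))
}
```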