Client.Timeout while syncing via DAB #1004
Seems to be similar to #701. The error likely comes from some system-level timeout on the machine you're using. Is there any chance it might be related to the DNS setup on that machine?
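One quick way to test the DNS hypothesis is to resolve the workspace host directly and see whether the lookup itself stalls or fails; a sketch, where <WORKSPACE_HOST> is a placeholder for your actual workspace hostname:

# Resolve the workspace host directly; a slow or failing lookup here
# points at DNS rather than at the CLI itself.
nslookup <WORKSPACE_HOST>

If the lookup takes tens of seconds, the client timeout may be consumed before the CLI ever reaches the API.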
Yes, I read that issue and tried to set the DNS accordingly, but I am still facing the same issue. Could it be something with the size of a file or the naming convention? Are there any restrictions there, or on the overall sync volume?
What auth type do you use? Azure SPN? Azure CLI? Azure MSI? You can see that when you add …
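For reference, a sketch of running the deploy with debug logging, which typically shows which auth type the client resolved; the --debug flag is an assumption about the CLI version in use:

# Assumed --debug flag; enables verbose logging, including auth resolution.
databricks bundle deploy -t dev --debug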
I used both Databricks SPN and Databricks PAT, as we are on AWS, not Azure.
Any updates?
We released a new CLI version (0.215.0) which contains the fix for the timeouts (databricks/databricks-sdk-go#837). Please upgrade and give it a try.
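A sketch of upgrading and verifying the version; the winget package id below is an assumption, so use whichever method you originally installed with:

# Example upgrade on Windows, assuming the CLI was installed via winget.
winget upgrade Databricks.DatabricksCLI
# Should report 0.215.0 or later.
databricks --version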
When trying to deploy a new bundle file via

databricks bundle deploy -t dev

and syncing my local files (a few hundred MB) to the workspace, I get the following error:

Error: Post "https:///api/2.0/workspace-files/import-file/Users%2F%<PATH_TO_FILE>overwrite=true": context deadline exceeded (Client.Timeout exceeded while awaiting headers)

The timeout appears after approx. 90 seconds, and not always with the same file.
I am on Windows 10, using Databricks CLI version 0.209.0.
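One way to separate the CLI from the network path would be to call the workspace API directly with an explicit client-side timeout and see whether headers arrive within the same ~90-second window; a sketch using curl, where <WORKSPACE_HOST> and <TOKEN> are placeholders and the endpoint is just an example workspace API call:

# Hypothetical probe with an explicit client-side cap on the whole request.
curl --max-time 90 -H "Authorization: Bearer <TOKEN>" "https://<WORKSPACE_HOST>/api/2.0/workspace/get-status?path=/Users"

If this also stalls, the problem sits below the CLI (proxy, VPN, DNS); if it returns quickly, the timeout is specific to the CLI/SDK client, which matches the fix referenced above.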