
Cannot import name 'BlobServiceClient' #13185

Closed
JavierLopezT opened this issue Dec 19, 2020 · 12 comments

@JavierLopezT (Contributor) commented Dec 19, 2020

I am using Airflow with docker-compose and a Dockerfile. I have just changed the Airflow version in the Dockerfile from 1.10.10 to 1.10.14, and now I get this import error in every file that involves Snowflake. I am not using Azure at all.
In one particular DAG I also get the error `An HTTP Client raised and unhandled exception: cannot import name 'BlobServiceClient'`.

Here is the full trace of the error:

Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/httpsession.py", line 263, in send
chunked=self._chunked(request.headers),
File "/home/airflow/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 677, in urlopen
chunked=chunked,
File "/home/airflow/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 381, in _make_request
self._validate_conn(conn)
File "/home/airflow/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 978, in _validate_conn
conn.connect()
File "/home/airflow/.local/lib/python3.6/site-packages/urllib3/connection.py", line 371, in connect
ssl_context=context,
File "/home/airflow/.local/lib/python3.6/site-packages/snowflake/connector/ssl_wrap_socket.py", line 74, in ssl_wrap_socket_with_ocsp
from .ocsp_asn1crypto import SnowflakeOCSPAsn1Crypto as SFOCSP
File "/home/airflow/.local/lib/python3.6/site-packages/snowflake/connector/__init__.py", line 17, in <module>
from .connection import SnowflakeConnection
File "/home/airflow/.local/lib/python3.6/site-packages/snowflake/connector/connection.py", line 48, in <module>
from .cursor import LOG_MAX_QUERY_LENGTH, SnowflakeCursor
File "/home/airflow/.local/lib/python3.6/site-packages/snowflake/connector/cursor.py", line 30, in <module>
from .file_transfer_agent import SnowflakeFileTransferAgent
File "/home/airflow/.local/lib/python3.6/site-packages/snowflake/connector/file_transfer_agent.py", line 22, in <module>
from .azure_util import SnowflakeAzureUtil
File "/home/airflow/.local/lib/python3.6/site-packages/snowflake/connector/azure_util.py", line 13, in <module>
from azure.storage.blob import BlobServiceClient, ContentSettings, ExponentialRetry
ImportError: cannot import name 'BlobServiceClient'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/dagbag.py", line 256, in process_file
m = imp.load_source(mod_name, filepath)
File "/usr/local/lib/python3.6/imp.py", line 172, in load_source
module = _load(spec)
File "<frozen importlib._bootstrap>", line 684, in _load
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/opt/airflow/dags/export_google_ads_buyers_list/dag_export_google_ads_buyers_list.py", line 10, in <module>
dag_globals = Variable.get('variables_secret', deserialize_json=True)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/variable.py", line 112, in get
var_val = get_variable(key=key)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/secrets/__init__.py", line 71, in get_variable
var_val = secrets_backend.get_variable(key=key)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/contrib/secrets/aws_secrets_manager.py", line 147, in get_variable
return self._get_secret(key)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/contrib/secrets/aws_secrets_manager.py", line 159, in _get_secret
SecretId=secret_id,
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/client.py", line 316, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/client.py", line 613, in _make_api_call
operation_model, request_dict, request_context)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/client.py", line 632, in _make_request
return self._endpoint.make_request(operation_model, request_dict)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/endpoint.py", line 137, in _send_request
success_response, exception):
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/endpoint.py", line 231, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/retryhandler.py", line 269, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/endpoint.py", line 200, in _do_get_response
http_response = self._send(request)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/endpoint.py", line 244, in _send
return self.http_session.send(request)
File "/home/airflow/.local/lib/python3.6/site-packages/botocore/httpsession.py", line 299, in send
raise HTTPClientError(error=e)
botocore.exceptions.HTTPClientError: An HTTP Client raised and unhandled exception: cannot import name 'BlobServiceClient'

Might be related to #13093.

@JavierLopezT added the kind:bug label Dec 19, 2020
@turbaszek (Member)

What version of the Snowflake connector do you use? I think I saw this error some time ago...

@JavierLopezT (Contributor, Author)

> What version of the Snowflake connector do you use? I think I saw this error some time ago...

2.3.7. I have downgraded it to 2.3.6 and now I don't get the error. Should I close the issue, or could it still be Airflow-related?
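That downgrade in command form, for reference (a sketch only; the pin could equally live in a requirements file or the Dockerfile):

```bash
# Pin the Snowflake connector to the last version that did not trigger the import error
pip install 'snowflake-connector-python==2.3.6'
```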

@HaloKo4 commented Dec 20, 2020

You probably have conflicting versions of azure-storage-blob.
See: https://github.com/apache/airflow/blob/master/setup.py#L395

snowflake-connector-python requires 'azure-storage-blob>=12.0.0,<13.0.0':
snowflakedb/snowflake-connector-python#225
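A quick way to check for this kind of conflict (a sketch, assuming pip is available inside the Airflow image):

```bash
# List the installed azure-storage* distributions to spot the legacy monolithic package
pip freeze | grep -i '^azure-storage'

# Ask pip to report broken requirements, e.g. an unsatisfied azure-storage-blob pin
pip check
```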

@mik-laj (Member) commented Dec 20, 2020

Related: #12508 #11968 #9745

@arbass22

What ended up working for me was explicitly uninstalling the older version of azure-storage:
`pip uninstall azure-storage`
The newer version is split into multiple packages, and it seems that because of this pip did not notice the conflicting versions.
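In command form, that fix looks roughly like this (a sketch; the --force-reinstall step and the version range are assumptions based on the constraint quoted above):

```bash
# Remove the legacy monolithic package that shadows the new azure.storage namespace
pip uninstall -y azure-storage

# Reinstall the split SDK so azure.storage.blob exposes BlobServiceClient again
pip install --force-reinstall 'azure-storage-blob>=12.0.0,<13.0.0'
```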

@eladkal (Contributor) commented Jan 25, 2021

@JavierLopezT I believe this issue is fixed by #13654 and #13286.

@feluelle (Member)

This is still an issue for me with the latest 1.10.x and the latest backport providers release.

@potiuk (Member) commented Feb 18, 2021

> This is still an issue for me with the latest 1.10.x and the latest backport providers release.

Hey @feluelle. You likely have the old azure-storage package installed (or it was installed in the past and removed without reinstalling the azure-storage-* packages). The problem is that there are two sets of Azure storage libraries on PyPI: the legacy azure-storage package and the new, split azure-storage-* packages.

Azure did a very, very bad thing here. Both libraries contain overlapping packages, and they can override each other when the legacy and new libraries are both installed. If you have azure-storage and azure-storage-blob, the set of classes available in azure.storage.blob depends on the SEQUENCE in which the two libraries were installed and whether they were installed with --force or not. Total mess.

So if you want to bring the situation under control, remove azure-storage and reinstall the azure-storage-* packages you have. That should help with the error. If you install 2.0.1+ from scratch, the problem is gone (we no longer have an azure-storage dependency). However, in 1.10 you need to install Airflow without the [azure] extra and then add apache-airflow-backport-providers-azure, or remove azure-storage and reinstall azure-storage-blob.
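A sketch of the 1.10 route described above (the provider distribution name is assumed to be the microsoft-azure backport package; pin versions as appropriate):

```bash
# Airflow 1.10.x: install without the [azure] extra, then add the Azure backport provider
pip install 'apache-airflow==1.10.14'
pip install apache-airflow-backport-providers-microsoft-azure

# Alternatively, clean up the conflicting packages in place
pip uninstall -y azure-storage
pip install --force-reinstall azure-storage-blob
```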

Let me know if either of those worked.

I hope this helps. Maybe someone could also describe it nicely somewhere in the docs ;)?

@potiuk (Member) commented Feb 18, 2021

For now I am closing this with the information that you need to clean it up in the way described above. I will re-open if I hear otherwise :)

@potiuk closed this as completed Feb 18, 2021
@feluelle (Member)

> However, in 1.10 you need to install Airflow without the [azure] extra and then add apache-airflow-backport-providers-azure, or remove azure-storage and reinstall azure-storage-blob.

Why can't we ship the fix in a 1.10.x release? :)

I don't have azure as an explicit dependency, so I would have to remove azure-storage and reinstall azure-storage-blob. Can this be automated somehow? :D I mean adding something to the requirements. Or do I have to write bash for that one?

@potiuk (Member) commented Feb 18, 2021

> Why can't we ship the fix in a 1.10.x release? :)

We could, potentially. But it would require #12188 to be backported (and possibly a few other PRs). I am sure @ephraimbuddy would not be so happy, because I think that was one of the most "troublesome" changes he went through ;)

Honestly, it is much simpler via backports, even if it requires some hassle with the installed packages.

> I don't have azure as an explicit dependency, so I would have to remove azure-storage and reinstall azure-storage-blob. Can this be automated somehow? :D I mean adding something to the requirements. Or do I have to write bash for that one?

We could potentially add azure-storage==0.37.0. One "good" thing the Azure team did there was to add a 0.37.0 version of the deprecated package that removes the "azure.storage" folder. BUT this is again tricky, because depending on the order of installation we can first install azure-storage-blob, then azure-storage==0.36.0, then azure-storage==0.37.0, and poof! The whole azure.storage directory is gone.
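To make the ordering hazard concrete, the sequence described above would look like this (hypothetical, each step harmless on its own):

```bash
pip install azure-storage-blob        # new split SDK provides azure.storage.blob
pip install 'azure-storage==0.36.0'   # legacy monolith overwrites the azure/storage directory
pip install 'azure-storage==0.37.0'   # "empty" release removes azure/storage on upgrade, taking the blob module with it
```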

It would have to be an external script. I think it's worth adding. Such a snippet of `pip uninstall && pip install` would be nice, but I'd say it should be in the documentation. We really want to limit our investment in 1.10, and if we can document the workaround and tell people "but in 2.0 you do not have to do it, so migrate", that is the best we can do.
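The sort of documented snippet being discussed could be as small as this (a sketch; other azure-storage-* packages may also need reinstalling depending on the environment):

```bash
# Workaround for 1.10.x environments that ended up with the legacy azure-storage package
pip uninstall -y azure-storage && pip install --force-reinstall azure-storage-blob
```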

@feluelle (Member) commented Feb 19, 2021

> It would have to be an external script. I think it's worth adding. Such a snippet of `pip uninstall && pip install` would be nice, but I'd say it should be in the documentation. We really want to limit our investment in 1.10, and if we can document the workaround and tell people "but in 2.0 you do not have to do it, so migrate", that is the best we can do.

SGTM 👍 and thanks for elaborating :)
