Can't import any of the HuggingFaceEmbeddings because 'openssl' has no attribute 'ciphers' #27624

Open
suvraadeep opened this issue Oct 24, 2024 · 1 comment
Labels
investigate (Flagged for investigation)

Comments

@suvraadeep

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_community.embeddings import HuggingFaceBgeEmbeddings

model_name = "BAAI/bge-small-en"
model_kwargs = {"device": "cpu"}
encode_kwargs = {"normalize_embeddings": True}
hf = HuggingFaceBgeEmbeddings(
    model_name=model_name, model_kwargs=model_kwargs, encode_kwargs=encode_kwargs
)

embedding = hf.embed_query("hi this is harrison")
embedding

or

from langchain_huggingface import HuggingFaceEmbeddings
embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")

Error Message and Stack Trace (if applicable)

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1764, in _LazyModule._get_module(self, module_name)
   1763 try:
-> 1764     return importlib.import_module("." + module_name, self.__name__)
   1765 except Exception as e:

File /opt/conda/lib/python3.10/importlib/__init__.py:126, in import_module(name, package)
    125         level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)

File <frozen importlib._bootstrap>:1050, in _gcd_import(name, package, level)

File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)

File <frozen importlib._bootstrap>:1006, in _find_and_load_unlocked(name, import_)

File <frozen importlib._bootstrap>:688, in _load_unlocked(spec)

File <frozen importlib._bootstrap_external>:883, in exec_module(self, module)

File <frozen importlib._bootstrap>:241, in _call_with_frames_removed(f, *args, **kwds)

File /opt/conda/lib/python3.10/site-packages/transformers/modeling_tf_utils.py:34
     33 import numpy as np
---> 34 import tensorflow as tf
     35 from packaging.version import parse

File /opt/conda/lib/python3.10/site-packages/tensorflow/__init__.py:45
     43 _tf2.enable()
---> 45 from tensorflow._api.v2 import __internal__
     46 from tensorflow._api.v2 import __operators__

File /opt/conda/lib/python3.10/site-packages/tensorflow/_api/v2/__internal__/__init__.py:11
     10 from tensorflow._api.v2.__internal__ import dispatch
---> 11 from tensorflow._api.v2.__internal__ import distribute
     12 from tensorflow._api.v2.__internal__ import eager_context

File /opt/conda/lib/python3.10/site-packages/tensorflow/_api/v2/__internal__/distribute/__init__.py:8
      6 import sys as _sys
----> 8 from tensorflow._api.v2.__internal__.distribute import combinations
      9 from tensorflow._api.v2.__internal__.distribute import interim

File /opt/conda/lib/python3.10/site-packages/tensorflow/_api/v2/__internal__/distribute/combinations/__init__.py:8
      6 import sys as _sys
----> 8 from tensorflow.python.distribute.combinations import env # line: 456
      9 from tensorflow.python.distribute.combinations import generate # line: 365

File /opt/conda/lib/python3.10/site-packages/tensorflow/python/distribute/combinations.py:33
     32 from tensorflow.python.client import session
---> 33 from tensorflow.python.distribute import collective_all_reduce_strategy
     34 from tensorflow.python.distribute import distribute_lib

File /opt/conda/lib/python3.10/site-packages/tensorflow/python/distribute/collective_all_reduce_strategy.py:32
     31 from tensorflow.python.distribute import input_util
---> 32 from tensorflow.python.distribute import mirrored_strategy
     33 from tensorflow.python.distribute import multi_worker_util

File /opt/conda/lib/python3.10/site-packages/tensorflow/python/distribute/mirrored_strategy.py:34
     33 from tensorflow.python.distribute import values_util
---> 34 from tensorflow.python.distribute.cluster_resolver import tfconfig_cluster_resolver
     35 from tensorflow.python.distribute.v1 import input_lib as input_lib_v1

File /opt/conda/lib/python3.10/site-packages/tensorflow/python/distribute/cluster_resolver/__init__.py:27
     26 from tensorflow.python.distribute.cluster_resolver.cluster_resolver import UnionClusterResolver
---> 27 from tensorflow.python.distribute.cluster_resolver.gce_cluster_resolver import GCEClusterResolver
     28 from tensorflow.python.distribute.cluster_resolver.kubernetes_cluster_resolver import KubernetesClusterResolver

File /opt/conda/lib/python3.10/site-packages/tensorflow/python/distribute/cluster_resolver/gce_cluster_resolver.py:24
     23 try:
---> 24   from googleapiclient import discovery  # pylint: disable=g-import-not-at-top
     25   from oauth2client.client import GoogleCredentials  # pylint: disable=g-import-not-at-top

File /opt/conda/lib/python3.10/site-packages/googleapiclient/discovery.py:45
     44 from google.auth.transport import mtls
---> 45 from google.oauth2 import service_account
     47 # Third-party imports

File /opt/conda/lib/python3.10/site-packages/google/oauth2/service_account.py:77
     76 from google.auth import _helpers
---> 77 from google.auth import _service_account_info
     78 from google.auth import credentials

File /opt/conda/lib/python3.10/site-packages/google/auth/_service_account_info.py:20
     18 import json
---> 20 from google.auth import crypt
     21 from google.auth import exceptions

File /opt/conda/lib/python3.10/site-packages/google/auth/crypt/__init__.py:41
     40 from google.auth.crypt import base
---> 41 from google.auth.crypt import rsa
     43 try:

File /opt/conda/lib/python3.10/site-packages/google/auth/crypt/rsa.py:20
     18 try:
     19     # Prefer cryptograph-based RSA implementation.
---> 20     from google.auth.crypt import _cryptography_rsa
     22     RSASigner = _cryptography_rsa.RSASigner

File /opt/conda/lib/python3.10/site-packages/google/auth/crypt/_cryptography_rsa.py:25
     24 from cryptography.hazmat.primitives import hashes
---> 25 from cryptography.hazmat.primitives import serialization
     26 from cryptography.hazmat.primitives.asymmetric import padding

File /opt/conda/lib/python3.10/site-packages/cryptography/hazmat/primitives/serialization/__init__.py:25
     17 from cryptography.hazmat.primitives.serialization.base import (
     18     load_der_parameters,
     19     load_der_private_key,
   (...)
     23     load_pem_public_key,
     24 )
---> 25 from cryptography.hazmat.primitives.serialization.ssh import (
     26     SSHCertificate,
     27     SSHCertificateBuilder,
     28     SSHCertificateType,
     29     SSHCertPrivateKeyTypes,
     30     SSHCertPublicKeyTypes,
     31     SSHPrivateKeyTypes,
     32     SSHPublicKeyTypes,
     33     load_ssh_private_key,
     34     load_ssh_public_identity,
     35     load_ssh_public_key,
     36 )
     38 __all__ = [
     39     "BestAvailableEncryption",
     40     "Encoding",
   (...)
     62     "load_ssh_public_key",
     63 ]

File /opt/conda/lib/python3.10/site-packages/cryptography/hazmat/primitives/serialization/ssh.py:27
     26 from cryptography.hazmat.primitives.asymmetric import utils as asym_utils
---> 27 from cryptography.hazmat.primitives.ciphers import (
     28     AEADDecryptionContext,
     29     Cipher,
     30     algorithms,
     31     modes,
     32 )
     33 from cryptography.hazmat.primitives.serialization import (
     34     Encoding,
     35     KeySerializationEncryption,
   (...)
     39     _KeySerializationEncryption,
     40 )

File /opt/conda/lib/python3.10/site-packages/cryptography/hazmat/primitives/ciphers/__init__.py:11
      7 from cryptography.hazmat.primitives._cipheralgorithm import (
      8     BlockCipherAlgorithm,
      9     CipherAlgorithm,
     10 )
---> 11 from cryptography.hazmat.primitives.ciphers.base import (
     12     AEADCipherContext,
     13     AEADDecryptionContext,
     14     AEADEncryptionContext,
     15     Cipher,
     16     CipherContext,
     17 )
     19 __all__ = [
     20     "AEADCipherContext",
     21     "AEADDecryptionContext",
   (...)
     26     "CipherContext",
     27 ]

File /opt/conda/lib/python3.10/site-packages/cryptography/hazmat/primitives/ciphers/base.py:143
    133 _CIPHER_TYPE = Cipher[
    134     typing.Union[
    135         modes.ModeWithNonce,
   (...)
    140     ]
    141 ]
--> 143 CipherContext.register(rust_openssl.ciphers.CipherContext)
    144 AEADEncryptionContext.register(rust_openssl.ciphers.AEADEncryptionContext)

AttributeError: module 'openssl' has no attribute 'ciphers'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1764, in _LazyModule._get_module(self, module_name)
   1763 try:
-> 1764     return importlib.import_module("." + module_name, self.__name__)
   1765 except Exception as e:

File /opt/conda/lib/python3.10/importlib/__init__.py:126, in import_module(name, package)
    125         level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)

File <frozen importlib._bootstrap>:1050, in _gcd_import(name, package, level)

File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)

File <frozen importlib._bootstrap>:1006, in _find_and_load_unlocked(name, import_)

File <frozen importlib._bootstrap>:688, in _load_unlocked(spec)

File <frozen importlib._bootstrap_external>:883, in exec_module(self, module)

File <frozen importlib._bootstrap>:241, in _call_with_frames_removed(f, *args, **kwds)

File /opt/conda/lib/python3.10/site-packages/transformers/integrations/integration_utils.py:36
     34 import packaging.version
---> 36 from .. import PreTrainedModel, TFPreTrainedModel
     37 from .. import __version__ as version

File <frozen importlib._bootstrap>:1075, in _handle_fromlist(module, fromlist, import_, recursive)

File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1754, in _LazyModule.__getattr__(self, name)
   1753 elif name in self._class_to_module.keys():
-> 1754     module = self._get_module(self._class_to_module[name])
   1755     value = getattr(module, name)

File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1766, in _LazyModule._get_module(self, module_name)
   1765 except Exception as e:
-> 1766     raise RuntimeError(
   1767         f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
   1768         f" traceback):\n{e}"
   1769     ) from e

RuntimeError: Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
module 'openssl' has no attribute 'ciphers'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
Cell In[5], line 9
      7 model_kwargs = {"device": "cpu"}
      8 encode_kwargs = {"normalize_embeddings": True}
----> 9 hf = HuggingFaceBgeEmbeddings(
     10     model_name=model_name, model_kwargs=model_kwargs, encode_kwargs=encode_kwargs
     11 )
     13 embedding = hf.embed_query("hi this is harrison")
     14 embedding

File /opt/conda/lib/python3.10/site-packages/langchain_community/embeddings/huggingface.py:320, in HuggingFaceBgeEmbeddings.__init__(self, **kwargs)
    310     warn_deprecated(
    311         since=since,
    312         removal=removal,
   (...)
    316         + f" {self.__class__.__name__} constructor instead.",
    317     )
    319 try:
--> 320     import sentence_transformers
    322 except ImportError as exc:
    323     raise ImportError(
    324         "Could not import sentence_transformers python package. "
    325         "Please install it with `pip install sentence_transformers`."
    326     ) from exc

File /opt/conda/lib/python3.10/site-packages/sentence_transformers/__init__.py:10
      7 import os
      9 from sentence_transformers.backend import export_dynamic_quantized_onnx_model, export_optimized_onnx_model
---> 10 from sentence_transformers.cross_encoder.CrossEncoder import CrossEncoder
     11 from sentence_transformers.datasets import ParallelSentencesDataset, SentencesDataset
     12 from sentence_transformers.LoggingHandler import LoggingHandler

File /opt/conda/lib/python3.10/site-packages/sentence_transformers/cross_encoder/__init__.py:3
      1 from __future__ import annotations
----> 3 from .CrossEncoder import CrossEncoder
      5 __all__ = ["CrossEncoder"]

File /opt/conda/lib/python3.10/site-packages/sentence_transformers/cross_encoder/CrossEncoder.py:20
     18 from sentence_transformers.evaluation.SentenceEvaluator import SentenceEvaluator
     19 from sentence_transformers.readers import InputExample
---> 20 from sentence_transformers.SentenceTransformer import SentenceTransformer
     21 from sentence_transformers.util import fullname, get_device_name, import_from_string
     23 logger = logging.getLogger(__name__)

File /opt/conda/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py:32
     29 from transformers import is_torch_npu_available
     30 from transformers.dynamic_module_utils import get_class_from_dynamic_module, get_relative_import_files
---> 32 from sentence_transformers.model_card import SentenceTransformerModelCardData, generate_model_card
     33 from sentence_transformers.similarity_functions import SimilarityFunction
     35 from . import __MODEL_HUB_ORGANIZATION__, __version__

File /opt/conda/lib/python3.10/site-packages/sentence_transformers/model_card.py:25
     23 from tqdm.autonotebook import tqdm
     24 from transformers import TrainerCallback
---> 25 from transformers.integrations import CodeCarbonCallback
     26 from transformers.modelcard import make_markdown_table
     27 from transformers.trainer_callback import TrainerControl, TrainerState

File <frozen importlib._bootstrap>:1075, in _handle_fromlist(module, fromlist, import_, recursive)

File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1754, in _LazyModule.__getattr__(self, name)
   1752     value = Placeholder
   1753 elif name in self._class_to_module.keys():
-> 1754     module = self._get_module(self._class_to_module[name])
   1755     value = getattr(module, name)
   1756 else:

File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1766, in _LazyModule._get_module(self, module_name)
   1764     return importlib.import_module("." + module_name, self.__name__)
   1765 except Exception as e:
-> 1766     raise RuntimeError(
   1767         f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
   1768         f" traceback):\n{e}"
   1769     ) from e

RuntimeError: Failed to import transformers.integrations.integration_utils because of the following error (look up to see its traceback):
Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
module 'openssl' has no attribute 'ciphers'

Failed to import transformers.integrations.integration_utils because of the following error (look up to see its traceback): Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback): module 'openssl' has no attribute 'ciphers'
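
Judging by the traceback, the AttributeError is raised inside the cryptography package itself (while registering rust_openssl.ciphers.CipherContext in cryptography/hazmat/primitives/ciphers/base.py), before any LangChain code runs. A minimal sketch to confirm that, assuming the same conda environment, is to import that module directly with no LangChain or transformers in the picture:

# Hypothetical isolation check: if the cryptography install is broken,
# this import alone should reproduce "module 'openssl' has no attribute 'ciphers'".
from cryptography.hazmat.primitives.ciphers.base import CipherContext

print("cryptography ciphers import OK:", CipherContext)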

Description

I can't import HuggingFaceBgeEmbeddings or HuggingFaceEmbeddings for any of the available models.
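
The failing import chain only runs through transformers' TensorFlow backend (modeling_tf_utils → tensorflow → google-auth → cryptography). As a hedged workaround sketch, assuming TensorFlow is not actually needed for these embeddings, setting transformers' USE_TF environment variable to "0" before anything imports transformers should keep that path from loading at all; this only sidesteps the broken cryptography install rather than fixing it:

import os
os.environ["USE_TF"] = "0"  # must be set before transformers is imported anywhere

from langchain_huggingface import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
print(len(embeddings.embed_query("hi this is harrison")))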

System Info

System Information

OS: Linux
OS Version: #1 SMP Thu Jun 27 20:43:36 UTC 2024
Python Version: 3.10.14 | packaged by conda-forge | (main, Mar 20 2024, 12:45:18) [GCC 12.3.0]

Package Information

langchain_core: 0.3.12
langchain: 0.3.4
langchain_community: 0.3.3
langsmith: 0.1.137
langchain_cohere: 0.3.1
langchain_experimental: 0.3.2
langchain_groq: 0.2.0
langchain_openai: 0.2.3
langchain_text_splitters: 0.3.0

Optional packages not installed

langgraph
langserve

Other Dependencies

aiohttp: 3.9.5
async-timeout: 4.0.3
cohere: 5.11.1
dataclasses-json: 0.6.7
groq: 0.11.0
httpx: 0.27.0
jsonpatch: 1.33
numpy: 1.26.4
openai: 1.52.2
orjson: 3.10.4
packaging: 24.1
pandas: 2.2.3
pydantic: 2.9.2
pydantic-settings: 2.6.0
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
SQLAlchemy: 2.0.36
tabulate: 0.9.0
tenacity: 9.0.0
tiktoken: 0.7.0
typing-extensions: 4.12.2

@langcarl langcarl bot added the investigate label on Oct 24, 2024
@giuliaguglielmi123

I've been trying to replicate your error without success: I can correctly extract the vector embedding with no error. However, I noticed that you are on Python 3.10 whereas mine is 3.12.3.

Besides, why does it try to import transformers? When I tried to replicate the issue it used sentence_transformers instead. Did you install it?
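
For reference, the transformers import is expected: sentence_transformers.model_card imports transformers.integrations, which lazily loads modeling_tf_utils because TensorFlow is present in the reporter's environment (presumably it is not installed where the error didn't reproduce). A small sketch for comparing the two environments; the forced reinstall at the end is an assumption based on the usual cause of this error, not a confirmed fix:

from importlib import metadata

# Versions of the packages the traceback actually passes through; a stale or
# mismatched cryptography install is a common cause of
# "module 'openssl' has no attribute 'ciphers'".
for pkg in ("cryptography", "pyOpenSSL", "tensorflow", "transformers", "sentence-transformers"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")

# If cryptography looks stale or mismatched, a forced reinstall often clears
# the broken Rust bindings (assumption; run inside the same conda env):
#   pip install --upgrade --force-reinstall cryptography pyOpenSSL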
