Hello, I'm encountering an issue when trying to use a dedicated inference endpoint on Hugging Face. I'm not sure where the error is, as I triple-checked that I am indeed passing a list of Documents to the component. The embedder initializes just fine, but when I run it, I receive a 400 Bad Request error.

Not sure if this is an issue with Haystack or Hugging Face, so I would appreciate any guidance!
Hi @hodge-jai I tried the code example from our documentation page here: https://docs.haystack.deepset.ai/docs/huggingfaceapidocumentembedder#using-free-serverless-inference-api and it worked for me:

```python
from haystack.components.embedders import HuggingFaceAPIDocumentEmbedder
from haystack.utils import Secret
from haystack.dataclasses import Document

doc = Document(content="I love pizza!")

document_embedder = HuggingFaceAPIDocumentEmbedder(
    api_type="serverless_inference_api",
    api_params={"model": "BAAI/bge-small-en-v1.5"},
    token=Secret.from_token("<your-api-key>"),
)

result = document_embedder.run([doc])
print(result["documents"][0].embedding)
# [0.017020374536514282, -0.023255806416273117, ...]
```

(I just had to rename doc_embedder to document_embedder, for which I created an issue here.)
I only realized my issue a bit later and it was purely user error 🤦

I had to configure the Task option properly in the Hugging Face paid inference endpoint settings panel so that it matched what I was passing to the endpoint. One doesn't need to change this setting when using api_type="serverless_inference_api".
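For anyone hitting the same 400 error: when using a dedicated (paid) endpoint, Haystack's HuggingFaceAPIDocumentEmbedder is pointed at it via api_type="inference_endpoints" plus the endpoint URL. A minimal sketch, where the URL and token are placeholders and the snippet assumes the endpoint's Task has been set to an embedding/feature-extraction task in the settings panel (it won't run without a live endpoint):

```python
from haystack.components.embedders import HuggingFaceAPIDocumentEmbedder
from haystack.utils import Secret
from haystack.dataclasses import Document

# For a dedicated endpoint, the Task configured in the Hugging Face endpoint
# settings panel must match what the embedder sends; a mismatch is what
# produced the 400 Bad Request above.
document_embedder = HuggingFaceAPIDocumentEmbedder(
    api_type="inference_endpoints",
    api_params={"url": "<your-endpoint-url>"},   # placeholder endpoint URL
    token=Secret.from_token("<your-api-key>"),   # placeholder token
)

result = document_embedder.run([Document(content="I love pizza!")])
print(result["documents"][0].embedding)
```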
Thanks for your reply!