
Conversation

awinml commented Dec 24, 2023

Related Issues:

fixes #1939

Proposed Changes:

Adds two new optional boolean parameters to InferenceClient.feature_extraction():

  • truncate: If set to True, truncates inputs longer than 512 tokens. Defaults to True.
  • normalize: If set to True, returned vectors are normalized to unit length. Defaults to True.

This enables passing these parameters to endpoints deployed with text-embeddings-inference (see the usage sketch below).
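
A minimal usage sketch, assuming this PR's truncate and normalize parameters are available and using a placeholder URL for a text-embeddings-inference endpoint:

```python
from huggingface_hub import InferenceClient

# Placeholder URL: assumes a text-embeddings-inference endpoint is deployed there.
client = InferenceClient(model="https://my-tei-endpoint.example.com")

# With the proposed parameters: over-long inputs are truncated and the
# returned embedding vector is normalized to unit length.
embedding = client.feature_extraction(
    "This is a sentence to embed.",
    truncate=True,
    normalize=True,
)

print(embedding.shape)  # e.g. (1, 768), depending on the embedding model
```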

Wauplin (Contributor) commented Jan 2, 2024

Thanks for opening a PR @awinml! The contribution is appreciated, but I'd prefer to delay it for now, as explained in #1939 (comment).

awinml (Author) commented May 3, 2024

Closing in favour of #2270.

awinml closed this May 3, 2024
