What is the bug?
We changed to an async HTTP client in 2.14 (#1839). Testing with 2.14 RC4, we found that the predict result for a text embedding model is not the same as in 2.13.
How can one reproduce the bug?
Steps to reproduce the behavior:
The issue is caused by the refactor of the HTTP client from sync to async. With the sync HTTP client and a user-defined preprocess function, a list of string inputs is processed sequentially: the user script always picks up the first element of the list, and the list shrinks as each prediction completes.
With the async HTTP client, we need to calculate the total number of chunks before sending any request to the remote model endpoint, but the code doesn't handle the user-defined-script case well: the whole input list is treated as a single batch request to the remote model. So when a user-defined script is present, we need to calculate the chunks differently. This PR fixes the issue: #2418
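The distinction above can be sketched as follows. This is a minimal illustration, not the actual ml-commons code: the class and method names are hypothetical, and it only shows the idea that with a user-defined preprocess script each input element must become its own request (one chunk per element), while without one the inputs can go out as a single batch request.

```java
import java.util.List;

public class ChunkCalculator {
    // Hypothetical sketch: a user-defined preprocess script consumes one
    // input element at a time, so each element needs its own remote request
    // (chunk count == input size). Without a script, the inputs can be
    // batched into a single request (chunk count == 1).
    static int calculateChunks(List<String> inputs, boolean hasUserDefinedScript) {
        if (inputs == null || inputs.isEmpty()) {
            return 0;
        }
        return hasUserDefinedScript ? inputs.size() : 1;
    }

    public static void main(String[] args) {
        List<String> docs = List.of("doc one", "doc two", "doc three");
        System.out.println(calculateChunks(docs, true));   // one chunk per document: 3
        System.out.println(calculateChunks(docs, false));  // single batched request: 1
    }
}
```

The pre-fix behavior corresponds to always taking the batch branch even when a user script is present, which is why the 2.14 embedding results diverged from 2.13.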
2.13 result
2.14 result
What is the expected behavior?
We should not break backward compatibility (BWC): 2.14 should return the same result as 2.13.
What is your host/environment?
Do you have any screenshots?
If applicable, add screenshots to help explain your problem.
Do you have any additional context?
Add any other context about the problem.