I have been hosting an Image-to-Text pipeline with this model for a while. The Inference API widget worked well until recently, when a developer reported that the widget always cut the response short. When they ran the model locally with the same example, the output was complete. I re-tried previously successful examples and found that they all returned the same truncated results.
I did not update the model weights or configs before trying to fix this issue. Please check the issue for more details.
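For context, a local run along these lines produces the complete, untruncated caption. This is a hedged sketch, not the reporter's exact script: the model id and image path are placeholders, and `max_new_tokens=800` mirrors the length cap the widget is expected to honor.

```python
def caption_locally(image_path, model_id="<user>/<model>", max_new_tokens=800):
    """Caption an image locally with an explicit generation-length cap.

    model_id and image_path are placeholders; max_new_tokens=800 mirrors
    the max_length value discussed in this issue.
    """
    from transformers import pipeline  # lazy import: heavy optional dependency

    captioner = pipeline("image-to-text", model=model_id)
    out = captioner(image_path, max_new_tokens=max_new_tokens)
    return out[0]["generated_text"]


if __name__ == "__main__":
    print(caption_locally("example.jpg"))
```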
These are all the commits I made after reading the issue:
Following the advice mentioned in the issue, I first tried adding:

```yaml
inference:
  parameters:
    max_length: 800
```

to the model card, but it didn't work.
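One more variant that may be worth trying (an assumption on my part, not something confirmed in the issue): in `transformers` generation, `max_length` counts prompt tokens plus generated tokens, while `max_new_tokens` counts only generated tokens, so the widget might honor the latter where the former is ignored:

```yaml
inference:
  parameters:
    max_new_tokens: 800  # assumption: the widget may honor this where max_length is ignored
```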
Then I guessed that the encoder config might be confusing the API, so I tried setting `encoder.max_length=800`. That still didn't fix the problem, so I reverted the change.
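Concretely, that (now reverted) edit amounted to the following change to `config.json`, shown here on an in-memory stand-in; only `encoder.max_length` is from this issue, the surrounding keys are assumed for illustration:

```python
import json

# In-memory stand-in for the repo's config.json; only encoder.max_length
# is from this issue, the other keys are assumed for illustration.
cfg = {
    "model_type": "vision-encoder-decoder",
    "encoder": {"max_length": 20},
}

# The edit that was tried (and later reverted):
cfg["encoder"]["max_length"] = 800

print(json.dumps(cfg["encoder"]))  # {"max_length": 800}
```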
My current guess is that an Inference API bug is causing the truncated responses.