Hi @nto4, I think it is related to the bug described in #2971. Even if the input text length is okay, this issue can occur because the maximum number of output tokens was set incorrectly. It was fixed in #3086, and once that PR is merged this issue should be resolved.
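A minimal sketch of the kind of fix described above, using hypothetical names and values (`DECODER_CAPACITY`, `tokens_per_input` are illustrative assumptions, not the project's actual API): the estimated output-token budget is clamped to the decoder's real capacity instead of being passed through unchecked.

```python
# Hedged sketch (hypothetical values): clamp the requested number of
# output tokens to the decoder's actual capacity, rather than using a
# miscomputed estimate that can exceed it.

DECODER_CAPACITY = 1000  # hypothetical hard limit of the model


def max_output_tokens(input_len, tokens_per_input=3):
    # Rough estimate of output length from input length; without the
    # clamp below, long inputs could request more tokens than the
    # decoder supports and crash mid-generation.
    estimated = input_len * tokens_per_input
    # The fix: never ask the decoder for more than it supports.
    return min(estimated, DECODER_CAPACITY)


print(max_output_tokens(390))  # estimate 1170, clamped to 1000
```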
Describe the bug
There is a significant discrepancy between the 400-token limit the model is supposed to process and the length of text the model can actually handle. For example, here is the text I sent to the model:
"But I didn't. I ran tunnels till I tapped the Arctic Ocean, and I sunk shafts till I broke through the roof of perdition but those extensions turned up missing every time I am willing to sell all that property and throw in the improvements
o Mr Fall, in Boston
ELMIRA, N Y July 20, 1871
FRIEND FALL, Redpath tells me to blow up Here goes I wanted you to
scare Rondout off with a big price 125 ain't big I got 100 the first"
The text passes the 400-token check, but the model crashes afterwards.
Is this really a bug caused by a mismatch between the token count here and the capacity of the model, or am I misunderstanding something?
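A minimal sketch, with hypothetical names and numbers, of why an input-side token check can pass while generation still fails: the pre-check only bounds the *input* token count, while an autoregressive decoder's output length is a separate budget that must be capped independently (`MAX_INPUT_TOKENS`, `decode`, and the 3-frames-per-token ratio are illustrative assumptions, not the project's actual code).

```python
# Hedged sketch (hypothetical names): an input check that passes does not
# guarantee the decoder's output stays within its own capacity.

MAX_INPUT_TOKENS = 400      # the limit the pre-check enforces
MAX_OUTPUT_TOKENS = 1000    # hypothetical decoder capacity


def passes_input_check(tokens):
    """The kind of check described in the report: input length only."""
    return len(tokens) <= MAX_INPUT_TOKENS


def decode(tokens, max_output_tokens):
    """Toy stand-in for autoregressive decoding: the output length grows
    with the input but is not bounded by MAX_INPUT_TOKENS."""
    needed = len(tokens) * 3  # e.g. roughly 3 audio frames per text token
    if needed > max_output_tokens:
        raise RuntimeError(
            f"decoder needs {needed} output tokens, capacity is {max_output_tokens}"
        )
    return ["frame"] * needed


tokens = ["tok"] * 390          # passes the 400-token input check
assert passes_input_check(tokens)
try:
    decode(tokens, MAX_OUTPUT_TOKENS)
except RuntimeError as e:
    print("crash after the check passed:", e)
```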
To Reproduce
Expected behavior
Audio is generated without a crash.
Logs
Additional context
No response