**aniketmaurya** changed the title *propagate Exception from inference workers during streaming to main process* → *propagate Exception from inference workers to main process* on Jun 17, 2024
## 🐛 Bug

`Exception`s raised in inference workers are not propagated to the main process when using `OpenAISpec`. This results in a silent failure.

### Code sample
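The original code sample is not reproduced here, but the failure mode it demonstrates can be sketched with the stdlib alone (all names below are hypothetical stand-ins, not LitServe APIs): a streaming response commits its HTTP status before the inference generator runs, so an exception raised mid-stream can no longer change it.

```python
from typing import Iterator, List, Optional, Tuple


def predict() -> Iterator[str]:
    """Stand-in for an inference worker: fails after the first token."""
    yield "Hello"
    raise RuntimeError("inference worker crashed")


def streaming_response(
    gen: Iterator[str],
) -> Tuple[int, List[str], Optional[Exception]]:
    """Simulate a StreamingResponse: the 200 status is committed before
    the generator body is consumed, so a later exception cannot change it."""
    status = 200          # headers are flushed here, before any computation
    body: List[str] = []
    error: Optional[Exception] = None
    try:
        for chunk in gen:  # actual inference happens lazily, per chunk
            body.append(chunk)
    except Exception as exc:
        error = exc        # too late: the client already saw HTTP 200
    return status, body, error


status, body, error = streaming_response(predict())
print(status, body, type(error).__name__)
# → 200 ['Hello'] RuntimeError
```

The client receives HTTP 200 and a truncated body; the worker's `RuntimeError` is swallowed rather than surfaced as an error status, which is the silent failure described above.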
This server always returns an HTTP 200 response, because the FastAPI `StreamingResponse` is sent before any actual computation is performed.

### Expected behavior

Exceptions raised in inference workers should be propagated to the main process and surfaced to the client instead of failing silently.
### Environment
If you published a Studio with your bug report, we can automatically get this information. Otherwise, please describe:
- How you installed LitServe (`conda`, `pip`, source):

### Additional context