Missing spaces #46
Comments
@dongs0104: this code is working for me: triton-inference-server/tensorrtllm_backend#332 (comment). Duplicate of #30.
I have converted Mixtral to TensorRT and I am trying to use your repository to integrate it with the OpenAI API.
I'm using the template history_template_llama3.liquid. When I run your example code for interacting with the model (openai_completion.py and openai_completion_stream.py), the response text is missing the spaces between words.
If I query Triton directly via the HTTP protocol, I receive the following response to the same request:
"text_output":"to the moon and back.\n\nThe story begins with a young boy named Neil Armstrong who loved to explore and learn about the world around him. He was fascinated by the stars and the moon and dreamed of one day going to space"
How do I get all the spaces back, as with the HTTP protocol?
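A common cause of this symptom (not confirmed for this repo, but consistent with the linked tensorrtllm_backend discussion) is decoding each streamed token ID individually: SentencePiece-style tokenizers encode word boundaries as a leading marker ("▁"), which per-token decoding strips. Decoding the *cumulative* ID sequence each step and emitting only the new suffix preserves the spaces. The sketch below uses a toy vocabulary to stand in for the real tokenizer; in practice you would replace `decode` with `tokenizer.decode` from the model's tokenizer.

```python
# Toy vocabulary mimicking SentencePiece tokens, where "▁" marks
# a word boundary (hypothetical stand-in for the real tokenizer).
VOCAB = {1: "▁to", 2: "▁the", 3: "▁moon", 4: "▁and", 5: "▁back", 6: "."}

def decode(ids):
    # Mimic SentencePiece detokenization: join pieces, turn the
    # boundary marker into a space, strip the leading space.
    return "".join(VOCAB[i] for i in ids).replace("▁", " ").lstrip()

def stream_decode(token_ids):
    """Yield text deltas by decoding the cumulative ID list each step,
    so word-boundary spaces survive across chunk boundaries."""
    emitted = ""
    seen = []
    for tid in token_ids:
        seen.append(tid)
        full = decode(seen)       # decode everything seen so far
        delta = full[len(emitted):]  # only the newly produced suffix
        emitted = full
        yield delta

# Per-token decoding would yield "to", "the", "moon", ... with no
# spaces; the cumulative diff keeps them:
print("".join(stream_decode([1, 2, 3, 4, 5, 6])))
# -> to the moon and back.
```

The trade-off is re-decoding the growing sequence on every token; keeping only a sliding window of recent IDs bounds that cost if sequences get long.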