epic: format generated code in code format (not plain text) #2913
Comments
Hi @xx88xx, sorry, is this what you mean?
I also run into this issue sometimes. Does anyone know whether different models use different syntax for things like code blocks, the way we have multiple prompt templates/tokens?
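For context on the question above: most chat UIs detect Markdown fenced code blocks (triple backticks, optionally with a language tag) in the model's raw output and render them as code; if a model emits a non-standard fence, the renderer falls back to plain text. A minimal sketch of such detection follows; this is an illustration only, not Jan's actual implementation, and the function name is hypothetical:

```python
import re

# Matches Markdown fenced code blocks: ``` followed by an optional
# language tag, then the body, then a closing ``` fence.
FENCE_RE = re.compile(r"```(\w*)\n(.*?)```", re.DOTALL)

def extract_code_blocks(text: str) -> list[tuple[str, str]]:
    """Return (language, code) pairs for every fenced block in `text`.

    Blocks without a language tag are reported as 'plain'.
    """
    return [(m.group(1) or "plain", m.group(2)) for m in FENCE_RE.finditer(text)]

reply = "Here you go:\n```python\nprint('hi')\n```\nDone."
print(extract_code_blocks(reply))  # [('python', "print('hi')\n")]
```

A model that wraps code in anything other than standard fences (or omits the closing fence) would produce no matches here, which is consistent with the plain-text rendering described in this issue.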
Hi @Van-QA!
Thank you. Can you share more details about the failing model and the steps to reproduce? @xx88xx
With some further testing and hindsight, I think I only experienced this when using Llama3 and fine-tunes released in the first week or so, when there were issues with the tokens, e.g.
Recently I haven't run into it again, so it's likely some kind of syntax/formatting-specific thing, as previously suspected. No idea if this was also the case for @xx88xx?
Can I work on this issue?
YES PLEASE :"> @Realmbird
Assigned @Van-QA to follow up when needed.
Hey @Realmbird, how is it going so far?
Motivation
It seems all the other LLM apps do this now, so it would be a very useful comfort to have in Jan!