
epic: format generated code in code format (not plain text) #2913

Closed
xx88xx opened this issue May 16, 2024 · 10 comments

xx88xx commented May 16, 2024

Motivation

It seems all the other LLM frontends do this now, so it would be a very welcome convenience to have in Jan!

xx88xx added the type: epic (A major feature or initiative) label · May 16, 2024
Van-QA (Contributor) commented May 16, 2024

Hi @xx88xx, sorry, is this what you mean?
[screenshot: a formatted code block with a copy button]
The copy button copies the code in the same format that is displayed 🙏

Van-QA added the needs info (Not enough info, more logs/data required) label · May 16, 2024
mr-september commented

I also run into this issue sometimes. Does anyone know whether different models use different syntax for things like code blocks, the way we have multiple prompt templates/tokens?
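
If that's what's happening, it would explain the symptom: as far as I can tell, chat UIs like Jan only render text as code when the model emits standard Markdown triple-backtick fences, so anything else falls through as plain text. A minimal sketch of that kind of fence detection, purely my own illustration rather than Jan's actual code:

````typescript
// Illustrative only: a Markdown-style renderer treats text as code only when
// it sits between ``` fences; indented or differently-delimited code stays
// in the surrounding "text" block and shows up unformatted.
type Block = { type: "code" | "text"; body: string };

function splitIntoBlocks(message: string): Block[] {
  const blocks: Block[] = [];
  const fence = /```[\w-]*\n([\s\S]*?)```/g; // matches ```lang ... ```
  let last = 0;
  for (const m of message.matchAll(fence)) {
    if (m.index! > last) {
      blocks.push({ type: "text", body: message.slice(last, m.index) });
    }
    blocks.push({ type: "code", body: m[1] });
    last = m.index! + m[0].length;
  }
  if (last < message.length) {
    blocks.push({ type: "text", body: message.slice(last) });
  }
  return blocks;
}
````

Under that assumption, a model that emits indented code or some non-standard delimiter instead of ``` fences would come out as plain text even though the renderer itself is fine.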

xx88xx (Author) commented May 20, 2024

Hi @Van-QA!
Thanks for the response. Since posting this, I've noticed it works with the more popular models. I agree with @mr-september: sometimes it works as in the picture you attached, but often, especially with some of the more bespoke models, the code shows up unformatted. @mr-september might be onto something with the different syntax.

Van-QA (Contributor) commented May 22, 2024

Thank you. Can you share more details about the failing model and the steps to reproduce? @xx88xx

mr-september commented Jun 5, 2024

With some further testing/hindsight, I think I only experienced this with Llama 3 and its fine-tunes released in the first week or so, when there were issues with the special tokens, e.g.:

  1. Special tokens are not rendered correctly (as empty) -- llama3 specific? ggerganov/llama.cpp#6770
  2. https://www.reddit.com/r/LocalLLaMA/comments/1cltac3/part3_cause_to_issue_found_possible_bug_llama3/

I haven't run into it again recently, so it's likely some kind of syntax/formatting-specific thing, as previously suspected. No idea if this was also the case for @xx88xx?
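
If leaked special tokens were the culprit, one client-side mitigation would be stripping the known Llama 3 special tokens out of the text before it reaches the Markdown renderer. A rough sketch under that assumption (the token list is Llama 3's documented special tokens; the function and where you'd hook it in are hypothetical, not Jan's actual code):

````typescript
// Hypothetical cleanup step: a broken GGUF/tokenizer config can leak special
// tokens into the visible output, and a stray token butted up against a
// ``` fence can break code-block detection downstream.
const LLAMA3_SPECIAL_TOKENS = [
  "<|begin_of_text|>",
  "<|end_of_text|>",
  "<|start_header_id|>",
  "<|end_header_id|>",
  "<|eot_id|>",
];

function stripSpecialTokens(text: string): string {
  let out = text;
  for (const tok of LLAMA3_SPECIAL_TOKENS) {
    out = out.split(tok).join(""); // split/join avoids regex-escaping the token
  }
  return out;
}

// stripSpecialTokens("```python\nprint('hi')\n```<|eot_id|>")
//   -> "```python\nprint('hi')\n```"
````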

Realmbird (Contributor) commented

Can I work on this issue?

imtuyethan (Contributor) commented Jul 2, 2024

> Can I work on this issue?

YES PLEASE :"> @Realmbird

imtuyethan (Contributor) commented

Assigned @Van-QA to follow up when needed.

imtuyethan moved this to Icebox in Menlo · Jul 2, 2024
Van-QA assigned Realmbird and unassigned Van-QA · Jul 3, 2024
imtuyethan (Contributor) commented

Hey @Realmbird, how is it going so far?

imtuyethan added the type: feature request (A new feature) label and removed the needs info (Not enough info, more logs/data required) and type: epic (A major feature or initiative) labels · Aug 28, 2024
imtuyethan removed the status in Menlo · Aug 28, 2024
Realmbird removed their assignment · Aug 28, 2024
imtuyethan moved this to Icebox in Menlo · Sep 2, 2024
freelerobot (Contributor) commented

Works for me on v0.5.3+. Closing.
[screenshot: code block rendering with formatting in v0.5.3]

github-project-automation bot moved this from Icebox to Completed in Menlo · Sep 5, 2024
imtuyethan modified the milestone: v0.5.4 · Sep 11, 2024