
Unable to use "instruct" mode when using the API to send prompts to an LLM #6360

Open
aaron13100 opened this issue Sep 2, 2024 · 0 comments
Labels
bug Something isn't working


Describe the bug

When using the API to interact with the text-generation-webui, I am unable to get the LLM to operate in "instruct" mode. The responses I receive are consistent with "chat" or "chat-instruct" mode, even when I explicitly try to set the mode to "instruct". This prevents me from using the LLM in a way that directly follows instructions without engaging in conversation.

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

  1. Set up text-generation-webui with the Meta-Llama-3.1-8B-Instruct-Q8_0.gguf model.
  2. Use the API to send the following sequence of prompts:
    a. "translate everything I say to french."
    b. "I like cake."
  3. Observe that the response to the second prompt is conversational rather than a direct French translation.
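For reference, the reproduction above can be sketched as a call to the web UI's OpenAI-compatible chat-completions endpoint. This is an illustrative sketch, not the exact script used: the URL/port and the extra `mode` field (the non-standard parameter text-generation-webui accepts to select "instruct", "chat", or "chat-instruct") are assumptions based on the default local setup.

```python
# Sketch of the reproduction via the OpenAI-compatible API that
# text-generation-webui exposes (URL/port assumed to be the default).
import json
import urllib.request

API_URL = "http://127.0.0.1:5000/v1/chat/completions"  # assumed default

def build_payload(history, mode="instruct"):
    """Build a chat-completions request body.

    "mode" is text-generation-webui's extra parameter; setting it to
    "instruct" is expected to disable the conversational chat behavior.
    """
    return {
        "messages": history,
        "mode": mode,  # "instruct" | "chat" | "chat-instruct"
        "max_tokens": 200,
    }

def send(payload):
    """POST the payload and return the parsed JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

history = [
    {"role": "user", "content": "translate everything I say to french."},
    {"role": "user", "content": "I like cake."},
]
payload = build_payload(history)
# send(payload)  # despite mode="instruct", replies come back conversational
```

Even with `"mode": "instruct"` in the payload, the responses behave as described in step 3.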

Expected behavior: In "instruct" mode, the response to "I like cake." should be a direct translation such as "J'aime le gâteau."
Actual behavior: The LLM responds with conversational text about liking cake, indicating it's in "chat" or "chat-instruct" mode.

Screenshot

No response

Logs

no errors

System Info

MacOS 14.6.1 (23G93)
Apple M3 Max
aaron13100 added the bug (Something isn't working) label on Sep 2, 2024