Unable to enter Chinese prompt #646
Comments
Have you tried this: https://github.com/ymcui/Chinese-LLaMA-Alpaca?
Yes, this is the model I use. I ran the command in Git Bash and it seems to work fine there?
Let me provide more information about this issue; I hope it helps to solve the problem. @LainNya found out that it seems the … This is the detailed cause and solution I found, but I can't tell for sure since I'm not familiar with C++. I'm not sure whether this should be chalked up to a Windows terminal compatibility issue with C++. Anyway, use …
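As a minimal diagnostic sketch (not code from llama.cpp, and assuming a Windows build), the following shows what `std::getline` actually receives from the Windows console; the byte dump makes the "empty line" symptom visible:

```cpp
#include <windows.h>
#include <cstdio>
#include <iostream>
#include <string>

int main() {
    // Report the active console code pages; on many Windows systems these are
    // legacy code pages (e.g. 936 for Simplified Chinese), not UTF-8 (65001).
    std::printf("input code page: %u, output code page: %u\n",
                GetConsoleCP(), GetConsoleOutputCP());

    std::string line;
    std::getline(std::cin, line);

    // Dump the raw bytes std::getline produced. For CJK input this can come
    // back empty or mangled, which matches the symptom described in this issue.
    std::printf("got %zu bytes:", line.size());
    for (unsigned char c : line) {
        std::printf(" %02X", c);
    }
    std::printf("\n");
    return 0;
}
```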
I modified the implementation of the getline part and rewrote a simple replacement for it. Here is the repo: https://github.com/josStorer/llama.cpp-unicode-windows
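For reference, here is a minimal sketch of the general approach such a rewrite can take on Windows: read the line through the wide-character console API and convert it to UTF-8. This is illustrative only, not the implementation from the repository above, and `read_console_line_utf8` is a hypothetical helper name:

```cpp
#include <windows.h>
#include <string>

// Read one line from the console as UTF-16 via ReadConsoleW, then convert it
// to UTF-8 so the rest of the program can keep working with std::string.
static std::string read_console_line_utf8() {
    HANDLE hIn = GetStdHandle(STD_INPUT_HANDLE);
    wchar_t wbuf[4096];
    DWORD nread = 0;
    if (!ReadConsoleW(hIn, wbuf, 4096, &nread, nullptr)) {
        return std::string();
    }
    // Strip the trailing CR/LF that ReadConsoleW leaves in the buffer.
    while (nread > 0 && (wbuf[nread - 1] == L'\r' || wbuf[nread - 1] == L'\n')) {
        --nread;
    }
    if (nread == 0) {
        return std::string();
    }
    // Convert UTF-16 to UTF-8.
    int len = WideCharToMultiByte(CP_UTF8, 0, wbuf, (int) nread,
                                  nullptr, 0, nullptr, nullptr);
    std::string out(len, '\0');
    WideCharToMultiByte(CP_UTF8, 0, wbuf, (int) nread,
                        &out[0], len, nullptr, nullptr);
    return out;
}
```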
This problem with Unicode characters had been fixed for me in PR #420, but I can confirm that it is currently not possible to input or display UTF-8 characters at the moment.
#840 has been merged. Try pulling the latest master and please test whether this fixed your issue.
LGTM after replacing the binaries with the latest ones from https://github.com/ggerganov/llama.cpp/releases/tag/master-aaf3b23.
Hi! I am using main.exe compiled under Windows. When I type a Chinese prompt, the model seems unable to understand it; while debugging I found that std::getline(std::cin, line) returns empty lines. I then tried Japanese, with the same result.

(Since I am a native Chinese speaker, this question was translated by DeepL)