Using WSL-based Docker, running the llama.cpp container and loading the quantized Chinese-Alpaca-Plus model, the terminal keeps outputting carriage returns after Chinese input #1649
Comments
Does this only happen with the Docker image? It doesn't happen if you compile it yourself or use one of the official binaries? Either way, I believe I've seen this before: either the Docker image isn't configured to use the Chinese locale, or it doesn't contain that locale at all. I think the first is more likely. Can you try setting LC_ALL to "zh-CN.UTF-8" and see if it fixes the problem? You can set the environment variable at runtime with docker run -e, for example:

docker run -e LC_ALL=zh-CN.UTF-8 -v /path/to/models:/models ghcr.io/ggerganov/llama.cpp:full --all-in-one "/models/" 7B

docker run -e LC_ALL=zh-CN.UTF-8 -v /path/to/models:/models ghcr.io/ggerganov/llama.cpp:full --run -m /models/7B/ggml-model-q4_0.bin -p "Building a website can be done in 10 simple steps:" -n 512

docker run -e LC_ALL=zh-CN.UTF-8 -v /path/to/models:/models ghcr.io/ggerganov/llama.cpp:light -m /models/7B/ggml-model-q4_0.bin -p "Building a website can be done in 10 simple steps:" -n 512
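To tell which of the two causes applies, it can help to list the locales that are actually present in the image before changing anything. A minimal sketch, assuming the ghcr.io/ggerganov/llama.cpp:full image is Ubuntu-based and ships /bin/bash (not verified here):

```sh
# List the locales compiled into the container image.
# If zh_CN.utf8 does not appear, setting LC_ALL alone will not be enough --
# the locale has to be generated inside the image first.
docker run --rm --entrypoint /bin/bash ghcr.io/ggerganov/llama.cpp:full -c "locale -a"

# Check what the container actually picks up when LC_ALL is passed at runtime.
# A warning such as "Cannot set LC_ALL to default locale" means the locale is missing.
docker run --rm -e LC_ALL=zh_CN.UTF-8 --entrypoint /bin/bash ghcr.io/ggerganov/llama.cpp:full -c "locale"
```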
Thank you very much. After installing the corresponding language pack, I tested both the C.utf8 and zh_CN.utf8 character sets, and they both work properly.
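For reference, installing the language pack inside an Ubuntu-based container typically looks like the following. This is a hedged sketch of the general approach, not the exact commands the reporter used, and it assumes an apt-based image:

```sh
# Inside the container, or as a RUN step in a derived Dockerfile:
apt-get update
apt-get install -y locales language-pack-zh-hans   # language-pack-zh-hans is Ubuntu-specific
locale-gen zh_CN.UTF-8                             # compile the zh_CN UTF-8 locale
update-locale LANG=zh_CN.UTF-8 LC_ALL=zh_CN.UTF-8  # make it the system default

# Verify the locale is now available:
locale -a | grep zh_CN
```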
You aren't the first with this issue. Are you able to provide the steps you followed to fix this so I can include it in the documentation?
Of course. I am still organizing and testing; afterwards I will put together a document on loading Chinese models with llama.cpp in a container environment, using Chinese-Alpaca-Plus as an example.
Prerequisites
Please answer the following questions for yourself before submitting an issue.
Expected Behavior
Normal conversation in Chinese
The same operation works normally in plain WSL; I don't know what is missing in the WSL-based Docker container.
Current Behavior
The Docker container environment has been configured to support Chinese character sets and the user input configuration file has been modified, so the bash terminal itself can accept and display Chinese input.
However, when using WSL-based Docker to run the llama.cpp container and load the quantized Chinese-Alpaca-Plus model, the terminal keeps outputting carriage returns after Chinese input.
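The issue does not spell out which configuration file was modified; a typical setup matching this description (an assumption on my part, not taken from the report) would be exporting the locale in the shell startup file inside the container:

```sh
# ~/.bashrc inside the container -- hypothetical example of the kind of
# "user input configuration" change described above.
export LANG=zh_CN.UTF-8
export LC_ALL=zh_CN.UTF-8

# Readline may also need to be told to pass non-ASCII input through unmodified:
# ~/.inputrc
#   set input-meta on
#   set output-meta on
#   set convert-meta off
```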
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.
Docker container based on WSL
Failure Information (for bugs)
Using WSL-based Docker to run the llama.cpp container and load the quantized Chinese-Alpaca-Plus model, the terminal keeps outputting carriage returns after Chinese input.
Steps to Reproduce
Docs:
https://github.com/ymcui/Chinese-LLaMA-Alpaca/wiki/llama.cpp%E9%87%8F%E5%8C%96%E9%83%A8%E7%BD%B2
https://github.com/ggerganov/llama.cpp#docker
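Putting the linked docs together, a reproduction attempt would look roughly like the following. This is a sketch under assumptions: the model directory, quantized file name, and Chinese prompt are illustrative and not taken from the issue.

```sh
# Run the quantized Chinese-Alpaca-Plus model interactively inside the container,
# forwarding the Chinese locale to the process. Paths and file names are hypothetical.
docker run -it --rm \
  -e LC_ALL=zh_CN.UTF-8 \
  -v /path/to/models:/models \
  ghcr.io/ggerganov/llama.cpp:light \
  -m /models/chinese-alpaca-plus-7b/ggml-model-q4_0.bin \
  --color -i \
  -p "你好，请用中文介绍一下你自己。"
```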