
Versions after llama-master-cf348a6-bin-win-avx-x64 don't work! #1432

Closed
mingxing0769 opened this issue May 13, 2023 · 1 comment


@mingxing0769

Hello!

1. In the llama.cpp directory, I build main.exe directly with make. When it opens the model and I press Enter after the prompt, there is no response. How should I change the build to fix this?
2. Versions after llama-master-cf348a6-bin-win-avx-x64 report an error when loading the model.
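
For reference, a minimal sketch of the CMake build path on Windows, assuming CMake and a Visual Studio toolchain are installed (paths and options here are illustrative, not taken from this thread):

    REM run from the llama.cpp source directory
    mkdir build
    cd build
    cmake ..
    cmake --build . --config Release
    REM the resulting main.exe is typically placed under build\bin\Release\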
@Folko-Ven (Contributor) commented May 13, 2023

Hi. I think this is the same problem as #1423.
As a temporary workaround, use main.exe from Releases; it works fine.
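
If it helps, a typical invocation of the prebuilt main.exe from the Releases page looks like this (the model path and prompt below are placeholders, not from this thread):

    main.exe -m models\ggml-model-q4_0.bin -p "Hello" -n 128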
