Create chat-13B.bat #592
Conversation
Same script as chat-13B.sh, but for Windows users. Tested and working on Windows 10/11 v22H2.
Not sure if duplicating each example just for Windows is such a good idea.
Yes, that's why I'm working on a single batch script to run all the models. At least for now, it can help Windows users run the 13B model. Of course, it will have to be removed once it becomes redundant. In my opinion, making the llama.cpp project easy to run is at least as important as optimizing the core execution.
Script looks awesome and works great! 👍
The level of care taken and commenting is superb.
I like the way you can either edit the .bat script or override its settings via environment variables, e.g. `set MODEL=whatever` before running the script. I also like that it tries various paths for the model, and accepts the path as a command-line argument as well.
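The precedence scheme praised here (built-in default, overridden by an environment variable, overridden in turn by a command-line argument) can be sketched in POSIX shell, roughly in the style of chat-13B.sh. The model path and variable name below are illustrative assumptions, not the exact values from the .bat script:

```shell
#!/bin/sh
# Sketch of the override pattern: an environment variable, if set,
# wins over the built-in default, and a command-line argument wins
# over both. Paths here are illustrative, not taken from the PR.

# Use $MODEL from the environment if set, else fall back to a default.
MODEL="${MODEL:-./models/13B/ggml-model-q4_0.bin}"

# A command-line argument, if given, overrides the model path.
if [ -n "$1" ]; then
    MODEL="$1"
fi

echo "Using model: $MODEL"
```

Run as `MODEL=/path/to/model.bin ./chat.sh` to use the environment variable, or `./chat.sh /path/to/model.bin` to override it on the command line.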
This is definitely an improvement to current usability. Work on proper readmes and helper scripts like this has stalled a bit, as development is moving fast. These are very important for helping newcomers.
However, there are some minor changes I'd suggest.
It's ok for now to add new scripts and examples.
We will probably start pruning them in the future, but for now it is good
@anzz1 Merge this when you are OK with it
Thanks @anzz1 for your remarks and code improvements. I agree with everything!
Thank you for your contribution! 👍
The next step for me would be to create a single batch script where you choose the LLaMA model and the first context, then run it in interactive mode. My goal is to allow any Windows user to run the model in the easiest way possible.
Regards,
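The single launcher described above (pick a model, then start interactive mode) could look roughly like the following shell sketch. This is not the author's script: the model sizes, paths, and the `./main -i` invocation are assumptions based on llama.cpp conventions at the time:

```shell
#!/bin/sh
# Illustrative sketch of a single launcher script: the user picks a
# model size, the script resolves the path and would then start
# llama.cpp in interactive mode. All paths/flags are assumptions.

choice="${1:-13B}"   # model size, defaulting to 13B

case "$choice" in
    7B)  MODEL=./models/7B/ggml-model-q4_0.bin ;;
    13B) MODEL=./models/13B/ggml-model-q4_0.bin ;;
    *)   echo "unknown model size: $choice" >&2; exit 1 ;;
esac

# The real script would then exec something like: ./main -m "$MODEL" -i
echo "would launch interactive mode with $MODEL"
```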