Python3 script instead of bash #184
I made a CMD script, but Python is more sensible considering it's already a requirement.

@(
SETLOCAL EnableDelayedExpansion
ECHO ---------------------------------------------------------------
ECHO convert and quantize facebook LLaMA models for use by llama.cpp
ECHO ---------------------------------------------------------------
REM directory containing original facebook models
set "SRC=Y:\LLaMA"
REM directory containing ggml model files - both f16 and q4
set "DEST=."
REM free disk space by deleting ggml f16 models after quantization
REM set DELETE_F16=1
ECHO Starting ... This could take a couple hours! ...
REM todo: quantize in parallel
REM stop if any model files exist, DO NOT OVERWRITE
IF NOT EXIST "!DEST!\llama-7B\ggml-model*.bin*" (
    python ..\convert-pth-to-ggml.py !SRC!\7B\ 1
    md !DEST!\llama-7B 2> NUL
    move !SRC!\7B\ggml-model-f16.bin !DEST!\llama-7B\ggml-model.bin
    quantize !DEST!\llama-7B\ggml-model.bin !DEST!\llama-7B\ggml-model-q4_0.bin 2
    IF DEFINED DELETE_F16 del !DEST!\llama-7B\ggml-model.bin
) ELSE (
    ECHO remove model files from "!DEST!\llama-7B" to re-generate.
    DIR /B "!DEST!\llama-7B\ggml-model*.bin*"
    ECHO ---------------------------------------------------------
)
IF NOT EXIST "!DEST!\llama-13B\ggml-model*.bin*" (
    python ..\convert-pth-to-ggml.py !SRC!\13B\ 1
    md !DEST!\llama-13B 2> NUL
    move !SRC!\13B\ggml-model-f16.bin !DEST!\llama-13B\ggml-model.bin
    move !SRC!\13B\ggml-model-f16.bin.1 !DEST!\llama-13B\ggml-model.bin.1
    quantize !DEST!\llama-13B\ggml-model.bin !DEST!\llama-13B\ggml-model-q4_0.bin 2
    quantize !DEST!\llama-13B\ggml-model.bin.1 !DEST!\llama-13B\ggml-model-q4_0.bin.1 2
    IF DEFINED DELETE_F16 del !DEST!\llama-13B\ggml-model.bin*
) ELSE (
    ECHO remove model files from "!DEST!\llama-13B" to re-generate.
    DIR /B "!DEST!\llama-13B\ggml-model*.bin*"
    ECHO ---------------------------------------------------------
)
IF NOT EXIST "!DEST!\llama-30B\ggml-model*.bin*" (
    python ..\convert-pth-to-ggml.py !SRC!\30B\ 1
    md !DEST!\llama-30B 2> NUL
    move !SRC!\30B\ggml-model-f16.bin !DEST!\llama-30B\ggml-model.bin
    move !SRC!\30B\ggml-model-f16.bin.1 !DEST!\llama-30B\ggml-model.bin.1
    move !SRC!\30B\ggml-model-f16.bin.2 !DEST!\llama-30B\ggml-model.bin.2
    move !SRC!\30B\ggml-model-f16.bin.3 !DEST!\llama-30B\ggml-model.bin.3
    quantize !DEST!\llama-30B\ggml-model.bin !DEST!\llama-30B\ggml-model-q4_0.bin 2
    quantize !DEST!\llama-30B\ggml-model.bin.1 !DEST!\llama-30B\ggml-model-q4_0.bin.1 2
    quantize !DEST!\llama-30B\ggml-model.bin.2 !DEST!\llama-30B\ggml-model-q4_0.bin.2 2
    quantize !DEST!\llama-30B\ggml-model.bin.3 !DEST!\llama-30B\ggml-model-q4_0.bin.3 2
    IF DEFINED DELETE_F16 del !DEST!\llama-30B\ggml-model.bin*
) ELSE (
    ECHO remove model files from "!DEST!\llama-30B" to re-generate.
    DIR /B "!DEST!\llama-30B\ggml-model*.bin*"
    ECHO ---------------------------------------------------------
)
IF NOT EXIST "!DEST!\llama-65B\ggml-model*.bin*" (
    python ..\convert-pth-to-ggml.py !SRC!\65B\ 1
    md !DEST!\llama-65B 2> NUL
    move !SRC!\65B\ggml-model-f16.bin !DEST!\llama-65B\ggml-model.bin
    move !SRC!\65B\ggml-model-f16.bin.1 !DEST!\llama-65B\ggml-model.bin.1
    move !SRC!\65B\ggml-model-f16.bin.2 !DEST!\llama-65B\ggml-model.bin.2
    move !SRC!\65B\ggml-model-f16.bin.3 !DEST!\llama-65B\ggml-model.bin.3
    move !SRC!\65B\ggml-model-f16.bin.4 !DEST!\llama-65B\ggml-model.bin.4
    move !SRC!\65B\ggml-model-f16.bin.5 !DEST!\llama-65B\ggml-model.bin.5
    move !SRC!\65B\ggml-model-f16.bin.6 !DEST!\llama-65B\ggml-model.bin.6
    move !SRC!\65B\ggml-model-f16.bin.7 !DEST!\llama-65B\ggml-model.bin.7
    quantize !DEST!\llama-65B\ggml-model.bin !DEST!\llama-65B\ggml-model-q4_0.bin 2
    quantize !DEST!\llama-65B\ggml-model.bin.1 !DEST!\llama-65B\ggml-model-q4_0.bin.1 2
    quantize !DEST!\llama-65B\ggml-model.bin.2 !DEST!\llama-65B\ggml-model-q4_0.bin.2 2
    quantize !DEST!\llama-65B\ggml-model.bin.3 !DEST!\llama-65B\ggml-model-q4_0.bin.3 2
    quantize !DEST!\llama-65B\ggml-model.bin.4 !DEST!\llama-65B\ggml-model-q4_0.bin.4 2
    quantize !DEST!\llama-65B\ggml-model.bin.5 !DEST!\llama-65B\ggml-model-q4_0.bin.5 2
    quantize !DEST!\llama-65B\ggml-model.bin.6 !DEST!\llama-65B\ggml-model-q4_0.bin.6 2
    quantize !DEST!\llama-65B\ggml-model.bin.7 !DEST!\llama-65B\ggml-model-q4_0.bin.7 2
    IF DEFINED DELETE_F16 del !DEST!\llama-65B\ggml-model.bin*
) ELSE (
    ECHO remove model files from "!DEST!\llama-65B" to re-generate.
    DIR /B "!DEST!\llama-65B\ggml-model*.bin*"
    ECHO ---------------------------------------------------------
)
)

Sometimes I just use the tool where I'm at - not the brightest choice. 😃
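Since the issue title asks for a Python3 script, here is a rough sketch of what the same conversion/quantization loop could look like in Python. It is not a drop-in replacement: the SRC/DEST paths, the shard counts per model size, and the assumption that `..\convert-pth-to-ggml.py` and the `quantize` binary are reachable from the working directory are simply carried over from the batch script above and may need adjusting.

```python
#!/usr/bin/env python3
"""Sketch: convert and quantize the facebook LLaMA models for llama.cpp.

Assumes the same layout as the batch script above: original models under SRC,
output under DEST, ..\convert-pth-to-ggml.py relative to the working
directory, and a `quantize` binary on PATH.
"""
import shutil
import subprocess
from pathlib import Path

SRC = Path(r"Y:\LLaMA")   # directory containing the original facebook models
DEST = Path(".")          # directory that will hold the ggml f16 and q4_0 files
DELETE_F16 = False        # set True to delete the f16 files after quantization

# number of f16 shard files the converter produces per model size
PARTS = {"7B": 1, "13B": 2, "30B": 4, "65B": 8}

for name, n_parts in PARTS.items():
    out_dir = DEST / f"llama-{name}"
    # stop if any model files exist, DO NOT OVERWRITE
    if out_dir.is_dir() and any(out_dir.glob("ggml-model*.bin*")):
        print(f'remove model files from "{out_dir}" to re-generate.')
        continue

    # convert the original .pth checkpoints to ggml f16
    subprocess.run(["python", r"..\convert-pth-to-ggml.py", str(SRC / name), "1"], check=True)
    out_dir.mkdir(parents=True, exist_ok=True)

    for i in range(n_parts):
        suffix = "" if i == 0 else f".{i}"
        f16 = out_dir / f"ggml-model.bin{suffix}"
        q4 = out_dir / f"ggml-model-q4_0.bin{suffix}"
        shutil.move(str(SRC / name / f"ggml-model-f16.bin{suffix}"), str(f16))
        # quantize to q4_0 (mode 2, as in the batch script)
        subprocess.run(["quantize", str(f16), str(q4), "2"], check=True)
        if DELETE_F16:
            f16.unlink()
```

Passing argument lists to subprocess.run also avoids the quoting problems that shell-based invocations run into when paths contain spaces.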
Consider using something like
This change replaces the use of os.system in the Python script with subprocess.run, improving script security by preventing potential command injection issues in file names.
I hope this can help you!
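For illustration only (this is not the exact code from that change; the file name and the quantize invocation are hypothetical), the difference looks roughly like this:

```python
import subprocess

# hypothetical file name containing spaces and shell metacharacters
fname = "ggml model (v1).bin"

# os.system pastes the name into a shell command line, so spaces, quotes, or
# characters like ; and & can break the command or inject extra commands:
#   os.system(f"quantize {fname} {fname}.q4_0 2")

# subprocess.run passes the arguments directly to the program as a list,
# so the file name is never interpreted by a shell:
subprocess.run(["quantize", fname, f"{fname}.q4_0", "2"], check=True)
```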
Already done in #222