Ability to take in a config file as initial prompt #141


Closed
MLTQ opened this issue Mar 14, 2023 · 4 comments
Labels
enhancement New feature or request

Comments

@MLTQ

MLTQ commented Mar 14, 2023

Following on from the "Store preprocessed prompts" issue, it would be useful to be able to take in a text file with a generic prompt and flags to start a chatbot or similar.
Such a config file could be YAML or TOML and include flags for running, model locations, prompt locations, etc.
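For illustration, such a config file might look like the sketch below. This is purely hypothetical: the key names mirror existing llama.cpp command-line flags, and the paths are placeholders; no such config format is implemented.

```yaml
# Hypothetical config sketch -- keys mirror existing CLI flags,
# paths are placeholders; llama.cpp does not currently read this.
model: models/7B/ggml-model-q4_0.bin
prompt_file: prompts/chat-with-bob.txt
threads: 8
n_predict: 256
temp: 0.8
top_k: 40
top_p: 0.9
repeat_penalty: 1.3
```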

@bitRAKE
Contributor

bitRAKE commented Mar 15, 2023

On Windows I just use CMD files: I can drop a prompt file onto the CMD script, or run it from the terminal. Here is an example:

@REM explicitly state all possible parameters:
if NOT [%1]==[] llama.exe ^
	--model			"llama-30B\ggml-model-q4_0.bin" ^
	--seed			1973	^
	--threads		15	^
	--n_predict		1024	^
	--repeat_last_n		64	^
	--batch_size		8	^
	--repeat_penalty	1.4	^
	--top_k			40	^
	--top_p			0.9	^
	--temp			0.8	^
	--file %1 2>>llama.stderr.txt

Then I have another one set up for interactive mode, "User:"-style prompt files, etc.

@nschulzke

In bash, you can just load a prompt directly into the string:

./main -m ./models/65B/ggml-model-q4_0.bin -t 8 -n 256 -p "$(<FILE_NAME_HERE)"

@MLTQ
Author

MLTQ commented Mar 15, 2023

Thanks @nschulzke, I have been doing this, but ideally I'd like to just run:

./main and have it look for a config file and, if one is found, use it.

@gjmulder gjmulder added the enhancement New feature or request label Mar 15, 2023
@saites

saites commented Mar 18, 2023

@nschulzke

In bash, you can just load a prompt directly into the string:

./main -m ./models/65B/ggml-model-q4_0.bin -t 8 -n 256 -p "$(<FILE_NAME_HERE)"

You could also just use -f/--file.

But for @MLTQ, I think using a driver script is really the way to go, especially if you want to use a configuration language like YAML. I wrote a Python script to do exactly that. You're welcome to modify it to suit your purposes.
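A minimal sketch of what such a driver could look like (this is not saites' actual script): it reads a flat "key: value" config file and expands it into a ./main command line. The config keys, flag names, and file paths here are assumptions for illustration.

```python
#!/usr/bin/env python3
"""Driver sketch: read a flat "key: value" config file and build the
llama.cpp command line from it. Not saites' actual script; config keys
and ./main flag names are assumptions for illustration."""
import shlex
import subprocess


def load_config(path):
    """Parse a flat YAML-like file of "key: value" lines (no nesting)."""
    config = {}
    with open(path) as f:
        for line in f:
            line = line.split("#", 1)[0].strip()  # drop comments and blanks
            if not line:
                continue
            key, _, value = line.partition(":")
            config[key.strip()] = value.strip()
    return config


def build_command(config, binary="./main"):
    """Turn each config entry into a CLI flag, e.g. threads -> --threads 8."""
    cmd = [binary]
    for key, value in config.items():
        cmd.append("--" + key)
        cmd.append(value)
    return cmd


def main(config_path):
    """Load the config, show the expanded command, and run it."""
    cmd = build_command(load_config(config_path))
    print("running:", shlex.join(cmd))
    return subprocess.run(cmd, check=True).returncode
```

With a sketch like this, `python driver.py llama.cfg` (with a small wrapper calling `main("llama.cfg")`) would expand a config file like the one @MLTQ describes into the usual ./main invocation.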

6 participants