Which tokenizer.model is needed for GPT4ALL for use with convert-gpt4all-to-ggml.py? Is it the one for LLaMA 7B? It is unclear from the current README, and gpt4all-lora-quantized.bin seems to be distributed without a tokenizer.model file.
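For reference, one quick way to check whether a given tokenizer.model is the LLaMA SentencePiece tokenizer is to load it with the sentencepiece package and inspect the vocabulary size (the LLaMA tokenizer has 32000 tokens). This is only a sanity-check sketch, not part of the conversion script; the file path below is a placeholder:

```python
# Sanity check: does this tokenizer.model look like the LLaMA SentencePiece model?
# Assumes the `sentencepiece` package is installed; the path is hypothetical.
import sentencepiece as spm

sp = spm.SentencePieceProcessor()
sp.load("models/tokenizer.model")  # adjust to wherever your tokenizer.model lives

# The LLaMA tokenizer has a 32000-token vocabulary.
print("vocab size:", sp.vocab_size())

# Print the first few pieces as a rough fingerprint of the vocabulary.
print([sp.id_to_piece(i) for i in range(5)])
```

As far as I know, the LLaMA 7B/13B/30B/65B releases all share the same tokenizer.model, so the 7B one should be interchangeable with the others if that is indeed the tokenizer the script expects.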