ggerganov/llama.cpp#904 — Following https://github.com/ggerganov/llama.cpp/pull/820, the loaded model may differ from the base model. It would make sense to be able to interactively export the currently loaded model to a binfile.

Especially if there were an option to linearly interpolate multiple LoRA files, i.e. LoRA mixology to obtain unique LLM personalities.
@MillionthOdin16 commented on Apr 12
If you're familiar with mixing LoRAs, I think it would help a lot of people here if you could link some resources above. I've heard you can do some cool things with it, but I'm not very familiar myself.
@jon-chuang (Contributor, Author) commented on Apr 12 •
I'm not that familiar with it either; it just occurred to me as a possibility.
Linear interpolation between a base model and a LoRA is already a standard feature: https://huggingface.co/docs/diffusers/main/en/training/lora

Anyway, linear interpolation is tracked here: ggerganov/llama.cpp#905

This issue is about exporting the loaded (and modified, e.g. fine-tuned) model to a binfile.
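The weighted-merge idea discussed above can be sketched with plain matrices: each LoRA contributes a low-rank update `B @ A`, and several adapters can be blended into the base weights with per-adapter interpolation weights before exporting. This is a minimal NumPy illustration of the math only; the `merge_loras` helper and the alpha values are hypothetical and not part of llama.cpp's actual export path:

```python
import numpy as np

def merge_loras(base_w, loras, alphas):
    """Return base_w + sum_i alpha_i * (B_i @ A_i).

    base_w : (out, in) base weight matrix
    loras  : list of (B, A) low-rank pairs, B: (out, r), A: (r, in)
    alphas : per-adapter interpolation weights
    """
    merged = base_w.copy()
    for (B, A), alpha in zip(loras, alphas):
        merged += alpha * (B @ A)  # scaled low-rank update
    return merged

# Toy example: a 4x4 base weight and two rank-1 adapters
rng = np.random.default_rng(0)
base = rng.standard_normal((4, 4))
lora1 = (rng.standard_normal((4, 1)), rng.standard_normal((1, 4)))
lora2 = (rng.standard_normal((4, 1)), rng.standard_normal((1, 4)))

# Blend 70% of adapter 1 with 30% of adapter 2 into the base weights
merged = merge_loras(base, [lora1, lora2], alphas=[0.7, 0.3])
```

Once merged this way, the resulting weights would load like any ordinary model file, which is what makes exporting the in-memory state to a binfile attractive.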
Hey Ziwanag! I really think that this is a good idea. I'm going to corral some folks (and myself) to prototype this under ggerganov/llama.cpp#905