From bf54136a79cc85b0e4c3915b4e1eb158f43c4b73 Mon Sep 17 00:00:00 2001
From: Steven Liu <59462357+stevhliu@users.noreply.github.com>
Date: Fri, 12 Jan 2024 09:00:08 -0800
Subject: [PATCH] [docs] Docstring link (#1356)

* fix format

* hmm
---
 docs/source/developer_guides/lora.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/developer_guides/lora.md b/docs/source/developer_guides/lora.md
index 604eb42853..14d04a6a4f 100644
--- a/docs/source/developer_guides/lora.md
+++ b/docs/source/developer_guides/lora.md
@@ -83,7 +83,7 @@ model = PeftModel.from_pretrained(base_model, peft_model_id)
 model.merge_and_unload()
 ```
 
-If you need to keep a copy of the weights so you can unmerge the adapter later or delete and load different ones, you should use the [`~tuners.tuner_utils.BaseTuner.merge_adapter`] function instead. Now you have the option to use [`~LoraModel.unmerge_adapter`] to return the base model.
+If you need to keep a copy of the weights so you can unmerge the adapter later or delete and load different ones, you should use the [`~LoraModel.merge_adapter`] function instead. Now you have the option to use [`~LoraModel.unmerge_adapter`] to return the base model.
 
 ```py
 from transformers import AutoModelForCausalLM
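
For context, the corrected link points at [`~LoraModel.merge_adapter`], which folds the LoRA weights into the base weights while keeping a copy of the adapter, so [`~LoraModel.unmerge_adapter`] can restore the base model later. A minimal sketch of that workflow (the model IDs are illustrative placeholders, not taken from the patch):

```py
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load a base model and attach a LoRA adapter (IDs are placeholders).
base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
model = PeftModel.from_pretrained(base_model, "alignment-handbook/zephyr-7b-sft-lora")

# merge_adapter() merges the LoRA weights into the base weights but, unlike
# merge_and_unload(), keeps the adapter weights around.
model.merge_adapter()

# ... run inference with the merged weights ...

# unmerge_adapter() restores the original base model weights.
model.unmerge_adapter()
```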