-
Okay, so IIUC, the core of the problem is that you would like to treat a bare `nn.Parameter` the way `modules_to_save` treats modules. In theory, you can just manually set the `requires_grad` attribute of that parameter after creating the PEFT model. The reason why we cannot simply allow `nn.Parameter`s in `modules_to_save` is …
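A minimal sketch of that manual workaround (my reconstruction, not code from the original reply; it uses the `FooModel` names from the question below and standard `peft` usage):

```python
from peft import LoraConfig, get_peft_model

# LoRA on linear3, linear2 kept trainable via modules_to_save.
config = LoraConfig(target_modules=["linear3"], modules_to_save=["composite.linear2"])
peft_model = get_peft_model(FooModel(), config)

# modules_to_save cannot express "keep this nn.Parameter trainable",
# so flip requires_grad on it by hand after wrapping:
for name, param in peft_model.named_parameters():
    if name.endswith("composite.other_param"):
        param.requires_grad = True
```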
-
Hi all, consider the following model:
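(The model definition was lost from this page; the following is a minimal sketch consistent with the attribute names referenced below. Layer sizes are placeholders.)

```python
import torch
import torch.nn as nn

class Composite(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear1 = nn.Linear(10, 10)
        self.linear2 = nn.Linear(10, 10)
        # A bare nn.Parameter, not wrapped in any nn.Module.
        self.other_param = nn.Parameter(torch.zeros(10))

class FooModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.composite = Composite()
        self.linear3 = nn.Linear(10, 10)

    def forward(self, x):
        x = self.composite.linear1(x)
        x = self.composite.linear2(x) + self.composite.other_param
        return self.linear3(x)
```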
Suppose I want to create a LoRA adapter with the following properties:

- applies LoRA to `FooModel.linear3` only
- puts `FooModel.composite.linear2` in `modules_to_save` (i.e. leaves it trainable)
- does not put `FooModel.composite.linear1` in `modules_to_save` (i.e. freezes it)
- keeps `FooModel.composite.other_param` as trainable

How am I supposed to define my `LoraConfig` to achieve that? I tried different things, but every one has issues: `modules_to_save` cannot handle `other_param`, because it is an `nn.Parameter` and not an `nn.Module`.
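For example, something like the following (a reconstruction; the exact configs tried were lost from this page):

```python
from peft import LoraConfig

config = LoraConfig(
    target_modules=["linear3"],
    # "composite.linear2" works; "composite.other_param" does not, because
    # modules_to_save matching operates on nn.Module names and other_param
    # is a bare nn.Parameter.
    modules_to_save=["composite.linear2", "composite.other_param"],
)
```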
NOTE: I know one solution would be to wrap "other_param" in an `nn.Module`, but please assume `FooModel` is an existing implementation which I cannot change.

This is the script I'm using to check which parameters are trainable after creating the adapter:
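(The script itself was lost from this page; this is a minimal reconstruction of such a check, using the assumed `FooModel` and config from above.)

```python
from peft import LoraConfig, get_peft_model

config = LoraConfig(target_modules=["linear3"], modules_to_save=["composite.linear2"])
peft_model = get_peft_model(FooModel(), config)

# Print every parameter together with its trainability.
for name, param in peft_model.named_parameters():
    print(f"{name}: requires_grad={param.requires_grad}")
```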