Hi @avostryakov,
This should be fixed by #583; it is also a duplicate of #581.
Please re-install peft from source and the error should disappear. Closing the issue for now; feel free to re-open if you think it has not been addressed.
In adalora.py, line 443:

```python
result += (
    (
        self.lora_dropout[self.active_adapter](x)
        @ (self.lora_A[self.active_adapter] * self.lora_E[self.active_adapter]).T
        @ self.lora_B[self.active_adapter].T
    )
    * self.scaling[self.active_adapter]
    / (self.ranknum[self.active_adapter] + 1e-5)
)
```
This line raised a device-mismatch error because `self.ranknum[self.active_adapter]` is on the CPU while the other tensors are on the GPU.
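For anyone pinned to the broken release, a minimal sketch of a local workaround is to move `ranknum` onto the input's device before the update is computed. This mirrors the snippet above but is only an illustration, not necessarily the actual change made in #583:

```python
# Workaround sketch (illustrative, not the actual fix from #583):
# move the CPU-resident ranknum onto the same device as the input x
# before it participates in the AdaLoRA forward computation.
ranknum = self.ranknum[self.active_adapter].to(x.device)
result += (
    (
        self.lora_dropout[self.active_adapter](x)
        @ (self.lora_A[self.active_adapter] * self.lora_E[self.active_adapter]).T
        @ self.lora_B[self.active_adapter].T
    )
    * self.scaling[self.active_adapter]
    / (ranknum + 1e-5)
)
```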
I used the following AdaLoraConfig:
```python
config = AdaLoraConfig(
    r=8,
    target_r=8,
    init_r=16,
    tinit=200,
    tfinal=1000,
    lora_alpha=32,
    target_modules=target_modules,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
```
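For context, here is a hedged sketch of how such a config is typically applied; the base model and `target_modules` values are illustrative placeholders, not taken from the report:

```python
from transformers import AutoModelForCausalLM
from peft import AdaLoraConfig, get_peft_model

# Illustrative choices; the original report does not specify the model
# or the target modules.
target_modules = ["c_attn"]
base_model = AutoModelForCausalLM.from_pretrained("gpt2")

config = AdaLoraConfig(
    r=8,
    target_r=8,
    init_r=16,
    tinit=200,
    tfinal=1000,
    lora_alpha=32,
    target_modules=target_modules,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)

# Wrapping the model and moving it to the GPU is where the CPU-resident
# ranknum triggered the device mismatch during the forward pass.
model = get_peft_model(base_model, config)
model.cuda()
```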