
# Weight Update

Updating the weights of a pretrained LLM during pruning can take different forms:

  • Retraining: during pruning, retrain or fine-tune the pruned model on data to minimize the training loss.
  • Reconstruction-error minimization: update the remaining weights to minimize the reconstruction error between the outputs of the pruned and unpruned model, without any retraining. For example, SparseGPT updates weights by solving a layer-wise reconstruction problem.
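The second strategy can be illustrated with a minimal sketch. This is not SparseGPT itself (which uses second-order Hessian information and column-wise updates); it is a simpler least-squares variant of the same layer-wise idea: after choosing a sparsity mask, re-solve the surviving weights so that the pruned layer reproduces the dense layer's outputs on calibration data. All sizes and the magnitude-based mask are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: calibration batch, layer input/output dims, sparsity.
n, d_in, d_out, sparsity = 256, 32, 16, 0.5

X = rng.normal(size=(n, d_in))            # calibration inputs
W = rng.normal(size=(d_out, d_in))        # dense pre-trained weights
Y = X @ W.T                               # dense layer outputs to reconstruct

W_pruned = np.zeros_like(W)
k = int(d_in * (1 - sparsity))            # weights kept per output neuron
for i in range(d_out):
    keep = np.argsort(-np.abs(W[i]))[:k]  # magnitude-based mask (illustrative)
    # Least-squares update: re-solve the kept weights so the pruned row
    # best reproduces the dense row's outputs on the calibration data.
    w_keep, *_ = np.linalg.lstsq(X[:, keep], Y[:, i], rcond=None)
    W_pruned[i, keep] = w_keep

# Compare against simply zeroing the pruned entries without any update.
W_masked = np.where(W_pruned != 0, W, 0.0)
err_masked = np.linalg.norm(Y - X @ W_masked.T)
err_updated = np.linalg.norm(Y - X @ W_pruned.T)
assert err_updated <= err_masked  # the update never hurts reconstruction
```

Because the masked-only weights are a feasible point of each per-row least-squares problem, the updated weights always achieve reconstruction error at most that of plain masking, which is the whole point of updating without retraining.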
| Weights | Explanation |
| --- | --- |
| Frozen | Pre-trained weights of the LLM are kept fixed. |
| Updated | Weights are updated by retraining or by minimizing reconstruction error. |