# Weight Initialization Strategies

This project explores how the choice of weight initialization for a neural network affects training. The table below summarizes the strategies tried and compares them by training and validation loss.

PyTorch implementation: `weight_initializaion_strategies`
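In PyTorch, a custom initialization strategy is typically applied by passing a function to `nn.Module.apply`, which visits every submodule. The sketch below (the model and function names are illustrative, not taken from the linked implementation) shows how the "uniform between 0.0 and 1.0" case from the table could be wired up:

```python
import torch.nn as nn

# A small illustrative network (not the project's actual architecture)
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

def init_weights_uniform(m):
    # Only touch Linear layers; leave activations etc. alone
    if isinstance(m, nn.Linear):
        nn.init.uniform_(m.weight, 0.0, 1.0)  # weights drawn from U(0, 1)
        nn.init.zeros_(m.bias)                # biases start at zero

# apply() walks the module tree and calls the function on each submodule
model.apply(init_weights_uniform)
```

The same pattern works for every strategy in the table: only the body of the init function changes.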

## Results and Conclusion

| Weight Initialization Strategy | Comments |
| --- | --- |
| Constant initialization (all 0s or all 1s) | The network has a hard time determining which weights to change, since every neuron in a layer produces the same output. |
| Uniform distribution between 0.0 and 1.0 | Better than case 1; the network starts to learn. |
| General rule | The model learns well and the training loss decreases steadily. |
| Normal distribution vs. general rule | Performs similarly to the general rule; the model learns effectively. |
| No weight initialization | Behaviour can seem unexpected, because PyTorch applies its own default weight initialization strategy. |
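The "general rule" referenced in the table is commonly taken to mean drawing weights uniformly from [-y, y] with y = 1/sqrt(n), where n is the number of inputs to the layer; the normal-distribution variant uses a zero-mean Gaussian with the same scale as its standard deviation. Assuming that reading, both can be sketched as:

```python
import math
import torch.nn as nn

def general_rule_init(m):
    # Uniform in [-1/sqrt(n), 1/sqrt(n)], n = fan-in of the layer
    if isinstance(m, nn.Linear):
        bound = 1.0 / math.sqrt(m.in_features)
        nn.init.uniform_(m.weight, -bound, bound)
        nn.init.zeros_(m.bias)

def normal_rule_init(m):
    # Zero-mean normal with std = 1/sqrt(n), the same scale as above
    if isinstance(m, nn.Linear):
        std = 1.0 / math.sqrt(m.in_features)
        nn.init.normal_(m.weight, mean=0.0, std=std)
        nn.init.zeros_(m.bias)

# Example: initialize a single layer with the general rule
layer = nn.Linear(784, 256)
general_rule_init(layer)
```

Scaling the spread by 1/sqrt(n) keeps each neuron's pre-activation variance roughly constant as the fan-in grows, which is why both variants behave similarly in the results above.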