
MLP Bandwidth (Complexity) Setting with Dropout

This repo implements a FixedDropout torch layer and a WeightDropLinear linear layer. The latter drops weights and biases of a linear layer during the forward pass with a given probability, whereas the former implements a dropout layer whose probability is specified at forward time. A minimal sketch of both is shown below.
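
The class names below come from the repo, but the bodies are illustrative assumptions built from standard PyTorch idioms, not the repository's actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FixedDropout(nn.Module):
    """Dropout whose probability is passed in at forward time."""

    def forward(self, x: torch.Tensor, p: float) -> torch.Tensor:
        # F.dropout uses inverted dropout: surviving units are scaled by 1 / (1 - p).
        return F.dropout(x, p=p, training=self.training)


class WeightDropLinear(nn.Linear):
    """Linear layer that drops entries of its weight and bias on each forward pass."""

    def __init__(self, in_features: int, out_features: int, p: float = 0.5):
        super().__init__(in_features, out_features)
        self.p = p

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Sample a fresh mask over the parameters themselves, not the activations.
            weight = F.dropout(self.weight, p=self.p, training=True)
            bias = F.dropout(self.bias, p=self.p, training=True)
        else:
            weight, bias = self.weight, self.bias
        return F.linear(x, weight, bias)
```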

Both bandwidth manipulation methods are implemented as part of an MLP with a single hidden layer for sine-function prediction. FixedDropout does not work as intended, because dropping hidden-layer activations produces a discontinuous predicted function.
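
A sketch of how the two layers might be wired into such a network, using the classes from the previous snippet; the hidden width, activation, dropout rates, and training loop are assumptions, not the repo's configuration:

```python
import math

import torch
import torch.nn as nn


class BandwidthMLP(nn.Module):
    """Single-hidden-layer MLP with both bandwidth-manipulation layers."""

    def __init__(self, hidden: int = 64, weight_drop_p: float = 0.2):
        super().__init__()
        self.fc1 = WeightDropLinear(1, hidden, p=weight_drop_p)
        self.act = nn.Tanh()
        self.drop = FixedDropout()
        self.fc2 = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor, drop_p: float = 0.0) -> torch.Tensor:
        h = self.act(self.fc1(x))
        # Dropping activations here is what produces the discontinuous prediction.
        h = self.drop(h, p=drop_p)
        return self.fc2(h)


# Fit sin(x) on [-pi, pi] with plain MSE regression.
x = torch.linspace(-math.pi, math.pi, 256).unsqueeze(1)
y = torch.sin(x)
model = BandwidthMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x, drop_p=0.1), y)
    loss.backward()
    opt.step()
```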
