These are some unit tests for ML algorithms.
This is my original implementation of SimCLR.
The experiments are run on the mini-ImageNet dataset. (If the task is too easy, there is no difference between using SimCLR and not using it.)
My experiments show a difference for a 3-layer convolutional network, but no difference for ResNet-20.
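The core of SimCLR is the NT-Xent contrastive loss over two augmented views of each image. Below is a minimal PyTorch sketch of that loss; the function name and signature are mine, not necessarily what this repo uses.

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """z1, z2: (N, D) projections of two augmented views of the same N images."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)         # (2N, D), unit norm
    sim = z @ z.t() / temperature                              # scaled cosine similarity
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))                 # exclude self-pairs
    # the positive for sample i is sample i + N (and vice versa)
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```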
This is the result of the 3-layer conv version, showing an improvement with SimCLR:
This is the result of the ResNet-20 version, showing no difference between using SimCLR or not:
Make the neural network aware of coordinates by adding channels that encode the up-down and left-right positions.
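A minimal sketch of adding coordinate channels before a convolution; `add_coords` is a hypothetical helper name, not necessarily what this repo uses.

```python
import torch

def add_coords(x):
    """x: (N, C, H, W) -> (N, C+2, H, W) with y- and x-coordinate channels in [-1, 1]."""
    n, _, h, w = x.shape
    ys = torch.linspace(-1, 1, h, device=x.device).view(1, 1, h, 1).expand(n, 1, h, w)
    xs = torch.linspace(-1, 1, w, device=x.device).view(1, 1, 1, w).expand(n, 1, h, w)
    return torch.cat([x, ys, xs], dim=1)
```

Applying this before an ordinary convolution gives the filters direct access to position, which is the whole trick.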
The easiest way to implement discrete latent variables. I first learned about it from a paper on model-based learning.
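The usual way to make this differentiable is the straight-through estimator: quantize to the nearest codebook vector on the forward pass, and copy gradients straight through on the backward pass. A minimal sketch, with illustrative names:

```python
import torch

def quantize(z, codebook):
    """z: (N, D) encoder outputs; codebook: (K, D) embedding table."""
    dists = torch.cdist(z, codebook)        # (N, K) pairwise distances
    idx = dists.argmin(dim=1)               # nearest code per vector
    z_q = codebook[idx]                     # (N, D) quantized vectors
    # straight-through estimator: forward pass uses z_q, backward pass
    # copies gradients from z_q to z as if quantization were the identity
    z_q = z + (z_q - z).detach()
    return z_q, idx
```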
For updating the codebook, there are two different approaches: one is the normal gradient update; the other is an Exponential Moving Average (EMA). My EMA implementation is pretty lousy, but the result is astonishing: it converges much faster, since it is not affected by bad gradients from the decoder and encoder.
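For reference, a minimal sketch of an EMA codebook update in PyTorch; the buffer names (`cluster_size`, `embed_sum`) and the smoothing constant are illustrative assumptions, not taken from this repo.

```python
import torch

@torch.no_grad()
def ema_update(codebook, cluster_size, embed_sum, z, idx, decay=0.99, eps=1e-5):
    """z: (N, D) encoder outputs; idx: (N,) indices of their nearest codes."""
    k = codebook.size(0)
    onehot = torch.zeros(z.size(0), k, device=z.device)
    onehot.scatter_(1, idx.unsqueeze(1), 1.0)
    # EMA of how often each code is used, and of the sum of vectors assigned to it
    cluster_size.mul_(decay).add_(onehot.sum(0), alpha=1 - decay)
    embed_sum.mul_(decay).add_(onehot.t() @ z, alpha=1 - decay)
    # Laplace smoothing avoids division by zero for unused codes
    n = cluster_size.sum()
    smoothed = (cluster_size + eps) / (n + k * eps) * n
    codebook.copy_(embed_sum / smoothed.unsqueeze(1))
```

Because the codebook is updated from statistics of the assigned encoder outputs rather than from backpropagated gradients, a noisy decoder loss cannot drag the codes around, which matches the faster convergence described above.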
I adopt the decomposition trick to ease the index-collapse problem. The decomposition trick also accelerates convergence.
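A minimal sketch of one common form of the decomposition trick, under the assumption that it means splitting each latent vector into slices, each quantized against its own smaller codebook; all names here are illustrative.

```python
import torch

def decomposed_quantize(z, codebooks):
    """z: (N, D); codebooks: list of (K, D // len(codebooks)) tensors."""
    z_q, idxs = [], []
    for s, cb in zip(z.chunk(len(codebooks), dim=1), codebooks):
        idx = torch.cdist(s, cb).argmin(dim=1)    # nearest code for this slice
        q = cb[idx]
        z_q.append(s + (q - s).detach())          # straight-through per slice
        idxs.append(idx)
    return torch.cat(z_q, dim=1), torch.stack(idxs, dim=1)
```

Each slice's codebook is small and sees more assignments, so individual codes are less likely to go unused and collapse.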
This is the result of my experiment: