# vae-mixin-pytorch

Variational autoencoders as mixins.

This repo contains implementations of the variational autoencoder (VAE) and several variants in PyTorch, written as mixin classes that can be reused and composed in your own modules.

## Usage

Check the docs here.

An example using a simple encoder and decoder on the MNIST dataset is in `example.py`.

"Mixin" is a term from object-oriented programming: a class that provides methods for other classes to reuse through inheritance, rather than being instantiated on its own. A sketch of the pattern follows.
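
As a rough illustration of the pattern (not this repo's actual API; `GaussianVAEMixin`, its method names, and the networks below are hypothetical), a mixin can contribute the reparameterization trick and loss terms while your module supplies the encoder and decoder:

```python
import torch
import torch.nn as nn

class GaussianVAEMixin:
    """Hypothetical mixin adding VAE machinery to any nn.Module."""

    def reparameterize(self, mu, logvar):
        # Sample z = mu + sigma * eps with eps ~ N(0, I).
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def kl_loss(self, mu, logvar):
        # KL(q(z|x) || N(0, I)), summed over latent dims, averaged over batch.
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp())
        return kl.sum(dim=1).mean()

class MNISTVAE(GaussianVAEMixin, nn.Module):
    def __init__(self, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 784), nn.Sigmoid())

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), mu, logvar
```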

## Notes

Implemented VAEs:

- VAE
- $\beta$-VAE
- InfoVAE
- DIP-VAE
- $\beta$-TCVAE
- VQ-VAE

Losses are summed over the dimensions of each latent vector and averaged across the samples in a minibatch.
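
For instance, under that convention the KL term of a diagonal-Gaussian VAE would be reduced like this (a sketch; the tensor names `mu` and `logvar` are illustrative, not taken from the repo):

```python
import torch

# mu, logvar: shape (batch_size, latent_dim)
mu, logvar = torch.randn(32, 16), torch.randn(32, 16)

# Per-dimension KL to a standard normal prior...
kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp())
# ...summed along each latent vector, then averaged across the minibatch.
kl_loss = kl.sum(dim=1).mean()
```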