vae-mixin-pytorch

Variational autoencoders as mixins.

This repo contains implementations of the variational autoencoder (VAE) and several of its variants in PyTorch, written as mixin classes that can be reused and composed in your own modules.
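As a rough sketch of how such a composition might look (illustrative only; GaussianVAEMixin and MyVAE are hypothetical names, not this repo's actual API):

```python
import torch
import torch.nn as nn


class GaussianVAEMixin:
    """Hypothetical mixin: adds reparameterized sampling and a KL term
    to any nn.Module whose encoder outputs a mean and a log-variance."""

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps with eps ~ N(0, I); differentiable in mu, logvar.
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def kl_divergence(self, mu, logvar):
        # KL(q(z|x) || N(0, I)): sum over latent dims, mean over the batch.
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1)
        return kl.mean()


class MyVAE(GaussianVAEMixin, nn.Module):
    def __init__(self, in_dim=784, latent_dim=16):
        super().__init__()
        self.encoder = nn.Linear(in_dim, 2 * latent_dim)  # outputs mu and logvar
        self.decoder = nn.Linear(latent_dim, in_dim)

    def forward(self, x):
        mu, logvar = self.encoder(x).chunk(2, dim=1)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), self.kl_divergence(mu, logvar)
```

Because the mixin holds no parameters of its own, any encoder and decoder architecture can be swapped in without touching the sampling or loss logic.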

Usage

See the project documentation for the full API reference.

An example using a simple encoder and decoder on the MNIST dataset is provided in example.py.

Note

Mixin is a term from object-oriented programming: a class that supplies methods to other classes through inheritance but is not meant to be instantiated on its own.
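A minimal, repo-independent illustration of the pattern:

```python
class LoggingMixin:
    # Provides a method for other classes to inherit; not used on its own.
    def log(self, msg):
        print(f"[{type(self).__name__}] {msg}")


class Trainer(LoggingMixin):
    def run(self):
        self.log("training started")


Trainer().run()  # prints: [Trainer] training started
```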

Notes

Implemented VAEs:

  • VAE

  • β-VAE (objective shown after this list)

  • InfoVAE

  • DIP-VAE

  • β-TCVAE

  • VQ-VAE
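For orientation, the β-VAE objective (Higgins et al., 2017) weights the KL term of the standard ELBO by a factor β, with β = 1 recovering the plain VAE:

\[
\mathcal{L}(\theta, \phi; x) = \mathbb{E}_{q_\phi(z \mid x)}\left[\log p_\theta(x \mid z)\right] - \beta\, D_{\mathrm{KL}}\left(q_\phi(z \mid x) \,\|\, p(z)\right)
\]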

Note

Losses are summed along each latent vector and averaged across the samples in a minibatch.
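As an illustration of that reduction convention (standard PyTorch, not this repo's internal code), for a Gaussian posterior against a standard-normal prior:

```python
import torch

mu = torch.randn(32, 16)       # minibatch of 32 latent means, 16 dims each
logvar = torch.randn(32, 16)

# Per-dimension KL against a standard normal prior: shape (32, 16).
kl_per_dim = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp())

# Sum along each latent vector (dim=1), then average across samples.
kl = kl_per_dim.sum(dim=1).mean()
```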