This is a PyTorch implementation of Zichao Yang et al.'s Improved Variational Autoencoders for Text Modeling using Dilated Convolutions, with token embeddings from Kim et al.'s Character-Aware Neural Language Models.
Most of the recurrent variational autoencoder code is adapted from analvikingur/pytorch_RVAE.
$ python train_word_embeddings.py
This script trains word embeddings as described in Mikolov et al., Distributed Representations of Words and Phrases (a minimal sketch of the objective follows the parameter list below).
--use-cuda
--num-iterations
--batch-size
--num-sample –– number of tokens sampled from the noise distribution
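The embeddings are trained with the skip-gram objective and negative sampling from Mikolov et al.; `--num-sample` controls how many noise tokens are drawn per positive pair. The snippet below is only an illustrative sketch of that objective, assuming hypothetical names and sizes (`in_embed`, `out_embed`, `vocab_size`), not the actual code of train_word_embeddings.py.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, embed_size, num_sample = 10_000, 300, 5

in_embed = nn.Embedding(vocab_size, embed_size)   # center-word vectors
out_embed = nn.Embedding(vocab_size, embed_size)  # context-word vectors
optimizer = torch.optim.Adam(
    list(in_embed.parameters()) + list(out_embed.parameters()), lr=1e-3)

def negative_sampling_loss(center, context):
    # center, context: LongTensors of shape [batch]
    batch = center.size(0)
    # num_sample noise tokens per positive pair (uniform noise for simplicity)
    noise = torch.randint(0, vocab_size, (batch, num_sample))

    v = in_embed(center)        # [batch, embed]
    u_pos = out_embed(context)  # [batch, embed]
    u_neg = out_embed(noise)    # [batch, num_sample, embed]

    pos = F.logsigmoid((v * u_pos).sum(-1))                               # log sigmoid(v . u_pos)
    neg = F.logsigmoid(-(u_neg @ v.unsqueeze(-1)).squeeze(-1)).sum(-1)    # sum log sigmoid(-v . u_neg)
    return -(pos + neg).mean()

# One illustrative optimization step on random token indices
loss = negative_sampling_loss(torch.randint(0, vocab_size, (32,)),
                              torch.randint(0, vocab_size, (32,)))
optimizer.zero_grad()
loss.backward()
optimizer.step()
```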
$ python train.py
--use-cuda
--num-iterations
--batch-size
--learning-rate
--dropout –– probability that units of the decoder input are zeroed
--use-trained –– load a previously trained model
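train.py optimizes the usual VAE objective: reconstruction loss from the dilated-convolution decoder plus the KL divergence between the approximate posterior and the standard normal prior. The sketch below only illustrates that loss under assumed names (`logits`, `mu`, `logvar`, `kl_weight`) and is not the repository's actual training code; the `--dropout` value above is the zeroing probability applied to the decoder input.

```python
import torch
import torch.nn.functional as F

def vae_loss(logits, target, mu, logvar, kl_weight):
    # logits: [batch, seq_len, vocab] decoder outputs; target: [batch, seq_len] token ids
    reconstruction = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)), target.reshape(-1), reduction='mean')
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1).mean()
    # kl_weight is typically annealed from 0 to 1 during training to avoid posterior collapse
    return reconstruction + kl_weight * kld
```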
$ python sample.py
--use-cuda
--num-sample
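sample.py draws latent vectors from the standard normal prior and decodes them into text. The function below is only a sketch of that procedure; `model`, `decode_step`, and the token ids are hypothetical placeholders for the interface the repository actually exposes.

```python
import torch

def sample_sentence(model, latent_size, max_len=50, go_id=1, eos_id=2):
    z = torch.randn(1, latent_size)  # z ~ N(0, I), sampled from the prior
    tokens = [go_id]
    for _ in range(max_len):
        # hypothetical decoder call: feed the tokens generated so far plus z
        logits = model.decode_step(torch.tensor([tokens]), z)  # [1, len, vocab]
        next_token = int(logits[0, -1].argmax())                # greedy decoding
        if next_token == eos_id:
            break
        tokens.append(next_token)
    return tokens[1:]
```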