
Implementation of Word Embedding Distillation with Ensemble Learning


bgshin/distill_demo

Introduction

  • Official implementation of the paper "The Pupil Has Become the Master: Teacher-Student Model-Based Word Embedding Distillation with Ensemble Learning", B. Shin, H. Yang, and J.D. Choi, IJCAI 2019.
  • Author: Bonggun Shin

Download Data

  • Unzip the data into the 'data' folder so the files land at:
distill_demo/data/sentiment_analysis/*
distill_demo/data/w2v/*

Requirements

pip install tensorflow
pip install keras
pip install gensim
pip install scikit-learn

Train teachers

cd src/
python train_teacher.py -ds sst5 -m cnn2 -t 0
python train_teacher.py -ds sst5 -m cnn2 -t 1
...
python train_teacher.py -ds sst5 -m cnn2 -t 9

python train_teacher.py -ds sst5 -m lstm -t 0
python train_teacher.py -ds sst5 -m lstm -t 1
...
python train_teacher.py -ds sst5 -m lstm -t 9
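The elided runs above follow an obvious pattern, so they can be launched from one small helper script. This helper is hypothetical (not part of the repo); only `train_teacher.py` and its flags are taken from the commands above.

```python
# Hypothetical convenience script: expands the "..." above into all
# twenty teacher runs. Only train_teacher.py and its flags come from
# the README; everything else is an assumption.
import subprocess

def teacher_runs(dataset="sst5", models=("cnn2", "lstm"), n_teachers=10):
    """Build every train_teacher.py command line listed above."""
    return [
        ["python", "train_teacher.py", "-ds", dataset, "-m", model, "-t", str(seed)]
        for model in models
        for seed in range(n_teachers)
    ]

def launch(runs, src_dir="src"):
    """Run each command sequentially from the src/ directory."""
    for cmd in runs:
        subprocess.run(cmd, cwd=src_dir, check=True)
```

Calling `launch(teacher_runs())` from the repository root reproduces the twenty commands above in order.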

Train an autoencoder

cd src/
python train_ae.py
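The autoencoder compresses the teacher's word embeddings into the student's smaller dimensionality; the actual architecture lives in train_ae.py and is presumably a Keras network. As a rough sketch of the idea only, here is the linear special case in NumPy (the optimum of a linear autoencoder is a truncated-SVD projection), with random toy vectors standing in for real embeddings:

```python
# Sketch of embedding compression, NOT the repo's train_ae.py:
# a linear autoencoder's optimal encoder/decoder is a rank-k SVD
# projection, shown here on toy 300-d "embeddings".
import numpy as np

rng = np.random.default_rng(0)
emb = rng.normal(size=(1000, 300))     # toy stand-in for word embeddings

# Rank-k SVD gives the best k-dimensional linear code for reconstruction.
U, s, Vt = np.linalg.svd(emb, full_matrices=False)
k = 100                                 # bottleneck width (illustrative)
encode = lambda X: X @ Vt[:k].T         # (n, 300) -> (n, 100)
decode = lambda H: H @ Vt[:k]           # (n, 100) -> (n, 300)

recon = decode(encode(emb))
err = np.mean((recon - emb) ** 2)       # reconstruction MSE of the code
```

The real script trains a (likely nonlinear) autoencoder instead, but the target is the same: a low-dimensional code whose decoding reconstructs the original vectors well.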

Distill without ensemble

cd src/
python distill.py
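Here distill.py trains the student against a single teacher; the exact loss is defined in the script. The standard knowledge-distillation ingredient it builds on, a cross-entropy between temperature-softened teacher and student distributions, can be sketched in NumPy as follows (function names and the temperature value are illustrative, not taken from the repo):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax; larger T flattens the distribution."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy of the student against softened teacher targets."""
    p = softmax(teacher_logits, T)          # soft targets from the teacher
    q = softmax(student_logits, T)          # student predictions
    return -np.mean(np.sum(p * np.log(q + 1e-12), axis=-1))
```

The loss is minimized when the student's logits induce the same softened distribution as the teacher's, which is what makes soft targets a richer training signal than hard labels alone.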

Distill with ensemble

cd src/
python extract_logits.py
python distill.py
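With the ensemble, extract_logits.py first saves each trained teacher's logits, and distill.py then distills from the combined teachers instead of a single one. A minimal sketch of the combination step, assuming the saved per-teacher logits are stacked into one array (the shapes and storage format here are assumptions, not read from the repo):

```python
import numpy as np

def ensemble_logits(all_logits):
    """Average per-teacher logits into one soft-target matrix.

    all_logits: shape (n_teachers, n_examples, n_classes), e.g. the
    arrays saved by extract_logits.py (assumed storage format).
    """
    return np.asarray(all_logits).mean(axis=0)

rng = np.random.default_rng(1)
logits = rng.normal(size=(20, 8, 5))   # toy: 20 teachers, 8 examples, 5 classes
targets = ensemble_logits(logits)      # (8, 5) averaged soft targets
```

Averaging logits before softening is one common way to ensemble teachers; the paper's actual combination scheme may differ in detail.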
