Task-Oriented Feature Distillation

This is the implementation of the NeurIPS 2020 paper "Task-Oriented Feature Distillation".

Experiments on CIFAR100

Step 1. Install the required packages.

pip install torch torchvision
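A quick, optional way to confirm the installation worked before training (this check is not part of the repository):

```python
# Optional sanity check: confirm torch and torchvision import and report CUDA support.
import torch
import torchvision

print("torch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())
```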

Step 2. Train a student model.

python distill.py --model=resnet18

Note that you can choose resnet, senet, and preactresnet models as the student. Each architecture is available at five depths: 18, 34, 50, 101, and 152.
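The other students should be selectable the same way (e.g. --model=senet18), although the exact model strings accepted by distill.py may differ. For intuition, below is a minimal, illustrative sketch of the general idea behind combining a task loss with feature distillation; the model interfaces and the loss weight are assumptions for illustration and do not reproduce the paper's exact TOFD objective.

```python
# Illustrative sketch only: a task (cross-entropy) loss on the student's logits
# combined with an L2 loss pulling the student's intermediate features toward
# the teacher's. Assumes both models return (logits, list_of_features) with
# matching feature shapes; this is NOT the repository's exact TOFD loss.
import torch
import torch.nn.functional as F

def distillation_step(student, teacher, images, labels, optimizer, alpha=0.1):
    teacher.eval()
    with torch.no_grad():
        t_logits, t_feats = teacher(images)
    s_logits, s_feats = student(images)

    task_loss = F.cross_entropy(s_logits, labels)                        # supervision from labels
    feat_loss = sum(F.mse_loss(s, t) for s, t in zip(s_feats, t_feats))  # feature mimicry

    loss = task_loss + alpha * feat_loss  # alpha is an illustrative weight, not the paper's value
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```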
