DefangChen/Knowledge-Distillation-Paper

Awesome License: MIT

Knowledge-Distillation-Paper

This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation).

Pioneering Papers
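For context, the pioneering formulation by Hinton et al. (2015) trains a student network to match the teacher's temperature-softened output distribution. Below is a minimal plain-Python sketch of that distillation loss; the function names and the `T=4.0` default are illustrative choices, not taken from any listed paper's code:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the temperature-softened teacher and student
    distributions, scaled by T^2 as proposed in Hinton et al. (2015)."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# Identical logits give zero distillation loss.
print(round(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # 0.0
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, with a weighting hyperparameter balancing the two.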

Survey Papers

Distillation Accelerates Diffusion Models

Extremely promising!

Feature Distillation

Online Knowledge Distillation

Multi-Teacher Knowledge Distillation

Data-Free Knowledge Distillation

Distillation for Segmentation

  • Structured Knowledge Distillation for Dense Prediction, CVPR 2019, TPAMI 2020 [Pytorch]

  • Channel-wise Knowledge Distillation for Dense Prediction, ICCV 2021 [Pytorch]

  • Cross-Image Relational Knowledge Distillation for Semantic Segmentation, CVPR 2022 [Pytorch]

  • Holistic Weighted Distillation for Semantic Segmentation, ICME 2023 [Pytorch]

    • Wujie Sun, Defang Chen, Can Wang, Deshi Ye, Yan Feng, Chun Chen.

Useful Resources

  • Acceptance rates of the main AI conferences [Link]
  • AI conference deadlines [Link]
  • CCF conference deadlines [Link]
