# Knowledge-Distillation-Under-Ideal-Joint-Classifier-Assumption

PyTorch implementation of "Knowledge distillation under ideal joint classifier assumption," published in Neural Networks, Volume 173, May 2024.

```bibtex
@article{LI2024106160,
  title    = {Knowledge distillation under ideal joint classifier assumption},
  journal  = {Neural Networks},
  volume   = {173},
  pages    = {106160},
  year     = {2024},
  issn     = {0893-6080},
  doi      = {https://doi.org/10.1016/j.neunet.2024.106160},
  url      = {https://www.sciencedirect.com/science/article/pii/S0893608024000844},
  author   = {Huayu Li and Xiwen Chen and Gregory Ditzler and Janet Roveda and Ao Li},
  keywords = {Knowledge distillation, Teacher-student learning, Deep learning},
  abstract = {Knowledge distillation constitutes a potent methodology for condensing substantial neural networks into more compact and efficient counterparts. Within this context, softmax regression representation learning serves as a widely embraced approach, leveraging a pre-established teacher network to guide the learning process of a diminutive student network. Notably, despite the extensive inquiry into the efficacy of softmax regression representation learning, the intricate underpinnings governing the knowledge transfer mechanism remain inadequately elucidated. This study introduces the ‘Ideal Joint Classifier Knowledge Distillation’ (IJCKD) framework, an overarching paradigm that not only furnishes a lucid and exhaustive comprehension of prevailing knowledge distillation techniques but also establishes a theoretical underpinning for prospective investigations. Employing mathematical methodologies derived from domain adaptation theory, this investigation conducts a comprehensive examination of the error boundary of the student network contingent upon the teacher network. Consequently, our framework facilitates efficient knowledge transference between teacher and student networks, thereby accommodating a diverse spectrum of applications.}
}
```
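
For orientation only, the sketch below illustrates the softmax-regression-representation-learning style of distillation that the paper builds on: the student's penultimate features are projected into the teacher's feature space and scored by the teacher's frozen classifier head. This is a minimal, assumed example, not the repository's actual training code; all names (`Connector`, `distillation_step`, `teacher.fc`, `T`, `alpha`, `beta`) and the `(features, logits)` model interface are illustrative assumptions.

```python
# Illustrative sketch only (not the repository's actual code): a softmax-regression-
# representation-learning style distillation step in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Connector(nn.Module):
    """Hypothetical projection aligning student features to the teacher's feature width."""

    def __init__(self, s_dim: int, t_dim: int):
        super().__init__()
        self.proj = nn.Linear(s_dim, t_dim)

    def forward(self, f_s: torch.Tensor) -> torch.Tensor:
        return self.proj(f_s)


def distillation_step(student, teacher, connector, x, y, T=4.0, alpha=1.0, beta=1.0):
    """One training step; student(x) and teacher(x) are assumed to return
    (features, logits), with the teacher frozen."""
    with torch.no_grad():
        f_t, logits_t = teacher(x)

    f_s, logits_s = student(x)

    # Standard cross-entropy on the student's own classifier.
    loss_ce = F.cross_entropy(logits_s, y)

    # Feature regression: match projected student features to teacher features.
    f_s2t = connector(f_s)
    loss_feat = F.mse_loss(f_s2t, f_t)

    # Softmax regression: score projected student features with the teacher's
    # classifier head (assumed exposed as `.fc`) and match the teacher's
    # temperature-softened predictions.
    logits_s2t = teacher.fc(f_s2t)
    loss_sr = F.kl_div(
        F.log_softmax(logits_s2t / T, dim=1),
        F.softmax(logits_t / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    return loss_ce + alpha * loss_feat + beta * loss_sr
```

Please refer to the paper for the exact loss formulation and weighting used by the IJCKD framework.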
