| Title | DFDistill |
| --- | --- |
| Authors | Ernest Nasyrov, Nikita Okhotnikov, Yuri Sapronov, Vladimir Solodkin |
This project focuses on implementing Data-Free Distillation in a simple and clear manner. Classical approaches perform distillation using logits, responses, or hidden states obtained by running the teacher on the original training data. However, in some cases the original data is unavailable, which makes these methods inapplicable. Our goal is to provide a well-documented and efficient implementation for this more challenging setting.
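For reference, here is a minimal sketch of the classical logit-based distillation objective that the data-free methods below replace; the temperature value is an illustrative hyperparameter, not part of this library's API:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```

In the data-free setting this loss cannot be computed directly, because there are no real inputs to feed the teacher; the techniques below construct or synthesize substitutes.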
We plan to implement the following distillation techniques in our library:
- Data-Free Knowledge Distillation using Top Layer Activation Statistics
- Data-Free Knowledge Distillation using Spectral Methods
- Data-Free Adversarial Distillation
- Data-Free Knowledge Transfer via DeepInversion (a minimal sketch follows this list)
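To illustrate the last technique, below is a minimal sketch of the DeepInversion idea: synthetic inputs are optimized so that the batch statistics inside the teacher's BatchNorm layers match the running statistics stored during training. The image size, loss weighting, and optimizer settings are illustrative assumptions, not this library's final implementation:

```python
import torch
import torch.nn.functional as F

class BNStatsHook:
    """Penalizes mismatch between batch statistics and a BN layer's running stats."""
    def __init__(self, module):
        self.module = module
        self.loss = 0.0
        self.handle = module.register_forward_hook(self._hook)

    def _hook(self, module, inputs, _output):
        x = inputs[0]
        mean = x.mean(dim=(0, 2, 3))
        var = x.var(dim=(0, 2, 3), unbiased=False)
        self.loss = F.mse_loss(mean, module.running_mean) + F.mse_loss(var, module.running_var)

def deep_inversion(teacher, num_classes, batch_size=64, steps=2000, lr=0.05):
    teacher.eval()
    hooks = [BNStatsHook(m) for m in teacher.modules()
             if isinstance(m, torch.nn.BatchNorm2d)]
    # Assumes CIFAR-sized RGB inputs; adjust for other teachers.
    x = torch.randn(batch_size, 3, 32, 32, requires_grad=True)
    targets = torch.randint(0, num_classes, (batch_size,))
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        logits = teacher(x)
        loss = F.cross_entropy(logits, targets)          # push images toward target classes
        loss = loss + 10.0 * sum(h.loss for h in hooks)  # match stored BatchNorm statistics
        # The full method also adds total-variation and l2 image priors, omitted here.
        loss.backward()
        opt.step()
    for h in hooks:
        h.handle.remove()
    return x.detach()
```

The synthesized batch can then be used as a stand-in for real data when distilling the student against the teacher's logits.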
The project is built with:
- PyPI for package distribution
- PyTorch for tensor computation and automatic differentiation
- Matplotlib for plotting
- Hugging Face Transformers
- Neptune for experiment logging
- Aquvitae for distillation
You can install the package and its dependencies using pip:
- Clone the repository:
```bash
git clone <repository-url>
```
- Navigate to the cloned directory:
```bash
cd <repository-directory>
```
- Install the package in editable mode:
```bash
pip install -e ./
```
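After installation, usage might look like the following sketch. Note that the `dfdistill` module name, the `distill` entry point, and its arguments are hypothetical placeholders, not a confirmed API; only the torchvision calls are real:

```python
import torchvision.models as models
# Hypothetical import; the actual package layout and entry points may differ.
from dfdistill import distill

teacher = models.resnet34(weights=models.ResNet34_Weights.DEFAULT)
student = models.resnet18(weights=None)

# Method name and arguments are illustrative placeholders.
student = distill(teacher=teacher, student=student, method="deep_inversion")
```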