
intsystems/DFDistill


Data-Free Distillation

Title: DFDistill
Authors: Ernest Nasyrov, Nikita Okhotnikov, Yuri Sapronov, Vladimir Solodkin

Description

This project provides a simple, clear implementation of Data-Free Distillation. Classical approaches perform distillation using logits, responses, or hidden states obtained by running the teacher on the original training data. In some cases, however, the original data is unavailable, and these methods become inapplicable. Our goal is a well-documented and efficient implementation for this more challenging setting.

Algorithms Implemented

We plan to implement the following distillation techniques in our library:

  • Data-Free Knowledge Distillation using Top Layer Activation Statistics
  • Data-Free Knowledge Distillation using Spectral Methods
  • Data-Free Adversarial Distillation
  • Data-Free Knowledge Transfer via DeepInversion
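The common pattern behind these techniques is to first synthesize pseudo-data from the teacher alone, then distill on it. Below is a minimal PyTorch sketch of that two-stage loop in the spirit of DeepInversion: the toy teacher and student networks, dimensions, and hyperparameters are illustrative assumptions, not the library's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy stand-ins; in practice the teacher is a pretrained network.
teacher = nn.Sequential(nn.Flatten(), nn.Linear(16, 10))
teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)  # the teacher is frozen throughout
student = nn.Sequential(nn.Flatten(), nn.Linear(16, 10))

# Stage 1: synthesize pseudo-data by optimizing random noise so the
# teacher classifies it confidently as chosen target classes.
x = torch.randn(8, 4, 4, requires_grad=True)
targets = torch.randint(0, 10, (8,))
opt_x = torch.optim.Adam([x], lr=0.1)
for _ in range(50):
    opt_x.zero_grad()
    F.cross_entropy(teacher(x), targets).backward()
    opt_x.step()

# Stage 2: distill by matching student logits to teacher logits
# on the synthetic batch (temperature-scaled KL divergence).
T = 4.0
opt_s = torch.optim.Adam(student.parameters(), lr=1e-2)
for _ in range(50):
    opt_s.zero_grad()
    with torch.no_grad():
        t_logits = teacher(x.detach())
    s_logits = student(x.detach())
    kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                  F.softmax(t_logits / T, dim=1),
                  reduction="batchmean") * T * T
    kd.backward()
    opt_s.step()
```

The methods above differ mainly in stage 1: activation-statistics and spectral methods match stored teacher statistics instead of class targets, while adversarial distillation trains a generator to produce the pseudo-data.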

Related Work

Tech Stack

The project is implemented using:

  • PyTorch for tensor computation and automatic differentiation
  • Transformers for pretrained models
  • Matplotlib for plotting
  • Neptune for experiment logging
  • Aquvitae for distillation baselines
  • PyPI for package distribution

Installation

  1. Clone the repository:
    git clone <repository-url>
  2. Navigate to the cloned directory:
    cd <repository-directory>
  3. Install the package and its dependencies in editable mode:
    pip install -e ./

Links

