The easiest Knowledge Distillation library for lightweight Deep Learning


Keywords
tensorflow, pytorch, pytorch-ignite, tqdm, deep-learning, knowledge-distillation, light-weight, machine-learning, model-compression
License
MIT
Install
pip install aquvitae==0.2.2

Documentation


AquVitae: The Easiest Knowledge Distillation Library

AquVitae is a Python library that makes Knowledge Distillation as easy as a single function call, with support for both TensorFlow and PyTorch. Knowledge Distillation, alongside Weight Pruning and Quantization, is one of the most widely used model compression techniques, and AquVitae implements a range of popular Knowledge Distillation algorithms. If the deep learning model in your project is too heavy, you can use AquVitae to make it significantly faster with little loss of accuracy.

Getting Started

In AquVitae, a single function call performs Knowledge Distillation.

from aquvitae import dist, ST

# Load the dataset
train_ds = ...
test_ds = ...

# Load the teacher and student models
teacher = ...
student = ...

optimizer = ...

# Knowledge Distillation
student = dist(
    teacher=teacher,
    student=student,
    algo=ST(alpha=0.6, T=2.5),
    optimizer=optimizer,
    train_ds=train_ds,
    test_ds=test_ds,
    iterations=3000
)
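For a more concrete picture, here is a minimal end-to-end sketch with TensorFlow. It assumes the models are tf.keras models and the datasets are batched tf.data.Dataset objects of (image, label) pairs, which the docs do not confirm; the MNIST CNNs below are hypothetical stand-ins, and in practice the teacher would already be trained.

import tensorflow as tf
from aquvitae import dist, ST

# MNIST as (image, label) batches; this dataset format is an assumption.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0
train_ds = tf.data.Dataset.from_tensor_slices((x_train, y_train)).shuffle(60_000).batch(64)
test_ds = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(64)

# Hypothetical CNNs: a wide teacher and a narrow student.
def make_cnn(width: int) -> tf.keras.Model:
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(width, 3, activation="relu", input_shape=(28, 28, 1)),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10),
    ])

teacher = make_cnn(64)  # in practice, load a pretrained teacher here
student = make_cnn(16)

student = dist(
    teacher=teacher,
    student=student,
    algo=ST(alpha=0.6, T=2.5),
    optimizer=tf.keras.optimizers.Adam(1e-3),
    train_ds=train_ds,
    test_ds=test_ds,
    iterations=3000,
)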

Installation

$ pip install aquvitae

Algorithms

The Knowledge Distillation algorithms in AquVitae are listed below; ✔️ marks the backends where an algorithm is currently implemented.

| Algorithm | Hyperparameters | Paper | TF | TORCH |
|-----------|-----------------|-------|----|-------|
| ST | alpha, T | Distilling the Knowledge in a Neural Network | ✔️ | ✔️ |
| DML | - | Deep Mutual Learning | - | - |
| FitNets | - | FitNets: Hints for Thin Deep Nets | - | - |
| RKD | - | Relational Knowledge Distillation | - | - |
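For intuition, ST (soft targets) from Hinton et al. trains the student on a blend of ordinary cross-entropy and a temperature-softened match to the teacher's outputs, weighted by alpha. The following is a minimal sketch of that loss; the exact weighting convention and reduction AquVitae uses internally are assumptions.

import tensorflow as tf

def st_loss(teacher_logits, student_logits, labels, alpha=0.6, T=2.5):
    # Soft-target term: cross-entropy between temperature-softened teacher
    # and student distributions, scaled by T^2 as in Hinton et al. (2015).
    soft_teacher = tf.nn.softmax(teacher_logits / T)
    log_soft_student = tf.nn.log_softmax(student_logits / T)
    kd = -tf.reduce_sum(soft_teacher * log_soft_student, axis=-1) * T ** 2
    # Hard-target term: ordinary cross-entropy against the true labels.
    ce = tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=student_logits
    )
    # alpha blends the two terms; this blending convention is an assumption.
    return tf.reduce_mean(alpha * kd + (1.0 - alpha) * ce)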

License

Copyright © marload

AquVitae is open-sourced software licensed under the MIT License.