AquVitae: The Easiest Knowledge Distillation Library
AquVitae is a Python library that makes Knowledge Distillation easy through a very simple API. It supports both TensorFlow and PyTorch. Knowledge Distillation is one of the most representative model compression techniques, alongside Weight Pruning and Quantization, and AquVitae ships with popular and diverse Knowledge Distillation algorithms. If the deep learning model used in your project is too heavy, you can use AquVitae to make it much faster with little loss of performance.
In AquVitae, you only need to call a single function to perform Knowledge Distillation:
```python
from aquvitae import dist, ST

# Load the dataset
train_ds = ...
test_ds = ...

# Load the teacher and student model
teacher = ...
student = ...

optimizer = ...

# Knowledge Distillation
student = dist(
    teacher=teacher,
    student=student,
    algo=ST(alpha=0.6, T=2.5),
    optimizer=optimizer,
    train_ds=train_ds,
    test_ds=test_ds,
    iterations=3000
)
```
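The `...` placeholders depend on your framework. As a minimal sketch, assuming the PyTorch backend accepts standard `DataLoader` objects for `train_ds` and `test_ds`, filling them in might look like this (the dataset, model architectures, and hyperparameters below are illustrative assumptions, not part of AquVitae's API):

```python
# A hedged PyTorch sketch of the placeholders above. Only the dist(...)
# call mirrors the README example; everything else is an assumption.
import torch
import torchvision
from torchvision import transforms
from aquvitae import dist, ST

transform = transforms.ToTensor()
train_ds = torch.utils.data.DataLoader(
    torchvision.datasets.CIFAR10("data", train=True, download=True, transform=transform),
    batch_size=64, shuffle=True,
)
test_ds = torch.utils.data.DataLoader(
    torchvision.datasets.CIFAR10("data", train=False, download=True, transform=transform),
    batch_size=64,
)

# Teacher: assumed to be already trained (weight loading omitted here).
teacher = torchvision.models.resnet18(num_classes=10)

# Student: a deliberately small CNN for 32x32 CIFAR-10 images.
student = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3, padding=1), torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Conv2d(16, 32, 3, padding=1), torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Flatten(),
    torch.nn.Linear(32 * 8 * 8, 10),
)

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

student = dist(
    teacher=teacher,
    student=student,
    algo=ST(alpha=0.6, T=2.5),
    optimizer=optimizer,
    train_ds=train_ds,
    test_ds=test_ds,
    iterations=3000,
)
```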
Install AquVitae from PyPI:

```bash
$ pip install aquvitae
```
Knowledge Distillation algorithms implemented in AquVitae:
| Algo | Paper | TF | PyTorch |
|:----:|:------|:--:|:-------:|
| ST | Distilling the Knowledge in a Neural Network | ✔ | ✔ |
| DML | Deep Mutual Learning | - | - |
| FitNets | FitNets: Hints for Thin Deep Nets | - | - |
| RKD | Relational Knowledge Distillation | - | - |
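The `ST` algorithm corresponds to Hinton et al.'s soft-target distillation, where `T` softens the teacher's logits (higher `T` exposes inter-class similarities for the student to mimic) and `alpha` balances the distillation term against the ordinary hard-label loss. A minimal sketch of that loss in PyTorch, assuming the standard formulation (this is not AquVitae's internal code, whose exact weighting convention may differ):

```python
import torch
import torch.nn.functional as F

def soft_target_loss(student_logits, teacher_logits, labels, alpha=0.6, T=2.5):
    # Hard-label cross-entropy on the student's raw logits.
    hard = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened teacher and student
    # distributions; the T**2 factor keeps the gradient scale comparable
    # to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T ** 2)
    # alpha weights the distillation term against the hard-label term.
    return alpha * soft + (1 - alpha) * hard
```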
Copyright © marload