hypergrad

Simple and extensible hypergradient for PyTorch

Installation

First, install torch and a matching version of torchvision for your environment. Then,

pip install hypergrad

Methods

Implicit hypergradient approximation (via approximated inverse Hessian-vector product)

Implementations of these methods can be found in hypergrad/approximate_ihvp.py.
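
For orientation, the snippet below is a minimal sketch of how a hypergradient can be obtained with an approximated inverse Hessian-vector product, using a truncated Neumann series as one example of such an approximation. It is written in plain PyTorch; the function names (neumann_ihvp, hypergradient) and default values are illustrative only and are not the hypergrad package API. See hypergrad/approximate_ihvp.py for the actual implementations.

import torch
from torch.autograd import grad


def neumann_ihvp(train_loss, params, vec, lr=0.01, n_steps=20):
    """Approximate H^{-1} v with the truncated Neumann series
    lr * sum_{j=0}^{K} (I - lr * H)^j v, where H is the Hessian of
    `train_loss` w.r.t. `params` (all requiring grad)."""
    v = [t.detach().clone() for t in vec]
    acc = [t.detach().clone() for t in vec]  # j = 0 term
    grads = grad(train_loss, params, create_graph=True)
    for _ in range(n_steps):
        # Hessian-vector product via double backward
        hvp = grad(grads, params, grad_outputs=v, retain_graph=True)
        v = [vi - lr * hi for vi, hi in zip(v, hvp)]
        acc = [ai + vi for ai, vi in zip(acc, v)]
    return [lr * ai for ai in acc]


def hypergradient(val_loss, train_loss, params, hparams):
    """Indirect hypergradient
    - (d^2 L_train / d hparams d params) H^{-1} (d L_val / d params).
    Any direct dependence of val_loss on hparams would add a further
    direct-gradient term, omitted here for brevity."""
    v = grad(val_loss, params, retain_graph=True)
    ihvp = neumann_ihvp(train_loss, params, v)
    dtrain_dw = grad(train_loss, params, create_graph=True)
    indirect = grad(dtrain_dw, hparams, grad_outputs=ihvp, allow_unused=True)
    return [-g if g is not None else None for g in indirect]

The Neumann series is only one choice of approximation; the key design point is that the Hessian is never materialized, only Hessian-vector products are computed via double backward.
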

Citation

To cite this repository,

@software{hypergrad,
    author = {Ryuichiro Hataya},
    title = {{hypergrad}},
    url = {https://github.com/moskomule/hypergrad},
    year = {2023}
}

hypergrad is developed as part of the following research project:

@inproceedings{hataya2023nystrom,
    author = {Ryuichiro Hataya and Makoto Yamada},
    title = {{Nystr\"om Method for Accurate and Scalable Implicit Differentiation}},
    booktitle = {AISTATS},
    year = {2023},
}