Hype

Hype is a proof-of-concept deep learning library that lets you perform optimization on compositional machine learning systems built from many components, even when those components themselves internally perform optimization. This is enabled by nested automatic differentiation (AD), which gives you access to the exact derivative of any floating-point value in your code with respect to any other. Underlying computations are run by a BLAS/LAPACK backend (OpenBLAS by default).
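Hype itself is an F# library, and the sketch below is not its API. As a language-agnostic illustration of the mechanism behind "exact derivative of any floating-point value with respect to any other", here is a minimal forward-mode AD in Python using dual numbers, where each value carries a tangent that is propagated through arithmetic:

```python
# Minimal forward-mode AD with dual numbers: a hypothetical illustration of
# the mechanism underlying AD, not Hype's actual (F#) API.

class Dual:
    """A value paired with its derivative (tangent)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def deriv(f, x):
    """Exact derivative of f at x, via a unit tangent perturbation."""
    return f(Dual(x, 1.0)).dot

# d/dx (x*x + 3x) at x = 2 is 2*2 + 3 = 7, computed exactly rather than
# approximated by finite differences.
print(deriv(lambda x: x * x + 3 * x, 2.0))  # → 7.0
```

Full nested AD, as in Hype, additionally requires tagging perturbations so that derivatives taken inside other derivatives do not get confused; that bookkeeping is omitted here.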


Keywords
Deep, Learning, Machine, Optimization, Neural, Networks
License
MIT
Install
Install-Package Hype -Version 0.1.3

Documentation

Hype: Compositional Machine Learning and Hyperparameter Optimization

Hype is a proof-of-concept deep learning library that lets you perform optimization on compositional machine learning systems built from many components, even when those components themselves internally perform optimization.
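One thing this enables is hyperparameter optimization with hypergradients: the exact derivative of an inner optimizer's final loss with respect to one of its own hyperparameters, obtained by differentiating straight through the unrolled inner loop. The following standalone Python sketch (hypothetical, not Hype's F# API) differentiates five gradient-descent steps with respect to the learning rate:

```python
# Hypothetical sketch (not Hype's API): a hypergradient computed by
# differentiating through an inner optimization loop with dual numbers.

class Dual:
    """A value paired with its derivative (tangent)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __rsub__(self, o):
        return Dual(o) - self
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def inner_loss(lr):
    # Inner component: 5 gradient-descent steps on f(w) = (w - 3)^2,
    # whose gradient is 2 * (w - 3).
    w = Dual(0.0)
    for _ in range(5):
        w = w - lr * (2 * (w - 3.0))
    return (w - 3.0) * (w - 3.0)

# Exact d(loss)/d(lr) at lr = 0.1, propagated through the whole inner loop.
# The sign tells the outer optimizer which way to move the learning rate.
print(inner_loss(Dual(0.1, 1.0)).dot)
```

Because the inner update here is just additions and multiplications, one level of forward-mode AD through the loop suffices; Hype's nested AD generalizes this to components that themselves differentiate.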

It is developed by Atılım Güneş Baydin and Barak A. Pearlmutter, at the Brain and Computation Lab, National University of Ireland Maynooth.

This work is supported by Science Foundation Ireland grant 09/IN.1/I2637.

Please visit the project website for documentation and tutorials.

You can join the Gitter chat room at https://gitter.im/hypelib/Hype if you want to chat with us.



License

Hype is released under the MIT license.