Abhilash1-optimizers

Stochastic gradient-based optimizers for model convergence, with support for activation functions


Keywords
adam, adagrad, adamax, optimizers, stochastic, sgd, rmsprop, momentum, nesterov
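
To illustrate the kind of optimizer listed above, here is a minimal, self-contained sketch of SGD with momentum in plain Python. This is an illustration of the general technique only, not this package's actual API; the function name and parameters are hypothetical.

```python
# Illustrative sketch (not this package's API): classic SGD with momentum,
# one of the optimizers named in the keywords above.

def sgd_momentum(grad, x0, lr=0.1, beta=0.9, steps=200):
    """Minimize a scalar function given its gradient, using SGD with momentum."""
    x, v = x0, 0.0
    for _ in range(steps):
        v = beta * v + grad(x)   # accumulate an exponentially decayed gradient sum
        x = x - lr * v           # step against the accumulated direction
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x_min = sgd_momentum(lambda x: 2 * (x - 3), x0=0.0)
```

The momentum term `beta` smooths successive gradients, which typically speeds convergence on ill-conditioned objectives compared with vanilla SGD.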
License
MIT
Install
pip install Abhilash1-optimizers==0.1