keras-adabound

AdaBound optimizer in Keras


Keywords
adabound, keras, optimizer
License
MIT
Install
pip install keras-adabound==0.6.0

Documentation

Keras AdaBound

AdaBound optimizer in Keras.

Install

pip install keras-adabound

Usage

Use the optimizer

from keras_adabound import AdaBound

model.compile(optimizer=AdaBound(lr=1e-3, final_lr=0.1), loss=model_loss)
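Here, lr is the initial Adam-style step size and final_lr is the SGD-style learning rate that the dynamic bounds converge to. As a minimal end-to-end sketch (the architecture, data, and hyper-parameter values below are placeholders, not part of this package):

import numpy as np
import keras
from keras_adabound import AdaBound

# Toy classifier; replace the architecture and data with your own.
model = keras.models.Sequential([
    keras.layers.Dense(32, activation='relu', input_shape=(16,)),
    keras.layers.Dense(4, activation='softmax'),
])

# lr: initial (Adam-like) step size; final_lr: the SGD-like rate the bounds approach.
model.compile(optimizer=AdaBound(lr=1e-3, final_lr=0.1),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

x = np.random.random((256, 16))
y = keras.utils.to_categorical(np.random.randint(4, size=256), num_classes=4)
model.fit(x, y, epochs=2, batch_size=32)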

Load with custom objects

from keras_adabound import AdaBound

model = keras.models.load_model(model_path, custom_objects={'AdaBound': AdaBound})
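For completeness, a minimal save/load round trip might look as follows (the file name is only an illustration):

import keras
from keras_adabound import AdaBound

# The optimizer and its configuration are serialized together with the model.
model.save('adabound_model.h5')
model = keras.models.load_model('adabound_model.h5',
                                custom_objects={'AdaBound': AdaBound})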

About weight decay

Unlike the official repository, this optimizer does not take a weight_decay argument; the same effect can be achieved by adding L2 regularizers to the weights:

import keras

# WEIGHT_DECAY is the decay coefficient you would have passed to the official
# optimizer; the L2 penalty (WEIGHT_DECAY / 2) * ||w||^2 contributes
# WEIGHT_DECAY * w to the gradient.
regularizer = keras.regularizers.l2(WEIGHT_DECAY / 2)

# Attach the regularizer to every trainable layer that supports it.
for layer in model.layers:
    for attr in ['kernel_regularizer', 'bias_regularizer']:
        if hasattr(layer, attr) and layer.trainable:
            setattr(layer, attr, regularizer)
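Note that attaching a regularizer to a layer that has already been built does not register the corresponding loss by itself. One common workaround (a sketch under that assumption, not part of this package) is to rebuild the model from its config so the regularizers are picked up, restore the weights, and compile again:

import keras
from keras_adabound import AdaBound

# Rebuild the model so the newly attached regularizers are applied when the
# layers are constructed again, then restore the trained weights.
model_json = model.to_json()
weights = model.get_weights()
model = keras.models.model_from_json(model_json)
model.set_weights(weights)
model.compile(optimizer=AdaBound(lr=1e-3, final_lr=0.1), loss=model_loss)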