# R&D tools

## Installation

```
pip install rndtools==3.0.7
```

or directly from the repository:

```
pip install git+https://git.identt.pl/identt/rnd-tools.git#egg=rndtools
```

## What is it?
When you call the `train_model` function, this framework does a few useful things for you: it creates the model directory, saves the model architecture to `architecture.json`, plots the model, saves the model source code, and instantiates extra training callbacks. To use it, you implement three functions of your own and pass them to `train_model` as `get_model_function`, `training_function`, and `loading_data_function`.

Implement a `load_data` function that loads your dataset and returns it in whatever form your training function expects.
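The README does not show a `load_data` example, so here is a minimal sketch. The zero-argument signature and the `.X`/`.Y` attributes are assumptions inferred from the `train` example below; the random data is purely illustrative:

```python
import numpy as np
from collections import namedtuple

# Hypothetical container: train_model passes whatever load_data returns
# straight to the training function as its `data` argument.
Dataset = namedtuple('Dataset', ['X', 'Y'])

def load_data():
    # Illustrative only: random features and binary labels shaped to
    # match the 8-input model from the get_model example below.
    X = np.random.rand(100, 8)
    Y = np.random.randint(0, 2, size=(100, 1))
    return Dataset(X=X, Y=Y)
```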
Implement a function that returns a compiled Keras model. The function should not take any parameters. Example:

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam

def get_model():
    model = Sequential()
    model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))
    model.add(Dense(8, init='uniform', activation='relu'))
    model.add(Dense(1, init='uniform', activation='sigmoid'))
    model.compile(
        optimizer=Adam(),
        loss='binary_crossentropy',
        metrics=['accuracy']
    )
    return model
```
Implement a function that trains the model. The function should return the model's training history. Example:

```python
def train(data, model, model_folder, callbacks=None):
    if callbacks is None:
        callbacks = []
    history = model.fit(data.X, data.Y, nb_epoch=150, batch_size=10, callbacks=callbacks)
    return history
```

Pay attention to the `callbacks` parameter: the framework passes in some extra callbacks that you should add to the model's callbacks. Also note that the `data` parameter receives whatever your `load_data` function returns.
Example:

```python
>>> import rndtools as rnd
>>> rnd.train.train_model(
        model_dir,
        get_model_function=get_model,
        training_function=train,
        loading_data_function=load_data
    )
```
```
Model path: /home/rd/notebooks/documents-detector/damian/models/in_the_wild/unet_mrz/7
------------------------------
Creating dirs...
------------------------------
------------------------------
Creating and compiling model...
------------------------------
------------------------------
Saving architecture...
------------------------------
------------------------------
Plotting model...
------------------------------
------------------------------
Saving model source code...
------------------------------
------------------------------
Loading data...
------------------------------
------------------------------
Instantiating callbacks...
------------------------------
------------------------------
Training model...
------------------------------
Epoch 1/1000
Finished!
```
Sometimes there is so much data that it cannot all fit in memory. In that case you can divide your dataset into parts and use `DatasetInPartsGenerator`, which loads the parts in turn, so only one part of the dataset is in memory at a time.
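The idea behind `DatasetInPartsGenerator` can be sketched with a plain Python generator. This illustrates the concept only, not the rndtools API; the `part_loaders` argument is a hypothetical name:

```python
def dataset_in_parts(part_loaders):
    """Yield dataset parts one at a time.

    `part_loaders` is a sequence of callables, each loading one part of
    the dataset (e.g. from disk). Only the currently yielded part is
    referenced here, so the previous part can be garbage-collected
    before the next one is loaded.
    """
    for load_part in part_loaders:
        yield load_part()
```

You would then train on each part in turn, for example by calling `model.fit` once per yielded part.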