Saving and Loading Models

An introduction to saving and loading trained models.

Requirements

In this tutorial, the following modules are required.

  • Python 3.4
  • Numpy 1.13.1
  • Matplotlib 2.0.2
  • h5py 2.7.0
  • ReNom 2.0.0
In [1]:
import numpy as np
import matplotlib.pyplot as plt
import renom as rm
from renom.utility.trainer import Trainer
from renom.utility.distributor import NdarrayDistributor

Prepare data

First, we prepare a dataset. We define a population distribution and sample data points from it.

In [2]:
# Data population distribution
def population(x):
    return np.sin(x*np.pi*2) + np.random.randn(*x.shape)*0.1

x = np.random.rand(1000, 1)
y = population(x)

dist = NdarrayDistributor(x, y)

# Split the distributor into train_dist and test_dist at a ratio of 9:1.
train_dist, test_dist = dist.split(0.9)

# Plot dataset
plt.scatter(*train_dist.data(), label="train data")
plt.scatter(*test_dist.data(), label="test data")
plt.legend()
plt.grid()
plt.title("Population of dataset")
plt.ylabel("y")
plt.xlabel("x")
plt.show()
[Figure: "Population of dataset" — scatter plot of the train and test data]

Model definition

Next, we define a simple 2-layer neural network using a Sequential model. Both the input size and the output size are 1.

In [3]:
model = rm.Sequential([
    rm.Dense(3),
    rm.Relu(),
    rm.Dense(1),
])
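
The Sequential model exposes its layers as attributes such as model.l0, which is how the weight parameters are accessed later in this tutorial. As a quick check (a minimal sketch, not part of the original notebook; the attribute names follow the model.l0.params usage shown below):

# Inspect the first layer of the Sequential model and its parameters.
# The params dictionary may still be empty before the model has been
# run on any data.
print(model.l0)
print(model.l0.params)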

Train the model using the Trainer

We train the model using the Trainer class. Its usage is introduced in “Tutorial Trainer”.

In [4]:
trainer = Trainer(model,
                  batch_size=64,
                  loss_func=rm.mean_squared_error,
                  num_epoch=10,
                  optimizer=rm.Adam())
trainer.train(train_dist, test_dist)
epoch  0: avg loss 0.2547: avg test loss 0.2742: 15it [00:00, 777.08it/s]
epoch  1: avg loss 0.2584: avg test loss 0.2600: 15it [00:00, 758.43it/s]
epoch  2: avg loss 0.2480: avg test loss 0.2651: 15it [00:00, 784.19it/s]
epoch  3: avg loss 0.2661: avg test loss 0.2618: 15it [00:00, 845.00it/s]
epoch  4: avg loss 0.2553: avg test loss 0.2665: 15it [00:00, 813.26it/s]
epoch  5: avg loss 0.2531: avg test loss 0.2576: 15it [00:00, 826.61it/s]
epoch  6: avg loss 0.2609: avg test loss 0.2733: 15it [00:00, 826.14it/s]
epoch  7: avg loss 0.2565: avg test loss 0.2626: 15it [00:00, 874.02it/s]
epoch  8: avg loss 0.2554: avg test loss 0.2594: 15it [00:00, 798.51it/s]
epoch  9: avg loss 0.2526: avg test loss 0.2592: 15it [00:00, 756.71it/s]

Save weight parameters of model

Here we save the weight parameters of the trained model. To save them, call the “save” method of the model object, passing the path of the file to write. The file is saved in HDF5 format, so this method requires the h5py module.

In [5]:
# Check the weight parameters
print(model.l0.params)

# Save the weight parameters
model.save("model.h5")
{'b': Variable([[ 0.,  0.,  0.]]), 'w': Variable([[-0.08574371, -0.252234  , -0.11439074]])}
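
If you want to check what was written, the saved file can be opened directly with h5py (a minimal sketch, assuming the file "model.h5" created above):

import h5py

# Open the file written by model.save() and print every group and
# dataset stored in it, to confirm the weight parameters are there.
with h5py.File("model.h5", "r") as f:
    f.visit(print)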

Reset Model Parameters

After saving the weight parameters, we reset them so that we can confirm they are restored when loaded.

In [6]:
for layer in model:
    setattr(layer, "params", {})

# Confirm the weights have been reset
print(model.l0.params)
{}

Load weight parameters into the model

Then we load the saved parameters from the file and set them on the model.

In [7]:
# Load and set the weight parameters.
model.load("model.h5")

print(model.l0.params)
{'b': Variable([[ 0.,  0.,  0.]]), 'w': Variable([[-0.08574371, -0.252234  , -0.11439074]])}
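
As a final check, you can run the restored model on the test data and compare its predictions with the targets (a minimal sketch; it assumes the Sequential model can be called directly on a NumPy array, as in the training tutorials):

# Predict on the held-out test data with the restored weights and
# plot the predictions against the true targets.
test_x, test_y = test_dist.data()
pred = model(test_x).as_ndarray()

plt.scatter(test_x, test_y, label="test data")
plt.scatter(test_x, pred, label="prediction")
plt.legend()
plt.grid()
plt.title("Predictions of the restored model")
plt.xlabel("x")
plt.ylabel("y")
plt.show()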