renom.utility package

class renom.utility.initializer.Initializer

Bases: object

Base class of initializers.

When initializing a parameterized layer class (Dense, Conv2d, Lstm, ...), you can select the initialization method by passing an initializer object to the layer, as in the following example.

Example

>>> import renom as rm
>>> from renom.utility.initializer import GlorotUniform
>>>
>>> layer = rm.Dense(output_size=2, input_size=2, initializer=GlorotUniform())
>>> print(layer.params.w)
[[-0.55490332 -0.14323548]
 [ 0.00059367 -0.28777076]]
class renom.utility.initializer.GlorotUniform

Bases: renom.utility.initializer.Initializer

The Glorot uniform initializer [1] initializes parameters with samples drawn from the uniform distribution U(min, max):

\begin{split}
&U(min,\ max) \\
&max = \sqrt{\frac{6}{input\_size + output\_size}} \\
&min = -\sqrt{\frac{6}{input\_size + output\_size}}
\end{split}
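As a quick sketch (illustrative only, not library code), the bound for the Dense layer in the base-class example above (input_size=2, output_size=2) can be computed directly:

>>> import math
>>> print("{:.4f}".format(math.sqrt(6.0 / (2 + 2))))
1.2247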
class renom.utility.initializer.GlorotNormal

Bases: renom.utility.initializer.Initializer

The Glorot normal initializer [1] initializes parameters with samples drawn from the normal distribution N(0, std):

\begin{split}
&N(0,\ std) \\
&std = \sqrt{\frac{2}{input\_size + output\_size}}
\end{split}
[1] Xavier Glorot, Yoshua Bengio. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of AISTATS, 2010.
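As a similar sketch, the standard deviation used for a layer with input_size=2 and output_size=2 works out to:

>>> import math
>>> print("{:.4f}".format(math.sqrt(2.0 / (2 + 2))))
0.7071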
class renom.utility.initializer.Gaussian(mean=0.0, std=0.1)

Bases: renom.utility.initializer.Initializer

Gaussian initializer. Initializes parameters with samples drawn from the normal distribution N(mean, std).

Parameters:
  • mean ( float ) – Mean value of normal distribution.
  • std ( float ) – Standard deviation value of normal distribution.
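Example

A minimal usage sketch following the pattern of the base-class example above; the sampled weights themselves are random, so only the shape is checked here:

>>> import renom as rm
>>> from renom.utility.initializer import Gaussian
>>>
>>> layer = rm.Dense(output_size=2, input_size=2, initializer=Gaussian(mean=0.0, std=0.01))
>>> layer.params.w.shape  # weights drawn from N(0.0, 0.01)
(2, 2)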
class renom.utility.initializer.Uniform(min=-1.0, max=1.0)

Bases: renom.utility.initializer.Initializer

Uniform initializer. Initializes parameters with samples drawn from the uniform distribution U(min, max).

Parameters:
  • min ( float ) – Minimum limit of uniform distribution.
  • max ( float ) – Maximum limit of uniform distribution.
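Example

A minimal sketch along the same lines, checking that every sampled weight lies inside [min, max] (the exact values vary from run to run):

>>> import numpy as np
>>> import renom as rm
>>> from renom.utility.initializer import Uniform
>>>
>>> layer = rm.Dense(output_size=2, input_size=2, initializer=Uniform(min=-0.1, max=0.1))
>>> w = np.array(layer.params.w)
>>> bool((w >= -0.1).all() and (w <= 0.1).all())
True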
class renom.utility.searcher.Searcher(parameters)

Bases: object

Base class of searchers.

Searcher classes search for the hyper parameters that yield the lowest value.

Parameters: parameters ( dict ) – Dictionary which contains the parameter name as a key and each parameter space as a value.

Example

>>> import renom as rm
>>> from renom.utility.searcher import GridSearcher
>>> params = {
...     "p1":[1, 2, 3],
...     "p2":[4, 5, 6],
... }
...
>>> searcher = GridSearcher(params)
>>>
>>> for p in searcher.suggest():
...     searcher.set_result(p["p1"] + p["p2"])
...
>>> bests = searcher.best()
>>> for i in range(len(bests)):
...     print("{}: parameter {} value {}".format(i+1, bests[i][0], bests[i][1]))
...
1: parameter {'p2': 4, 'p1': 1} value 5
2: parameter {'p2': 4, 'p1': 2} value 6
3: parameter {'p2': 5, 'p1': 1} value 6
set_result(result, params=None)

Sets the result obtained with the yielded hyper parameters on the searcher object.

Parameters:
  • result ( float ) – The result of yielded hyper parameter.
  • params ( dict ) – The hyper parameters used in the model. If None is given, the result is attributed to the most recently yielded hyper parameters.
suggest(max_iter)

This method yields the next hyper parameters to evaluate.

Parameters: max_iter ( int ) – Maximum iteration number of parameter search.
Yields: dict – Dictionary of hyper parameter.
best(num=3)

Returns the best hyper parameters found. By default, this method returns the top 3 hyper parameter sets.

Parameters: num ( int ) – The number of hyper parameter sets to return.
Returns: A list of pairs of hyper parameter dictionaries and their results, ordered from best to worst.
Return type: list
class renom.utility.searcher.GridSearcher(parameters)

Bases: renom.utility.searcher.Searcher

Grid searcher class.

This class searches the parameter space exhaustively using grid search.

Parameters: parameters ( dict ) – Dictionary which contains the parameter name as a key and each parameter space as a value.
class renom.utility.searcher.RandomSearcher(parameters)

Bases: renom.utility.searcher.Searcher

Random searcher class.

This class randomly samples hyper parameters from the parameter space in search of those that yield the lowest loss.

Parameters: parameters ( dict ) – Dictionary which contains the parameter name as a key and each parameter space as a value.
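Example

A usage sketch assuming the same interface as the base Searcher example above; results are omitted because the sampled parameters are random:

>>> from renom.utility.searcher import RandomSearcher
>>> params = {
...     "p1": [1, 2, 3],
...     "p2": [4, 5, 6],
... }
>>> searcher = RandomSearcher(params)
>>> for p in searcher.suggest(max_iter=5):
...     searcher.set_result(p["p1"] + p["p2"])
...
>>> best = searcher.best(num=1)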
class renom.utility.searcher.BayesSearcher(parameters)

Bases: renom.utility.searcher.Searcher

Bayes searcher class.

This class performs hyper parameter search based on Bayesian optimization.

Parameters: parameters ( dict ) – Dictionary which contains the parameter name as a key and each parameter space as a value.

Note

This class requires the module GPy [1]. You can install it with pip: pip install gpy

[1] GPy - Gaussian Process framework http://sheffieldml.github.io/GPy/
suggest(max_iter=10, random_iter=3)
Parameters:
  • max_iter ( int ) – Maximum iteration number of parameter search.
  • random_iter ( int ) – Number of random search iterations.
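Example

A usage sketch under the same assumptions as the base Searcher example (GPy installed); outputs are omitted because the search is stochastic:

>>> from renom.utility.searcher import BayesSearcher
>>> params = {
...     "p1": [1, 2, 3],
...     "p2": [4, 5, 6],
... }
>>> searcher = BayesSearcher(params)
>>> for p in searcher.suggest(max_iter=10, random_iter=3):
...     searcher.set_result(p["p1"] + p["p2"])
...
>>> best = searcher.best()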
class renom.utility.trainer.Trainer(model, num_epoch, loss_func, batch_size, optimizer=None, shuffle=True, events=None, num_gpu=1)

Bases: object

Trainer class.

This class owns the training loop. It executes forward propagation, back propagation and weight parameter updates for the specified number of epochs.

Parameters:
  • model ( Model ) – Model to be trained.
  • num_epoch ( int ) – Number of epochs.
  • loss_func ( Node ) – Loss function.
  • batch_size ( int ) – Batch size.
  • optimizer ( Optimizer ) – Gradient descent algorithm.
  • shuffle ( bool ) – If True, mini batches are drawn from shuffled data.
  • events ( dict ) – Dictionary of callback functions.
  • num_gpu ( int ) – Number of GPUs to use.

Example

>>> import numpy as np
>>> import renom as rm
>>> from renom.utility.trainer import Trainer
>>> from renom.utility.distributor import NdarrayDistributor
>>> x = np.random.rand(300, 50)
>>> y = np.random.rand(300, 1)
>>> model = rm.Dense(1)
>>> trainer = Trainer(model, 10, rm.mean_squared_error, 3, rm.Sgd(0.1))
>>> trainer.train(NdarrayDistributor(x, y))
epoch  0: avg loss 0.1597: 100%|██████████| 100/100.0 [00:00<00:00, 1167.85it/s]
epoch  1: avg loss 0.1131: 100%|██████████| 100/100.0 [00:00<00:00, 1439.25it/s]
epoch  2: avg loss 0.1053: 100%|██████████| 100/100.0 [00:00<00:00, 1413.42it/s]
epoch  3: avg loss 0.0965: 100%|██████████| 100/100.0 [00:00<00:00, 1388.67it/s]
epoch  4: avg loss 0.0812: 100%|██████████| 100/100.0 [00:00<00:00, 1445.61it/s]
epoch  5: avg loss 0.0937: 100%|██████████| 100/100.0 [00:00<00:00, 1432.99it/s]
epoch  6: avg loss 0.0891: 100%|██████████| 100/100.0 [00:00<00:00, 1454.68it/s]
epoch  7: avg loss 0.0992: 100%|██████████| 100/100.0 [00:00<00:00, 1405.73it/s]
epoch  8: avg loss 0.0933: 100%|██████████| 100/100.0 [00:00<00:00, 1401.55it/s]
epoch  9: avg loss 0.1090: 100%|██████████| 100/100.0 [00:00<00:00, 1343.97it/s]
train(train_distributor, test_distributor=None)

Train method. This method executes the training loop. If test_distributor is given, the validation loss is also calculated.

Parameters:
  • train_distributor ( Distributor ) – Distributor for yielding train data.
  • test_distributor ( Distributor ) – Distributor for yielding test data.
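A short sketch showing validation, mirroring the Trainer example above with the data split into train and test portions (progress and loss output omitted, since the values depend on the random data):

>>> import numpy as np
>>> import renom as rm
>>> from renom.utility.trainer import Trainer
>>> from renom.utility.distributor import NdarrayDistributor
>>>
>>> x, y = np.random.rand(300, 50), np.random.rand(300, 1)
>>> trainer = Trainer(rm.Dense(1), 10, rm.mean_squared_error, 3, rm.Sgd(0.1))
>>> trainer.train(NdarrayDistributor(x[:240], y[:240]),
...               test_distributor=NdarrayDistributor(x[240:], y[240:]))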
test(data)

Test method. This method executes forward propagation for the given data.

Parameters: data ( ndarray ) – Input data.
Returns: Output of forward propagation for the given data.
Return type: ndarray
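Continuing the training sketch above, a hedged example of running inference on new data; the output shape follows from Dense(1):

>>> x_new = np.random.rand(5, 50)
>>> pred = trainer.test(x_new)  # forward propagation only, no parameter updates
>>> pred.shape
(5, 1)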