renom.layers.loss

class renom.layers.loss.mean_squared_error.MeanSquaredError

This function evaluates the loss between the target y and the output x using mean squared error.

E(x) = \frac{1}{2N}\sum_{n}^{N}\sum_{k}^{K}(x_{nk}-y_{nk})^2

N is the batch size and K is the number of output units.

Parameters:
  • x ( ndarray, Node ) – Input array.
  • y ( ndarray, Node ) – Target array.
Raises:

AssertionError – An assertion error will be raised if the given tensor has fewer than 2 dimensions.

Example

>>> import renom as rm
>>> import numpy as np
>>>
>>> x = np.array([[1, 1]])
>>> y = np.array([[-1, -1]])
>>> print(x.shape, y.shape)
(1, 2) (1, 2)
>>> loss = rm.mean_squared_error(x, y)
>>> print(loss)
mean_squared_error(4.0)
class renom.layers.loss.clipped_mean_squared_error.ClippedMeanSquaredError ( clip=1.0 )

Clipped mean squared error function. In the forward propagation, this function performs the same calculation as mean squared error.

In the backward propagation, the gradient is clipped according to the following formula.

\frac{dE}{dx}_{clipped} = max(min(\frac{dE}{dx}, clip), -clip)
Parameters:
  • x ( ndarray, Node ) – Input data.
  • y ( ndarray, Node ) – Target data.
  • clip ( float, tuple ) – Clipping threshold.
Raises:

AssertionError – An assertion error will be raised if the given tensor has fewer than 2 dimensions.
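The clipping formula above can be illustrated with a minimal NumPy sketch. This is not ReNom's internal implementation, only a walk-through of the formula: the raw mean squared error gradient (x - y) / N is clipped element-wise to the interval [-clip, clip].

```python
import numpy as np

x = np.array([[3.0, 0.2]])   # output
y = np.array([[-1.0, 0.0]])  # target
clip = 1.0
N = x.shape[0]               # batch size

grad = (x - y) / N                    # raw MSE gradient: [[4.0, 0.2]]
clipped = np.clip(grad, -clip, clip)  # max(min(grad, clip), -clip): [[1.0, 0.2]]
print(grad)
print(clipped)
```

The first element exceeds the threshold and is clipped to 1.0; the second is within [-1, 1] and passes through unchanged.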

class renom.layers.loss.sigmoid_cross_entropy.SigmoidCrossEntropy

This function evaluates the loss between the target y and the output z of the sigmoid activation using cross entropy.

\begin{split}z_{nk} &= \frac{1}{1 + \exp(-x_{nk})} \\ E(x) &= -\frac{1}{N}\sum_{n}^{N}\sum_{k}^{K}\left\{y_{nk}\log(z_{nk})+(1-y_{nk})\log(1-z_{nk})\right\}\end{split}
Parameters:
  • x ( ndarray, Node ) – Input array.
  • y ( ndarray, Node ) – Target array.
Raises:

AssertionError – An assertion error will be raised if the given tensor has fewer than 2 dimensions.
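The two steps of the formula above can be traced with a small NumPy sketch (illustrative only, not ReNom's implementation): apply the sigmoid to each element, then average the binary cross entropy over the batch.

```python
import numpy as np

x = np.array([[0.0, 2.0]])  # raw scores (pre-sigmoid)
y = np.array([[0.0, 1.0]])  # binary targets
N = x.shape[0]              # batch size

z = 1.0 / (1.0 + np.exp(-x))  # sigmoid activation
loss = -np.sum(y * np.log(z) + (1 - y) * np.log(1 - z)) / N
print(loss)  # loss is about 0.8201
```

Note that each output unit is treated as an independent binary classification, so y need not be one-hot.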

class renom.layers.loss.softmax_cross_entropy.SoftmaxCrossEntropy

This function evaluates the loss between the target y and the output z of the softmax activation using cross entropy.

\begin{split}z_{nk} &= \frac{\exp(x_{nk})}{\sum_{j=1}^{K}\exp(x_{nj})} \\ E(x) &= -\frac{1}{N}\sum_{n}^{N}\sum_{k}^{K}y_{nk}\log(z_{nk})\end{split}
Parameters:
  • x ( ndarray, Node ) – Input array.
  • y ( ndarray, Node ) – Target array.
Raises:

AssertionError – An assertion error will be raised if the given tensor has fewer than 2 dimensions.
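As with the sigmoid case, the formula can be traced with a small NumPy sketch (illustrative only, not ReNom's implementation): normalize the scores with the softmax along the class axis, then take the cross entropy against a one-hot target.

```python
import numpy as np

x = np.array([[1.0, 2.0, 3.0]])  # raw scores (pre-softmax)
y = np.array([[0.0, 0.0, 1.0]])  # one-hot target
N = x.shape[0]                   # batch size

# Softmax over the class axis; subtracting the row max is a standard
# numerical-stability trick and does not change the result.
e = np.exp(x - x.max(axis=1, keepdims=True))
z = e / e.sum(axis=1, keepdims=True)

loss = -np.sum(y * np.log(z)) / N
print(loss)  # loss is about 0.4076
```

Because y is one-hot, only the log-probability of the correct class contributes to the sum.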