renom.layers.activation package

class renom.layers.activation.elu.Elu(alpha=0.01)

Bases: object

The exponential linear unit (ELU) [1] activation function is described by the following formula:

f(x)=max(x, 0) + alpha*min(exp(x)-1, 0)
Parameters:
  • x ( ndarray, Variable ) – Input numpy array or instance of Variable.
  • alpha ( float ) – Coefficient multiplied by exponentiated values.

Example

>>> import renom as rm
>>> import numpy as np
>>> x = np.array([[1, -1]])
>>> x
array([[ 1, -1]])
>>> rm.elu(x)
elu([[ 1.  , -0.00632121]])
>>> # instantiation
>>> activation = rm.Elu()
>>> activation(x)
elu([[ 1.  , -0.00632121]])
[1] Djork-Arné Clevert, Thomas Unterthiner, Sepp Hochreiter (2015). Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs). Published as a conference paper at ICLR 2016
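
As a cross-check of the formula, the plain NumPy sketch below reproduces the example values (elu_ref is an illustrative name used only here, not part of ReNom):

>>> import numpy as np
>>> def elu_ref(x, alpha=0.01):
...     # illustrative sketch of f(x) = max(x, 0) + alpha*min(exp(x)-1, 0)
...     return np.maximum(x, 0) + alpha * np.minimum(np.exp(x) - 1, 0)
>>> elu_ref(np.array([[1., -1.]])).round(8).tolist()
[[1.0, -0.00632121]]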
class renom.layers.activation.leaky_relu.LeakyRelu(slope=0.01)

Bases: object

The leaky ReLU [2] activation function is described by the following formula:

f(x)=max(x, 0)+min(slope*x, 0)
Parameters:
  • x ( ndarray, Variable ) – Input numpy array or instance of Variable.
  • slope ( float ) – Coefficient multiplied by negative values.

Example

>>> import renom as rm
>>> import numpy as np
>>> x = np.array([[1, -1]])
>>> x
array([[ 1, -1]])
>>> rm.leaky_relu(x, slope=0.01)
leaky_relu([[ 1.  , -0.01]])
>>> # instantiation
>>> activation = rm.LeakyRelu(slope=0.01)
>>> activation(x)
leaky_relu([[ 1.  , -0.01]])
[2] Andrew L. Maas, Awni Y. Hannun, Andrew Y. Ng (2013). Rectifier Nonlinearities Improve Neural Network Acoustic Models
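
The same kind of plain NumPy sketch reproduces the example values from the formula (leaky_relu_ref is an illustrative name, not part of ReNom):

>>> import numpy as np
>>> def leaky_relu_ref(x, slope=0.01):
...     # illustrative sketch of f(x) = max(x, 0) + min(slope*x, 0)
...     return np.maximum(x, 0) + np.minimum(slope * x, 0)
>>> leaky_relu_ref(np.array([[1., -1.]])).tolist()
[[1.0, -0.01]]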
class renom.layers.activation.relu.Relu

Bases: object

The rectified linear unit (ReLU) activation function is described by the following formula:

f(x)=max(x, 0)
Parameters: x ( ndarray, Node ) – Input numpy array or Node instance.

Example

>>> import renom as rm
>>> import numpy as np
>>> x = np.array([[1, -1]])
>>> x
array([[ 1, -1]])
>>> rm.relu(x)
relu([[ 1.  , 0.]])
>>> # instantiation
>>> activation = rm.Relu()
>>> activation(x)
relu([[ 1.  , 0.]])
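
For completeness, a plain NumPy sketch of the formula (relu_ref is an illustrative name, not part of ReNom):

>>> import numpy as np
>>> def relu_ref(x):
...     # illustrative sketch of f(x) = max(x, 0)
...     return np.maximum(x, 0)
>>> relu_ref(np.array([[1., -1.]])).tolist()
[[1.0, 0.0]]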
class renom.layers.activation.selu.Selu

Bases: object

The scaled exponential linear unit [3] activation function is described by the following formula:

a = 1.6732632423543772848170429916717
b = 1.0507009873554804934193349852946
f(x) = b*(max(x, 0) + min(0, a*(exp(x)-1)))
Parameters: x ( ndarray, Node ) – Input numpy array or Node instance.

Example

>>> import renom as rm
>>> import numpy as np
>>> x = np.array([[1, -1]])
>>> x
array([[ 1, -1]])
>>> rm.selu(x)
selu([[ 1.05070102, -1.11133075]])
>>> # instantiation
>>> activation = rm.Selu()
>>> activation(x)
selu([[ 1.05070102, -1.11133075]])
[3] Günter Klambauer, Thomas Unterthiner, Andreas Mayr, Sepp Hochreiter (2017). Self-Normalizing Neural Networks
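
Because the scaling constants matter here, the plain NumPy sketch below verifies the formula against the example values; the last digits differ slightly from the example output due to floating-point precision (selu_ref is an illustrative name, not part of ReNom):

>>> import numpy as np
>>> a = 1.6732632423543772848170429916717
>>> b = 1.0507009873554804934193349852946
>>> def selu_ref(x):
...     # illustrative sketch of f(x) = b*(max(x, 0) + min(0, a*(exp(x)-1)))
...     return b * (np.maximum(x, 0) + np.minimum(0, a * (np.exp(x) - 1)))
>>> selu_ref(np.array([[1., -1.]])).round(4).tolist()
[[1.0507, -1.1113]]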
class renom.layers.activation.sigmoid.Sigmoid

Bases: object

The sigmoid activation function is described by the following formula:

f(x) = 1/(1 + exp(-x))
Parameters: x ( ndarray, Node ) – Input numpy array or Node instance.

Example

>>> import numpy as np
>>> import renom as rm
>>> x = np.array([1., -1.])
>>> rm.sigmoid(x)
sigmoid([ 0.7310586 ,  0.26894143])
>>> # instantiation
>>> activation = rm.Sigmoid()
>>> activation(x)
sigmoid([ 0.7310586 ,  0.26894143])
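
A plain NumPy sketch of the formula; the last digits may differ slightly from the example output due to floating-point precision (sigmoid_ref is an illustrative name, not part of ReNom):

>>> import numpy as np
>>> def sigmoid_ref(x):
...     # illustrative sketch of f(x) = 1/(1 + exp(-x))
...     return 1.0 / (1.0 + np.exp(-x))
>>> sigmoid_ref(np.array([1., -1.])).round(7).tolist()
[0.7310586, 0.2689414]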
class renom.layers.activation.softmax.Softmax

Bases: object

The softmax activation function is described by the following formula:

f(x_j) = exp(x_j)/sum_i(exp(x_i))
Parameters: x ( ndarray, Variable ) – Input numpy array or instance of Variable.

Example

>>> import renom as rm
>>> import numpy as np
>>> x = np.random.rand(1, 3)
>>> x
array([[ 0.11871966,  0.48498547,  0.7406374 ]])
>>> z = rm.softmax(x)
>>> z
softmax([[ 0.23229694,  0.33505085,  0.43265226]])
>>> np.sum(z, axis=1)
array([ 1.])
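
A plain NumPy sketch of the formula reproduces the example values (softmax_ref is an illustrative name, not part of ReNom); subtracting the row maximum before exponentiating is a standard trick that does not change the result but avoids overflow for large inputs:

>>> import numpy as np
>>> def softmax_ref(x):
...     # illustrative sketch of f(x_j) = exp(x_j)/sum_i(exp(x_i)), computed row-wise
...     e = np.exp(x - np.max(x, axis=1, keepdims=True))
...     return e / np.sum(e, axis=1, keepdims=True)
>>> softmax_ref(np.array([[0.11871966, 0.48498547, 0.7406374]])).round(5).tolist()
[[0.2323, 0.33505, 0.43265]]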
class renom.layers.activation.tanh.Tanh

Bases: object

The hyperbolic tangent activation function is described by the following formula:

f(x) = tanh(x)
Parameters: x ( ndarray, Node ) – Input numpy array or Node instance.

Example

>>> import numpy as np
>>> import renom as rm
>>> x = np.array([1., -1.])
>>> rm.tanh(x)
tanh([ 0.76159418, -0.76159418])
>>> # instantiation
>>> activation = rm.Tanh()
>>> activation(x)
tanh([ 0.76159418, -0.76159418])
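
The formula maps directly onto NumPy's own tanh, which can serve as a quick cross-check of the example values; the last digits may differ slightly due to floating-point precision:

>>> import numpy as np
>>> np.tanh(np.array([1., -1.])).round(7).tolist()
[0.7615942, -0.7615942]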