Activation Layers

class gwu_nn.activation_layers.ActivationLayer(activation)

The ActivationLayer class acts as a connector between a layer and an activation function. It ensures that the activation function is applied correctly during forward and backward propagation.
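The pattern is easiest to see in a small sketch. The class below is illustrative only, not the gwu_nn source; it assumes the layer caches its input on the forward pass and that the constructor receives the activation function together with its derivative (fn and fn_prime are hypothetical names):

    import numpy as np

    class ToyActivationLayer:
        """Illustrative stand-in for ActivationLayer (not gwu_nn source)."""

        def __init__(self, fn, fn_prime):
            self.fn = fn              # the activation function
            self.fn_prime = fn_prime  # its derivative
            self.input = None         # cached during the forward pass

        def forward_propagation(self, input):
            self.input = input        # cache pre-activation values for backprop
            return self.fn(input)

        def backward_propagation(self, output_error, learning_rate):
            # Chain rule: scale the incoming error by the activation's slope.
            # learning_rate is unused; a pure activation layer has no weights.
            return self.fn_prime(self.input) * output_error

    # Example wiring with a sigmoid:
    sigmoid = lambda x: 1 / (1 + np.exp(-x))
    layer = ToyActivationLayer(sigmoid, lambda x: sigmoid(x) * (1 - sigmoid(x)))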

backward_propagation(output_error, learning_rate)

Applies the derivative of the class's activation function to the incoming error during the backward pass

Args:

output_error (np.array): error propagated backward to this layer from the following layer

learning_rate (float): step size for weight updates (an activation layer has no weights, so it is typically unused here)

Returns:

np.array(float): the error propagated backward through this layer to the preceding layer
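A short usage sketch, assuming Sigmoid() can be constructed with no arguments (as its class entry below suggests) and that the backward pass applies the sigmoid derivative in the standard way:

    import numpy as np
    from gwu_nn.activation_layers import Sigmoid

    layer = Sigmoid()
    layer.forward_propagation(np.array([[0.5]]))  # caches the input for backprop
    grad = layer.backward_propagation(np.array([[0.1]]), learning_rate=0.01)
    # If backward applies the sigmoid derivative (standard backprop):
    # sigmoid(0.5) ≈ 0.6225, sigmoid'(0.5) ≈ 0.6225 * (1 - 0.6225) ≈ 0.2350,
    # so the returned error would be ≈ 0.2350 * 0.1 ≈ 0.0235.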

forward_propagation(input)

Applies the class's activation function to the provided input

Args:

input (np.array): the output of the previous layer (the forward pass up to this layer)

Returns:

np.array(float): the activated output (forward pass through this layer)
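For the forward direction, a minimal sketch using the RELU layer documented below (again assuming a no-argument constructor):

    import numpy as np
    from gwu_nn.activation_layers import RELU

    layer = RELU()
    out = layer.forward_propagation(np.array([[-2.0, 0.0, 3.0]]))
    # ReLU zeroes the negative entries: expected output [[0., 0., 3.]]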

class gwu_nn.activation_layers.Sigmoid

Layer that applies the Sigmoid activation function. Inherits forward_propagation and backward_propagation from ActivationLayer

class gwu_nn.activation_layers.RELU

Layer that applies the ReLU activation function. Inherits forward_propagation and backward_propagation from ActivationLayer

class gwu_nn.activation_layers.Softmax

Layer that applies the Softmax activation function. Inherits forward_propagation and backward_propagation from ActivationLayer
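Taken together, a quick comparison of the three layers on the same input (assuming each can be constructed with no arguments):

    import numpy as np
    from gwu_nn.activation_layers import Sigmoid, RELU, Softmax

    x = np.array([[1.0, -1.0, 2.0]])
    for layer in (Sigmoid(), RELU(), Softmax()):
        print(type(layer).__name__, layer.forward_propagation(x))
    # Expected behavior: Sigmoid squashes each entry into (0, 1), RELU zeroes
    # the negatives, and Softmax rescales the row so the entries sum to 1.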