Activation Functions
- class gwu_nn.activation_functions.ActivationFunction
Abstract class that defines base functionality for activation functions
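As a rough sketch, the base class likely resembles the following, with abstract classmethods that each concrete activation overrides; the use of Python's abc module here is an assumption, not confirmed against the gwu_nn source.

```python
from abc import ABC, abstractmethod

class ActivationFunction(ABC):
    """Base functionality shared by all activation functions."""

    @classmethod
    @abstractmethod
    def activation(cls, x):
        """Apply the activation to the input array x (forward pass)."""

    @classmethod
    @abstractmethod
    def activation_partial_derivative(cls, x):
        """Derivative of the activation, used during backpropagation."""
```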
- class gwu_nn.activation_functions.SigmoidActivation
Implements the sigmoid activation function, typically used for logistic regression
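A minimal sketch of how a sigmoid activation could be implemented against this interface; the method bodies are illustrative, not the library's actual code.

```python
import numpy as np

class SigmoidActivation(ActivationFunction):
    """Sigmoid squashes each input element into the open interval (0, 1)."""

    @classmethod
    def activation(cls, x):
        # sigmoid(x) = 1 / (1 + e^(-x))
        return 1 / (1 + np.exp(-x))

    @classmethod
    def activation_partial_derivative(cls, x):
        # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
        s = cls.activation(x)
        return s * (1 - s)
```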
- class gwu_nn.activation_functions.SoftmaxActivation
Implements the softmax activation function, typically used for multi-class classification
- classmethod activation(x)
Applies the softmax function to the input array
- Args:
x (np.array): input into the layer/activation function
- Returns:
np.array(floats): Softmax(x)
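For reference, a numerically stable softmax is commonly written as below; subtracting the maximum before exponentiating avoids overflow without changing the result. Whether gwu_nn applies this shift is an assumption.

```python
import numpy as np

def softmax(x):
    # Shift by the max so the largest exponent is e^0 = 1 (overflow-safe)
    exps = np.exp(x - np.max(x))
    # Normalize so the outputs sum to 1, forming a probability distribution
    return exps / np.sum(exps)
```

For example, `softmax(np.array([1.0, 2.0, 3.0]))` yields roughly `[0.090, 0.245, 0.665]`, which sums to 1.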
- classmethod activation_partial_derivative(x)
Applies the partial derivative of the softmax function
- Args:
x (np.array): partial derivative up to this layer/activation function
- Returns:
np.array(floats): derivative of network up to this activation/layer
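The full derivative of softmax is a Jacobian matrix with entries J[i][j] = s_i * (delta_ij - s_j), where s is the softmax output. A standard sketch of that computation follows; the helper name `softmax_jacobian` is hypothetical, not a gwu_nn identifier.

```python
import numpy as np

def softmax_jacobian(x):
    # Column vector of softmax outputs for input x (uses the softmax above)
    s = softmax(x).reshape(-1, 1)
    # J[i, j] = s_i * (delta_ij - s_j): diagonal term minus outer product
    return np.diagflat(s) - s @ s.T
```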
- class gwu_nn.activation_functions.RELUActivation
Implements the ReLU (rectified linear unit) activation function, commonly used in hidden layers
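A plausible implementation of ReLU under the same interface; again a sketch under the assumptions above, not the library's verified source.

```python
import numpy as np

class RELUActivation(ActivationFunction):
    """ReLU keeps positive inputs and zeroes out negative ones."""

    @classmethod
    def activation(cls, x):
        # relu(x) = max(0, x), applied elementwise
        return np.maximum(0, x)

    @classmethod
    def activation_partial_derivative(cls, x):
        # Derivative is 1 for x > 0 and 0 otherwise (the value at x == 0 is a convention)
        return (x > 0).astype(float)
```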