GWU_NN
Index
A
activation() (gwu_nn.activation_functions.SoftmaxActivation class method)
activation_partial_derivative() (gwu_nn.activation_functions.SoftmaxActivation class method)
ActivationFunction (class in gwu_nn.activation_functions)
ActivationLayer (class in gwu_nn.activation_layers)
add() (gwu_nn.gwu_network.GWUNetwork method)
    (in module gwu_nn.gwu_network.GWUNetwork)
B
backward_propagation() (gwu_nn.activation_layers.ActivationLayer method)
C
compile() (gwu_nn.gwu_network.GWUNetwork method)
CrossEntropy (class in gwu_nn.loss_functions)
D
Dense (class in gwu_nn.layers)
F
fit() (gwu_nn.gwu_network.GWUNetwork method)
forward_propagation() (gwu_nn.activation_layers.ActivationLayer method)
G
get_weights() (gwu_nn.gwu_network.GWUNetwork method)
GWUNetwork (class in gwu_nn.gwu_network)
I
init_weights() (gwu_nn.layers.Dense method)
L
Layer (class in gwu_nn.layers)
LogLoss (class in gwu_nn.loss_functions)
loss() (gwu_nn.loss_functions.CrossEntropy class method)
    (gwu_nn.loss_functions.LogLoss class method)
    (gwu_nn.loss_functions.LossFunction method)
    (gwu_nn.loss_functions.MSE class method)
loss_partial_derivative() (gwu_nn.loss_functions.CrossEntropy class method)
    (gwu_nn.loss_functions.LogLoss class method)
    (gwu_nn.loss_functions.LossFunction method)
    (gwu_nn.loss_functions.MSE class method)
LossFunction (class in gwu_nn.loss_functions)
M
MSE (class in gwu_nn.loss_functions)
P
predict() (gwu_nn.gwu_network.GWUNetwork method)
R
RELU (class in gwu_nn.activation_layers)
RELUActivation (class in gwu_nn.activation_functions)
S
Sigmoid (class in gwu_nn.activation_layers)
SigmoidActivation (class in gwu_nn.activation_functions)
Softmax (class in gwu_nn.activation_layers)
SoftmaxActivation (class in gwu_nn.activation_functions)