fcnntrain
Train a fully connected Neural Network.
Mdl = fcnntrain (…) requires the following input arguments.
The activation functions are specified by the following integer codes:

  0 : 'Linear'
  1 : 'Sigmoid'
  2 : 'Rectified Linear Unit (ReLU)'
  3 : 'Hyperbolic tangent (tanh)'
  4 : 'Softmax'
  5 : 'Parametric or Leaky ReLU'
  6 : 'Exponential Linear Unit (ELU)'
  7 : 'Gaussian Error Linear Unit (GELU)'
alpha : the parameter used in ReLU and ELU activation layers.
fcnntrain returns the trained model, Mdl, as a structure containing the following fields:
LayerWeights : A cell array in which each element contains a matrix with the weights and biases of a layer, including the output layer.

Activations : A numeric row vector of integer values defining the activation function used at each layer, including the output layer.

Accuracy : The prediction accuracy at each iteration during the neural network model's training process.

Loss : The loss value recorded at each iteration during the neural network model's training process.

Alpha : The value of the Alpha parameter used in ReLU and ELU activation layers.
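Once the model structure is returned, its fields can be inspected directly and the model passed to fcnnpredict for classification. A minimal sketch, assuming Mdl was produced by a prior call to fcnntrain on predictor matrix X:

```
% Inspect the trained model structure returned by fcnntrain.
numel (Mdl.LayerWeights)      % number of layers, including the output layer
Mdl.Activations               % integer activation codes, one per layer
plot (Mdl.Loss);              % loss recorded at each training iteration
xlabel ("Iteration"); ylabel ("Loss");
% Classify observations with the companion function fcnnpredict.
pred = fcnnpredict (Mdl, X);
```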
Installation Note: in order to support parallel processing on macOS, users have to manually add support for OpenMP by adding the following flags to CFLAGS and CXXFLAGS prior to installing the statistics package:

  setenv ("CPPFLAGS", "-I/opt/homebrew/opt/libomp/include -Xclang -fopenmp")
See also: fcnnpredict, fitcnet, ClassificationNeuralNetwork
Source Code: fcnntrain