softmax {deepNN} | R Documentation
softmax function
Description
A function to evaluate the softmax activation function, its derivative, and the cost derivative, for use in defining a neural network. Note that at present this unit can only be used as an output unit.
Usage
softmax()
Value
A list of functions used to compute the activation function, its derivative, and the cost derivative.
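For reference, the activation computed by these functions is the softmax mapping, softmax(z)_i = exp(z_i) / sum(exp(z)), which turns a real-valued output vector into a vector of probabilities. The snippet below is a standalone illustration of that mapping only; it is not the package's internal implementation.

softmax_illustration <- function(z) {
  ez <- exp(z - max(z))   # subtracting max(z) avoids overflow; the result is unchanged
  ez / sum(ez)
}
softmax_illustration(c(1, 2, 3))   # probabilities summing to 1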
References
Ian Goodfellow, Yoshua Bengio and Aaron Courville. Deep Learning. MIT Press (2016).
Terrence J. Sejnowski. The Deep Learning Revolution. MIT Press (2018).
Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
http://neuralnetworksanddeeplearning.com/
See Also
network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident
Examples
# Example in context: a network with two hidden layers (logistic and ReLU
# activations) and a two-class softmax output layer
net <- network(dims = c(100, 50, 20, 2),
               activ = list(logistic(), ReLU(), softmax()))
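
# The list of functions returned by softmax() can be inspected directly;
# the exact element names depend on the installed deepNN version.
str(softmax())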