NNgrad_test {deepNN}    R Documentation
NNgrad_test function
Description
A function to test gradient evaluation of a neural network by comparing it with central finite differencing.
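The idea behind the check can be sketched independently of the package: each parameter is perturbed up and down by eps, and the resulting change in the loss approximates the corresponding partial derivative, which is then compared against the backpropagated value. A minimal R sketch of central finite differencing, assuming a generic scalar function f of a numeric parameter vector (fd_grad is an illustrative helper, not part of deepNN):

# approximate the gradient of f at w by central finite differences
fd_grad <- function(f, w, eps = 1e-5) {
  g <- numeric(length(w))
  for (i in seq_along(w)) {
    wp <- w; wp[i] <- wp[i] + eps   # forward-perturbed point
    wm <- w; wm[i] <- wm[i] - eps   # backward-perturbed point
    g[i] <- (f(wp) - f(wm)) / (2 * eps)
  }
  g
}

# example: f(w) = sum(w^2) has exact gradient 2 * w = c(2, -4, 6)
fd_grad(function(w) sum(w^2), c(1, -2, 3))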
Usage
NNgrad_test(net, loss = Qloss(), eps = 1e-05)
Arguments
net: an object of class network, see ?network

loss: the loss function used to evaluate the network, see ?Qloss, ?multinomial. Default is Qloss()

eps: the step size used in the central finite difference approximation. Default value is 1e-05
Value
The exact gradients (computed via backpropagation) and the approximate gradients (computed via central finite differencing), together with a plot of one against the other.
References
Ian Goodfellow, Yoshua Bengio and Aaron Courville. Deep Learning. MIT Press (2016).
Terrence J. Sejnowski. The Deep Learning Revolution. MIT Press (2018).
Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
http://neuralnetworksanddeeplearning.com/
See Also
network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident, softmax, Qloss, multinomial, NNgrad_test, weights2list, bias2list, biasInit, memInit, gradInit, addGrad, nnetpar, nbiaspar, addList, no_regularisation, L1_regularisation, L2_regularisation
Examples
# small test network: 5 inputs, 10 hidden ReLU units, 2 softmax outputs
net <- network(dims = c(5, 10, 2),
               activ = list(ReLU(), softmax()))
# compare backpropagated gradients with central finite differences
NNgrad_test(net)
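The same check can be repeated with a different loss or step size; the variant below assumes multinomial() constructs a loss object in the same way as Qloss() (see ?multinomial):

# repeat the check with the multinomial loss and a smaller step size
NNgrad_test(net, loss = multinomial(), eps = 1e-06)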