MLP_net {deepNN}	R Documentation
MLP_net function
Description
A function to define a multilayer perceptron and compute quantities for backpropagation, if needed.
Usage
MLP_net(input, weights, bias, dims, nlayers, activ, back = TRUE, regulariser)
Arguments
input
input data, a list of vectors (i.e. a ragged array)
weights
a list object containing the weights for the forward pass; see ?weights2list
bias
a list object containing the biases for the forward pass; see ?bias2list
dims
the dimensions of the network, as stored from a call to the function network; see ?network
nlayers
the number of layers, as stored from a call to the function network; see ?network
activ
a list of activation functions, as stored from a call to the function network; see ?network
back
logical; whether to compute quantities for backpropagation (set to FALSE for feed-forward use only)
regulariser
the type of regularisation strategy to use; see ?train, ?no_regularisation, ?L1_regularisation and ?L2_regularisation
Value
A list object containing the evaluated forward pass and, if back = TRUE, the quantities required for backpropagation.
References
Ian Goodfellow, Yoshua Bengio and Aaron Courville. Deep Learning. MIT Press (2016)
Terrence J. Sejnowski. The Deep Learning Revolution (The MIT Press). (2018)
Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
http://neuralnetworksanddeeplearning.com/
See Also
network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident, softmax, Qloss, multinomial, NNgrad_test, weights2list, bias2list, biasInit, memInit, gradInit, addGrad, nnetpar, nbiaspar, addList, no_regularisation, L1_regularisation, L2_regularisation
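Examples

The sketch below illustrates a forward pass through a small network using the argument structure documented above. It assumes, based on the descriptions in Arguments and See Also, that the object returned by network exposes dims, nlayers and activ components, that nnetpar and nbiaspar return parameter counts, and that no_regularisation() returns a regulariser object; check ?network and ?train for the package's own worked examples.

```r
## Not run:
library(deepNN)

## Define a small network: 4 inputs, one hidden layer of 8 units, 2 outputs
net <- network(dims = c(4, 8, 2),
               activ = list(ReLU(), softmax()))

## Draw random parameters; nnetpar() and nbiaspar() count the weights
## and biases the network requires (see ?nnetpar, ?nbiaspar)
wts  <- runif(nnetpar(net), -0.1, 0.1)
bias <- runif(nbiaspar(net), -0.1, 0.1)

## A single input vector, supplied as a list (MLP_net takes a ragged array)
x <- list(runif(4))

## Forward pass only (back = FALSE); the weight and bias vectors are
## converted to per-layer lists with weights2list() and bias2list()
out <- MLP_net(input       = x,
               weights     = weights2list(wts, net$dims),
               bias        = bias2list(bias, net$dims),
               dims        = net$dims,
               nlayers     = net$nlayers,
               activ       = net$activ,
               back        = FALSE,
               regulariser = no_regularisation())
## End(Not run)
```

With back = TRUE (the default), the returned list additionally carries the quantities needed by backpropagation_MLP.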