inst {nnR}    R Documentation

inst

Description

Instantiates a neural network created by create_nn(), i.e. evaluates the continuous function the network represents at a given input.

Usage

inst(neural_network, activation_function, x)

Arguments

neural_network

An ordered list of lists of the type generated by create_nn(), where each element is a pair (W, b) representing the weights and biases of that layer (see the sketch at the end of this argument list).

NOTE: We will call "instantiation" what Grohs et al. call "realization".

activation_function

A continuous function applied to the output of each layer. For now, only ReLU, Sigmoid, and Tanh are available. Note that all proofs are only valid for the ReLU activation.

x

The input at which the instantiated continuous function is evaluated. It must be an element of \mathbb{R}^d, where d is the width of the network's input layer.
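
As a minimal sketch of the expected shape of the neural_network argument, the structure produced by create_nn() can be inspected directly; the component names inside each layer are not assumed here, str() simply displays whatever create_nn() stores:

library(nnR)
# Layer widths c(2, 3, 1): input width 2, one hidden layer of width 3, output width 1.
nn <- create_nn(c(2, 3, 1))
str(nn)      # each element should be a (W, b) pair for one layer
length(nn)   # expected to be 2: the affine maps 2 -> 3 and 3 -> 1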

Value

The output of the continuous function that is the instantiation of the given neural network with the given activation function, evaluated at the given x, where x is a vector whose length equals the width of the network's input layer.
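
As a hedged illustration of what this value is, the realization of Definition 1.3.4 can be written out by hand for a tiny network built directly as (W, b) pairs; this is only a sketch of the definition, not the internal representation used by create_nn():

relu <- function(z) pmax(z, 0)

# Hand-built network with layer widths (2, 3, 1).
W1 <- matrix(c(1, 0, -1, 2, 1, 0), nrow = 3, ncol = 2)  # 3 x 2
b1 <- c(0, 1, -1)
W2 <- matrix(c(1, -1, 2), nrow = 1, ncol = 3)           # 1 x 3
b2 <- 0.5

x <- c(1, 2)
# Affine map then activation on the hidden layer; affine map only on the output layer.
h <- relu(W1 %*% x + b1)
y <- W2 %*% h + b2
y   # the instantiation of this network with ReLU, evaluated at x (2.5 here)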

References

Grohs, P., Hornung, F., Jentzen, A., et al. (2019). Space-time error estimates for deep neural network approximations for differential equations. https://arxiv.org/abs/1908.03833.

Definition 1.3.4 in Jentzen, A., Kuckuck, B., and von Wurstemberger, P. (2023). Mathematical introduction to deep learning: Methods, implementations, and theory. https://arxiv.org/abs/2310.20360.

More precisely, we use the definition in:

Definition 2.3 in Rafi, S., Padgett, J.L., and Nakarmi, U. (2024). Towards an Algebraic Framework for Approximating Functions Using Neural Network Polynomials. https://arxiv.org/abs/2402.01058.

Examples

# Width-1 input layer: instantiate at a scalar input.
create_nn(c(1, 3, 5, 6)) |> inst(ReLU, 5)
# Width-3 input layer: instantiate at a length-3 input vector.
create_nn(c(3, 3, 5, 6)) |> inst(ReLU, c(4, 4, 4))
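
A further usage sketch, assuming Sigmoid is passed the same way as ReLU above; the exact values depend on how create_nn() initializes the weights:

nn <- create_nn(c(3, 3, 5, 6))
out <- inst(nn, Sigmoid, c(4, 4, 4))
length(out)   # expected to equal the width of the final layer, 6 here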

