bayes_by_backprop {BayesFluxR}    R Documentation

Use Bayes By Backprop to find a variational approximation to a BNN.

Description

Bayes By Backprop was proposed in Blundell, C., Cornebise, J., Kavukcuoglu, K., & Wierstra, D. (2015). Weight Uncertainty in Neural Network. In Proceedings of the 32nd International Conference on Machine Learning (pp. 1613-1622). PMLR.
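
In sketch form (the notation below follows the paper and is not taken from the package documentation): a variational distribution q(w | theta) over the network weights w is fitted by minimising the variational free energy with stochastic gradient steps, each gradient being estimated from mc_samples Monte Carlo draws w^(i) from q(w | theta):

F(D, theta) = KL[ q(w | theta) || P(w) ] - E_{q(w | theta)}[ log P(D | w) ]
            ~ (1/m) * sum_{i=1}^{m} [ log q(w^(i) | theta) - log P(w^(i)) - log P(D | w^(i)) ]

where D is the data, P(w) the prior over the weights, and m = mc_samples.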

Usage

bayes_by_backprop(
  bnn,
  batchsize,
  epochs,
  mc_samples = 1,
  opt = opt.ADAM(),
  n_samples_convergence = 10
)

Arguments

bnn

a BNN object obtained using BNN

batchsize

batch size

epochs

number of epochs to run for

mc_samples

number of samples to use in each iteration for the Monte Carlo approximation of the gradient; usually one is enough.

opt

An optimiser. These all start with 'opt.'. See, for example, opt.ADAM.

n_samples_convergence

At the end of each iteration, convergence is checked using this many MC samples.
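
For illustration, assuming a BNN object bnn has been built as in the Examples below, a call with every argument made explicit could look like this (the batch size and epoch count are arbitrary values, not recommendations):

vi <- bayes_by_backprop(
  bnn,
  batchsize = 128,             # arbitrary illustration value
  epochs = 50,                 # arbitrary illustration value
  mc_samples = 1,              # one MC sample per iteration is usually enough
  opt = opt.ADAM(),            # the default optimiser
  n_samples_convergence = 10   # MC samples used for the convergence check
)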

Value

a list containing the fitted variational approximation; posterior draws can be obtained from it using vi.get_samples (see the Examples).

Examples

## Not run: 
  ## Needs previous call to `BayesFluxR_setup` which is time
  ## consuming and requires Julia and BayesFlux.jl
  BayesFluxR_setup(installJulia=TRUE, seed=123)
  net <- Chain(RNN(5, 1))
  like <- likelihood.seqtoone_normal(net, Gamma(2.0, 0.5))
  prior <- prior.gaussian(net, 0.5)
  init <- initialise.allsame(Normal(0, 0.5), like, prior)
  # 5 variables to match the RNN input dimension chosen above
  data <- matrix(rnorm(5*1000), ncol = 5)
  # Choosing sequences of length 10 and predicting one period ahead
  tensor <- tensor_embed_mat(data, 10+1)
  x <- tensor[1:10, , , drop = FALSE]
  # Last value of the first variable in each sequence is the target
  y <- tensor[11, 1, ]
  bnn <- BNN(x, y, like, prior, init)
  vi <- bayes_by_backprop(bnn, 100, 100)
  vi_samples <- vi.get_samples(vi, n = 1000)
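  ## The shape of `vi_samples` is not documented here; assuming it
  ## holds one column per posterior draw of the network parameters,
  ## the draws could be summarised as follows (illustrative sketch only)
  dim(vi_samples)
  param_means <- rowMeans(vi_samples)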

## End(Not run)


[Package BayesFluxR version 0.1.3 Index]