rjmcmcpost {rjmcmc}    R Documentation
Perform Reversible-Jump MCMC Post-Processing
Description
Performs Bayesian multimodel inference, estimating Bayes factors and posterior model probabilities for N candidate models. Using the 'universal parameter' restriction of Barker & Link (2013), RJMCMC is treated as a Gibbs sampling problem, in which the algorithm alternates between updating the model and the model-specific parameters. Transformation Jacobians are computed using automatic differentiation, so they do not need to be specified.
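As a rough sketch of this alternation (not the package's internal implementation; the Jacobian terms are omitted, and the helper one.sweep is illustrative only), a single Gibbs sweep given the argument lists described below might look like:

## Conceptual sketch only: one Gibbs sweep over (model, parameters), assuming
## the argument lists documented under 'Arguments'. Jacobian terms are omitted
## here, whereas rjmcmcpost obtains them by automatic differentiation.
one.sweep = function(m, post.draw, g, ginv, likelihood, param.prior, model.prior){
  theta = post.draw[[m]]()        # draw the current model's parameters from their posterior
  psi = ginv[[m]](theta)          # map them to the universal parameter
  logpost = sapply(seq_along(g), function(k){
    th = g[[k]](psi)              # model-k parameters implied by psi
    likelihood[[k]](th) + param.prior[[k]](th) + log(model.prior[k])
  })
  w = exp(logpost - max(logpost)) # unnormalised model probabilities
  sample(seq_along(g), 1, prob = w/sum(w))   # Gibbs update of the model indicator
}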
Usage
rjmcmcpost(post.draw, g, ginv, likelihood, param.prior, model.prior,
chainlength = 10000, TM.thin = chainlength/10, save.all = FALSE,
progress = TRUE)
Arguments
post.draw
A list of N functions that randomly draw from the posterior distribution under each model. Generally these functions sample from the output of a model fitted using MCMC.
g
A list of N functions specifying the bijections from the universal parameter psi to each model-specific parameter set.
ginv
A list of N functions specifying the bijections from each model-specific parameter set to the universal parameter psi. These should be the inverses of the corresponding functions in g; a quick numerical check is sketched after this argument list.
likelihood
A list of N functions specifying the log-likelihood functions for the data under each model.
param.prior
A list of N functions specifying the log-prior distributions for each model-specific parameter vector.
model.prior
A numeric vector of the prior model probabilities. Note that this argument is not required to sum to one, as it is automatically normalised.
chainlength
The number of iterations to run the Markov chain for.
TM.thin
The interval, in iterations, at which transition matrices are calculated as the chain progresses.
save.all
A logical determining whether to save the value of the universal parameter at each iteration, as well as the corresponding likelihoods, priors and posteriors. If FALSE, the densities and psidraws elements are omitted from the output (see Value).
progress
A logical determining whether a progress bar is drawn.
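Because each g[[k]] and ginv[[k]] pair must be mutual inverses, it can be worth checking this numerically before running the sampler. A minimal sketch (the helper check.bijection and the test value of psi are illustrative only):

## Sketch: verify that ginv[[k]](g[[k]](psi)) recovers psi for each model
check.bijection = function(g.list, ginv.list, psi){
  for(k in seq_along(g.list)){
    stopifnot(isTRUE(all.equal(ginv.list[[k]](g.list[[k]](psi)), psi)))
  }
  invisible(TRUE)
}
## e.g., with the functions defined in the Examples section:
## check.bijection(list(g1, g2), list(ginv1, ginv2), psi = c(0.4, 0.5))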
Value
Returns an object of class rj (see rjmethods). If save.all=TRUE, the output has named elements result, densities, psidraws, progress and meta. If save.all=FALSE, the densities and psidraws elements are omitted.

result contains useful point estimates, progress contains snapshots of these estimates over time, and meta contains information about the function call.
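For example, assuming out is the object returned in the Examples section below, its elements can be inspected with standard tools:

## Sketch: inspecting a fitted rj object (see rjmethods for class-specific methods)
print(out)         # summary of the fit
str(out$result)    # point estimates described above
str(out$meta)      # information about the function call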
References
Barker, R. J. and Link, W. A. (2013) Bayesian multimodel inference by RJMCMC: A Gibbs sampling approach. The American Statistician, 67(3), 150-156.
Examples
## Comparing two binomial models -- see Barker & Link (2013) for further details.
## Model 1: separate success probabilities for the two groups.
## Model 2: a common probability p[1]; p[2] is an auxiliary parameter for dimension matching.
y = c(8, 16); sumy = sum(y)
n = c(20, 30); sumn = sum(n)

## Log-likelihoods under each model
L1 = function(p){if(all(p >= 0) && all(p <= 1)) sum(dbinom(y, n, p, log=TRUE)) else -Inf}
L2 = function(p){if((p[1] >= 0) && (p[1] <= 1)) sum(dbinom(y, n, p[1], log=TRUE)) else -Inf}

## Bijections from the universal parameter psi to each model's parameter set ...
g1 = function(psi){p = psi}
g2 = function(psi){w = n[1]/sum(n); p = c(w*psi[1] + (1-w)*psi[2], psi[2])}
## ... and their inverses
ginv1 = function(p){p}
ginv2 = function(p){c(sum(n)/n[1]*p[1] - n[2]/n[1]*p[2], p[2])}

## Log-priors for the model-specific parameters
p.prior1 = function(p){sum(dbeta(p, 1, 1, log=TRUE))}
p.prior2 = function(p){dbeta(p[1], 1, 1, log=TRUE) + dbeta(p[2], 17, 15, log=TRUE)}

## Draws from the posterior under each model (conjugate beta forms)
draw1 = function(){rbeta(2, y+1, n-y+1)}
draw2 = function(){c(rbeta(1, sumy+1, sumn-sumy+1), rbeta(1, 17, 15))}

out = rjmcmcpost(post.draw=list(draw1, draw2), g=list(g1, g2), ginv=list(ginv1, ginv2),
                 likelihood=list(L1, L2), param.prior=list(p.prior1, p.prior2),
                 model.prior=c(0.5, 0.5), chainlength=1500)