MCMCregress {MCMCpack}    R Documentation
Markov Chain Monte Carlo for Gaussian Linear Regression
Description
This function generates a sample from the posterior distribution of a linear regression model with Gaussian errors using Gibbs sampling (with a multivariate Gaussian prior on the beta vector, and an inverse Gamma prior on the conditional error variance). The user supplies data and priors, and a sample from the posterior distribution is returned as an mcmc object, which can be subsequently analyzed with functions provided in the coda package.
Usage
MCMCregress(
  formula,
  data = NULL,
  burnin = 1000,
  mcmc = 10000,
  thin = 1,
  verbose = 0,
  seed = NA,
  beta.start = NA,
  b0 = 0,
  B0 = 0,
  c0 = 0.001,
  d0 = 0.001,
  sigma.mu = NA,
  sigma.var = NA,
  marginal.likelihood = c("none", "Laplace", "Chib95"),
  ...
)
Arguments
formula
    Model formula.

data
    Data frame.

burnin
    The number of burn-in iterations for the sampler.

mcmc
    The number of MCMC iterations after burnin.

thin
    The thinning interval used in the simulation. The number of MCMC iterations must be divisible by this value.

verbose
    A switch which determines whether or not the progress of the sampler is printed to the screen. If verbose is greater than 0, the iteration number, the beta vector, and the error variance are printed to the screen every verbose-th iteration.

seed
    The seed for the random number generator. If NA, the Mersenne Twister generator is used with default seed 12345; if an integer is passed it is used to seed the Mersenne Twister. The user can also pass a list of length two to use the L'Ecuyer random number generator, which is suitable for parallel computation. The first element of the list is the L'Ecuyer seed, which is a vector of length six or NA (if NA a default seed is used). The second element of the list is a positive substream number.

beta.start
    The starting values for the beta vector. This can either be a scalar or a column vector with dimension equal to the number of betas. If this is a scalar, that value will serve as the starting value for all of the betas. The default value of NA uses the maximum likelihood estimate of beta as the starting values.

b0
    The prior mean of beta. This can either be a scalar or a column vector with dimension equal to the number of betas. If this takes a scalar value, then that value will serve as the prior mean for all of the betas.

B0
    The prior precision of beta. This can either be a scalar or a square matrix with dimensions equal to the number of betas. If this takes a scalar value, then that value times an identity matrix serves as the prior precision of beta. The default value of 0 corresponds to an improper uniform prior on beta. (An illustrative call combining these prior arguments appears after this list.)

c0
    c0/2 is the shape parameter for the inverse Gamma prior on sigma^2 (the conditional error variance).

d0
    d0/2 is the scale parameter for the inverse Gamma prior on sigma^2 (the conditional error variance).

sigma.mu
    The mean of the inverse Gamma prior on sigma^2. sigma.mu and sigma.var allow users to specify this prior by its mean and variance rather than by c0 and d0.

sigma.var
    The variance of the inverse Gamma prior on sigma^2. sigma.mu and sigma.var allow users to specify this prior by its mean and variance rather than by c0 and d0.

marginal.likelihood
    How should the marginal likelihood be calculated? Options are: "none", in which case the marginal likelihood is not calculated; "Laplace", in which case the Laplace approximation (see Kass and Raftery, 1995) is used; and "Chib95", in which case the method of Chib (1995) is used.

...
    Further arguments to be passed.
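The call below is a minimal sketch tying the prior and seeding arguments together; the data, prior values, and object names are illustrative and are not part of the original page.

library(MCMCpack)

## Illustrative data (made up for this sketch)
set.seed(1)
df <- data.frame(x = rnorm(50))
df$y <- 1 + 2 * df$x + rnorm(50)

fit <- MCMCregress(y ~ x, data = df,
                   b0 = c(0, 0),        # prior mean for (intercept, slope)
                   B0 = diag(0.01, 2),  # prior precision of beta (vague but proper)
                   c0 = 2, d0 = 2,      # inverse Gamma prior: shape c0/2, scale d0/2
                   burnin = 1000, mcmc = 10000, thin = 1,
                   seed = 42,           # integer seed for the Mersenne Twister
                   verbose = 0)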
Details
MCMCregress simulates from the posterior distribution using standard Gibbs sampling (a multivariate Normal draw for the betas, and an inverse Gamma draw for the conditional error variance). The simulation proper is done in compiled C++ code to maximize efficiency. Please consult the coda documentation for a comprehensive list of functions that can be used to analyze the posterior sample.
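As a pointer to that coda workflow, here is a minimal sketch of post-processing the returned mcmc object; the object name posterior is illustrative, and the functions shown (all from the coda package) are a small selection rather than a prescribed analysis.

library(coda)
summary(posterior)        # posterior means, standard deviations, and quantiles
effectiveSize(posterior)  # effective sample size for each parameter
geweke.diag(posterior)    # Geweke convergence diagnostic
HPDinterval(posterior)    # highest posterior density intervals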
The model takes the following form:
y_i = x_i ' \beta + \varepsilon_{i}
Where the errors are assumed to be Gaussian:
\varepsilon_{i} \sim \mathcal{N}(0, \sigma^2)
We assume standard, semi-conjugate priors:
\beta \sim \mathcal{N}(b_0,B_0^{-1})
And:
\sigma^{-2} \sim \mathcal{G}amma(c_0/2, d_0/2)
Where \beta and \sigma^{-2} are assumed a priori independent. Note that only starting values for \beta are allowed because simulation is done using Gibbs sampling with the conditional error variance as the first block in the sampler.
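Because sigma.mu and sigma.var offer an alternative way to set this inverse Gamma prior, the sketch below shows the moment-matching this implies, assuming the parameterization above (shape c0/2, scale d0/2). The helper function is made up for illustration and is not part of MCMCpack.

# Sketch: translating a desired prior mean and variance for sigma^2 into
# (c0, d0), assuming sigma^2 ~ InvGamma(shape = c0/2, scale = d0/2), where
# mean = scale/(shape - 1) for shape > 1 and
# var  = scale^2 / ((shape - 1)^2 * (shape - 2)) for shape > 2.
# The name ig_moments_to_c0d0 is illustrative, not an MCMCpack function.
ig_moments_to_c0d0 <- function(sigma.mu, sigma.var) {
  shape <- sigma.mu^2 / sigma.var + 2   # shape implied by the chosen mean and variance
  scale <- sigma.mu * (shape - 1)       # scale implied by the chosen mean
  c(c0 = 2 * shape, d0 = 2 * scale)
}
ig_moments_to_c0d0(5, 25)  # e.g. prior mean 5 and variance 25, as in the example below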
Value
An mcmc object that contains the posterior sample. This object can be summarized by functions provided by the coda package.
References
Andrew D. Martin, Kevin M. Quinn, and Jong Hee Park. 2011. “MCMCpack: Markov Chain Monte Carlo in R.”, Journal of Statistical Software. 42(9): 1-21. doi:10.18637/jss.v042.i09.
Siddhartha Chib. 1995. “Marginal Likelihood from the Gibbs Output.” Journal of the American Statistical Association. 90: 1313-1321.
Robert E. Kass and Adrian E. Raftery. 1995. “Bayes Factors.” Journal of the American Statistical Association. 90: 773-795.
Daniel Pemstein, Kevin M. Quinn, and Andrew D. Martin. 2007. Scythe Statistical Library 1.0. http://scythe.lsa.umich.edu.
Martyn Plummer, Nicky Best, Kate Cowles, and Karen Vines. 2006. “Output Analysis and Diagnostics for MCMC (CODA)”, R News. 6(1): 7-11. https://CRAN.R-project.org/doc/Rnews/Rnews_2006-1.pdf.
See Also
Examples
## Not run:
line <- list(X = c(-2, -1, 0, 1, 2), Y = c(1, 3, 3, 3, 5))
posterior <- MCMCregress(Y ~ X, b0 = 0, B0 = 0.1,
                         sigma.mu = 5, sigma.var = 25,
                         data = line, verbose = 1000)
plot(posterior)
raftery.diag(posterior)
summary(posterior)

## End(Not run)
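When a marginal likelihood is requested (marginal.likelihood = "Laplace" or "Chib95"), fitted models can be compared via Bayes factors. The sketch below assumes MCMCpack's BayesFactor() accepts the fitted objects and reuses the illustrative line data from above with made-up prior values; it is not part of the original example.

## Not run:
## Sketch: model comparison via marginal likelihoods (Chib, 1995); assumes
## MCMCpack's BayesFactor() can be applied to the fitted objects. Data and
## prior values are illustrative.
model1 <- MCMCregress(Y ~ X, b0 = 0, B0 = 0.1, c0 = 2, d0 = 0.11,
                      data = line, marginal.likelihood = "Chib95")
model2 <- MCMCregress(Y ~ 1, b0 = 0, B0 = 0.1, c0 = 2, d0 = 0.11,
                      data = line, marginal.likelihood = "Chib95")
BayesFactor(model1, model2)
## End(Not run)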