HS.post.mean {horseshoe}    R Documentation
Posterior mean for the horseshoe for the normal means problem.
Description
Compute the posterior mean for the horseshoe prior in the normal means problem (i.e. linear regression with the design matrix equal to the identity matrix), for a fixed value of tau, without using MCMC. This yields a quick estimate of the underlying parameters (the betas). Details on the computation are given in Carvalho et al. (2010) and van der Pas et al. (2014).
Usage
HS.post.mean(y, tau, Sigma2 = 1)
Arguments
y
The data: a vector of observations.

tau
Value for tau. Warning: tau should be greater than 1/450.

Sigma2
The variance of the data.
Details
The normal means model is

y_i = \beta_i + \epsilon_i,  \epsilon_i \sim N(0, \sigma^2),

and the horseshoe prior is

\beta_i \sim N(0, \sigma^2 \lambda_i^2 \tau^2),
\lambda_i \sim Half-Cauchy(0, 1).

If \tau and \sigma^2 are known, the posterior mean can be computed without using MCMC.
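For a single observation, this posterior mean can be written as a ratio of one-dimensional integrals over the local scale lambda (see Carvalho et al., 2010), which can be evaluated by standard numerical integration. The sketch below only illustrates that identity; it is not the package's internal implementation, and the helper hs.post.mean.numeric is hypothetical.

# Illustrative only: posterior mean for a single observation y, for fixed tau
# and Sigma2, via numerical integration over the local scale lambda.
# hs.post.mean.numeric is a hypothetical helper, not part of the package.
hs.post.mean.numeric <- function(y, tau, Sigma2 = 1) {
  # Marginal density of y given lambda: N(0, Sigma2 * (1 + lambda^2 * tau^2)),
  # weighted by the Half-Cauchy(0, 1) prior on lambda.
  weight <- function(lambda) {
    dnorm(y, mean = 0, sd = sqrt(Sigma2 * (1 + lambda^2 * tau^2))) *
      2 / (pi * (1 + lambda^2))
  }
  # Conditional shrinkage factor: E[beta | y, lambda] = shrink(lambda) * y.
  shrink <- function(lambda) lambda^2 * tau^2 / (1 + lambda^2 * tau^2)
  num <- integrate(function(l) shrink(l) * weight(l), 0, Inf)$value
  den <- integrate(weight, 0, Inf)$value
  y * num / den
}

# Should be close to HS.post.mean(3, tau = 0.5, Sigma2 = 1)
hs.post.mean.numeric(3, tau = 0.5, Sigma2 = 1)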
Value
The posterior mean (horseshoe estimator) for each of the datapoints.
References
Carvalho, C. M., Polson, N. G., and Scott, J. G. (2010), The horseshoe estimator for sparse signals. Biometrika 97(2), 465–480.
van der Pas, S. L., Kleijn, B. J. K., and van der Vaart, A. W. (2014), The horseshoe estimator: Posterior concentration around nearly black vectors. Electronic Journal of Statistics 8(2), 2585–2618.
See Also
HS.post.var
to compute the posterior variance. See
HS.normal.means
for an implementation that does use MCMC, and
returns credible intervals as well as the posterior mean (and other quantities).
See horseshoe
for linear regression.
Examples
#Plot the posterior mean for a range of deterministic values
y <- seq(-5, 5, 0.05)
plot(y, HS.post.mean(y, tau = 0.5, Sigma2 = 1))
#Example with 20 signals, rest is noise
#Posterior mean for the signals is plotted in blue
truth <- c(rep(0, 80), rep(8, 20))
data <- truth + rnorm(100)
tau.example <- HS.MMLE(data, 1)
plot(data, HS.post.mean(data, tau.example, 1),
     col = c(rep("black", 80), rep("blue", 20)))