semiparaKernelEstimate {BSL}    R Documentation

Estimate the semi-parametric synthetic (log) likelihood

Description

This function computes the semi-parametric synthetic likelihood estimator of An et al. (2019). The advantage of this semi-parametric estimator over the standard synthetic likelihood estimator is that it is more robust to non-normal summary statistics. Kernel density estimation is used to model each univariate marginal distribution, and the dependence structure between summaries is captured with a Gaussian copula. Shrinkage on the correlation matrix parameter of the Gaussian copula helps to reduce the number of simulations required.
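As a rough illustration of the idea, the sketch below (a hypothetical helper, not the package's internal implementation) combines the log kernel density estimate of each marginal at the observed summary with a Gaussian copula log density evaluated at normal scores. For simplicity it uses the empirical CDF for the copula transform and the plain sample correlation, whereas the package also supports kernel CDF estimates and shrinkage of the correlation matrix.

semiParaSketch <- function(ssy, ssx) {
  n <- nrow(ssx); p <- ncol(ssx)
  logMarg <- numeric(p)        # log marginal KDE density at the observed summary
  zObs <- numeric(p)           # observed summaries on the normal-score scale
  zSim <- matrix(0, n, p)      # simulated summaries on the normal-score scale
  for (j in 1:p) {
    kd <- density(ssx[, j], kernel = "gaussian")
    logMarg[j] <- log(approx(kd$x, kd$y, xout = ssy[j])$y)
    uObs <- mean(ssx[, j] <= ssy[j])                   # empirical CDF at ssy[j]
    uObs <- min(max(uObs, 1 / (n + 1)), n / (n + 1))   # keep away from 0 and 1
    zObs[j] <- qnorm(uObs)
    zSim[, j] <- qnorm(rank(ssx[, j]) / (n + 1))
  }
  R <- cor(zSim)               # correlation matrix of the Gaussian copula
  logCopula <- -0.5 * log(det(R)) -
    0.5 * as.numeric(t(zObs) %*% (solve(R) - diag(p)) %*% zObs)
  logCopula + sum(logMarg)     # semi-parametric synthetic log likelihood
}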

Usage

semiparaKernelEstimate(
  ssy,
  ssx,
  kernel = "gaussian",
  shrinkage = NULL,
  penalty = NULL,
  log = TRUE
)

Arguments

ssy

The observed summary statistic.

ssx

A matrix of the simulated summary statistics. The number of rows is the same as the number of simulations per iteration.

kernel

A string argument indicating the smoothing kernel to pass into density for estimating the marginal distribution of each summary statistic. Only “gaussian” and “epanechnikov” are available. The default is “gaussian”.

shrinkage

A string argument indicating which shrinkage method to use. The default is NULL, which means no shrinkage is used. Current options are “glasso” for the graphical lasso method of Friedman et al. (2008) and “Warton” for the ridge regularisation method of Warton (2008).

penalty

The penalty value to be used for the specified shrinkage method. Must be between zero and one if the shrinkage method is “Warton”.

log

A logical argument indicating whether the log of the likelihood should be returned. The default is TRUE.

Value

The estimated synthetic (log) likelihood value.

References

An Z, Nott DJ, Drovandi C (2019). “Robust Bayesian Synthetic Likelihood via a Semi-Parametric Approach.” Statistics and Computing (In Press).

Friedman J, Hastie T, Tibshirani R (2008). “Sparse Inverse Covariance Estimation with the Graphical Lasso.” Biostatistics, 9(3), 432–441.

Warton DI (2008). “Penalized Normal Likelihood and Ridge Regularization of Correlation and Covariance Matrices.” Journal of the American Statistical Association, 103(481), 340–349. doi: 10.1198/016214508000000021.

Boudt K, Cornelissen J, Croux C (2012). “The Gaussian Rank Correlation Estimator: Robustness Properties.” Statistics and Computing, 22(2), 471–483. doi: 10.1007/s11222-011-9237-0.

See Also

Other available synthetic likelihood estimators: gaussianSynLike for the standard synthetic likelihood estimator, gaussianSynLikeGhuryeOlkin for the unbiased synthetic likelihood estimator, and synLikeMisspec for the Gaussian synthetic likelihood estimator under model misspecification.

Examples

data(ma2)
ssy <- ma2_sum(ma2$data)
m <- newModel(fnSim = ma2_sim, fnSum = ma2_sum, simArgs = ma2$sim_args,
              theta0 = ma2$start, sumArgs = list(delta = 0.5))
ssx <- simulation(m, n = 300, theta = c(0.6, 0.2), seed = 10)$ssx

# check the distribution of the first summary statistic: highly non-normal
plot(density(ssx[, 1]))

# the standard synthetic likelihood estimator over-estimates the likelihood here
gaussianSynLike(ssy, ssx)
# the semi-parametric synthetic likelihood estimator is more robust to non-normality
semiparaKernelEstimate(ssy, ssx)
# using shrinkage on the correlation matrix of the Gaussian copula is also possible
semiparaKernelEstimate(ssy, ssx, shrinkage = "Warton", penalty = 0.8)
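# the graphical lasso shrinkage of Friedman et al. (2008) can be requested in the
# same way; the penalty value 0.3 below is only an illustrative choice
semiparaKernelEstimate(ssy, ssx, shrinkage = "glasso", penalty = 0.3)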


[Package BSL version 3.2.5]