plot_Kblist {EntropyMCMC}    R Documentation
Plot sequences of Kullback distance estimates for comparison of several MCMC algorithms for the same target density
Description
This function draws, on a single plot, several sequences of estimates of the Kullback distance K(p^t, f), i.e. the convergence criterion vs. time (iteration t), one for each MCMC algorithm for which the convergence criterion has been computed.
Usage
plot_Kblist(Kb, which = 1, lim = NULL, ylim = NULL)
Arguments
Kb: A list of objects of class KbMCMC, as returned by EntropyMCMC or EntropyParallel.cl.
which: Controls the level of detail in the legend added to the plot (see Details).
lim: For zooming over the iterations 1:lim only.
ylim: Limits on the y axis, passed to plot (see the zoom example at the end of the Examples).
Details
The purpose of this plot is to compare K MCMC algorithms (typically based on K different simulation strategies or kernels) for convergence or efficiency in estimating the same target density f. For the k-th algorithm, the user has to generate the convergence criterion, i.e. the sequence K(p^t_(k), f) for t = 1 up to the number of iterations that has been chosen, where p^t_(k) is the estimated pdf of the algorithm at time t.
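For instance, a minimal sketch of this workflow for K = 3 random-walk samplers differing only in their proposal variance (the variance values here are arbitrary; d, n, nmc, Ptheta0 and the target objects are those defined in the Examples below) could be:

## hypothetical comparison of K = 3 RWHM proposal variances
Kb3 <- lapply(c(0.05, 0.5, 1), function(varq) {
    q_param <- list(mean = rep(0, d), v = varq * diag(d))
    s <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm,
                    target_norm_param, q_param)
    EntropyMCMC(s)   # one convergence criterion object per variance
})
plot_Kblist(Kb3)     # the K = 3 criterion sequences on one plot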
For the legend, which=1 displays the MCMC names together with some technical information depending on the algorithm's definition (e.g. the proposal variance for the RWHM algorithm) and the method used for entropy estimation. The legend for which=2 is shorter, displaying only the MCMC names together with the number of parallel chains used for each, typically to compare the effect of that number for a single MCMC algorithm.
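For example, with two criterion objects e1 and e2 as built in the Examples below, the shorter legend is obtained with:

plot_Kblist(list(e1, e2), which = 2)  # names and numbers of parallel chains only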
Value
The graphic to plot.
Author(s)
Didier Chauveau.
References
Chauveau, D. and Vandekerkhove, P. (2012), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17 (2013), 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau, D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation. In JSM Proceedings, Statistical Computing Section, 2816–2827. Alexandria, VA: American Statistical Association.
Chauveau, D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL, http://hal.archives-ouvertes.fr/hal-01068081.
See Also
EntropyMCMC, EntropyParallel.cl
Examples
library(EntropyMCMC)
## Toy example using the bivariate centered Gaussian target
## with default parameter values, see target_norm_param
d <- 2                # state space dimension
n <- 300; nmc <- 100  # number of iterations and of iid Markov chains
## initial distribution, located at (2,2), "far" from target center (0,0)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)
## MCMC 1: Random-Walk Hastings-Metropolis
varq <- 0.05  # variance of the proposal (chosen too small)
q_param <- list(mean = rep(0, d), v = varq * diag(d))
## using Method 1: simulation with storage, and *then* entropy estimation
# simulation of the nmc iid chains, single core here
s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm,
target_norm_param, q_param)
summary(s1) # method for "plMCMC" object
e1 <- EntropyMCMC(s1) # computes Entropy and Kullback divergence
## MCMC 2: Independence Sampler with a large enough Gaussian proposal
varq <- 1; q_param <- list(mean = rep(0, d), v = varq * diag(d))
## using Method 2: simulation & estimation for each t, forgetting the past
## HPC with 2 cores here (using a parallel socket cluster)
e2 <- EntropyParallel.cl(HMIS_norm, n, nmc, Ptheta0, target_norm,
target_norm_param, q_param,
cltype="PAR_SOCK", nbnodes=2)
## Compare these two MCMC algorithms
plot_Kblist(list(e1,e2)) # MCMC 2 (HMIS, red plot) converges faster.
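## The lim and ylim arguments allow zooming over the early iterations;
## the values below are arbitrary, for illustration only
plot_Kblist(list(e1, e2), lim = 100, ylim = c(0, 5))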