EntropyMCMC-package {EntropyMCMC} R Documentation

## (A)MCMC Simulation and Convergence Evaluation using Entropy and Kullback-Leibler Divergence Estimation

### Description

Contains functions to analyse (Adaptive) Markov Chain Monte Carlo (MCMC) algorithms, evaluate their convergence rate, and compare candidate MCMC algorithms for the same target density, based on entropy and Kullback-Leibler divergence criteria. MCMC algorithms can be simulated using the provided functions or imported from external code. The diagnostics are based on consistent estimates of the entropy and of the Kullback divergence between the density at iteration t and the target density f, computed from iid (parallel) chains.

### Details

Package: EntropyMCMC
Type: Package
Version: 1.0.4
Date: 2019-03-08
License: GPL (>= 3)
LazyLoad: yes

Statistical background:

This package allows simulation of standard or adaptive MCMC samplers for a user-defined target density, and provides statistical tools to evaluate convergence of MCMC algorithms and to compare the performance of algorithms for the same target density (typically against benchmark samplers).

The criteria are graphical, based on plots against iterations (time) t of the Kullback divergence K(p^t, f) between the density p^t of the MCMC algorithm at time t and the target density f, for t = 1 up to the number of iterations that have been simulated. This requires estimation of the entropy of p^t,

E_{p^t} [\log(p^t)],

and of the external entropy

E_{p^t} [\log(f)].

Consistent estimates are computed based on N iid (parallel) chains, since the N positions of the chains at iteration t form an iid N-sample from the density p^t.
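As a minimal illustration of this Monte Carlo principle (a sketch, not package code): the external entropy E_{p^t}[log(f)] can be estimated by the empirical mean of log f over the N chain positions at iteration t; estimating the entropy of p^t itself additionally requires a nonparametric density estimate, for which the package uses nearest-neighbor methods. The positions below are hypothetical, drawn from a Gaussian purely for illustration.

```r
## Sketch (not package code): Monte Carlo estimate of the external
## entropy E_{p^t}[log f] from N iid chain positions at iteration t.
set.seed(1)
N <- 1000; d <- 2

## Hypothetical positions of N parallel chains at some iteration t,
## drawn here from N(0, I_d) for illustration only.
Xt <- matrix(rnorm(N * d), nrow = N, ncol = d)

## Target f: standard d-dimensional Gaussian density, on the log scale.
logf <- function(x) sum(dnorm(x, log = TRUE))

## Consistent estimate of E_{p^t}[log f]: empirical mean over the N chains.
ext_entropy_hat <- mean(apply(Xt, 1, logf))

## When p^t = f this converges to -(d/2) * (log(2 * pi) + 1),
## i.e. about -2.84 for d = 2.
```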

Computational considerations:

The simulation of iid chains can be performed in this package, which provides a mechanism for defining (A)MCMC algorithms and building the iid chains required for convergence evaluation. Each MCMC algorithm is defined by a list with five elements. Users can define their own MCMC, starting from the standard MCMC algorithms that are already defined:

• RWHM: a standard Random-Walk Hastings-Metropolis (HM) algorithm.

• HMIS_norm: an Independence Sampler HM with Gaussian proposal.

• AMHaario: the Haario (2001) Adaptive Hastings-Metropolis algorithm, provided as an example of a standard AMCMC.

• IID_norm: a “fake” MCMC that is just a Gaussian iid sampler, used mostly for testing purposes. Simulating N iid chains for n iterations with this algorithm just returns N\times n Gaussian d-dimensional vectors.

Functions that perform the simulations and the convergence evaluation automatically, taking one of these algorithms as their first argument, are provided. Two strategies are available:

• Simulation and Kullback estimation separately: a “cube” of N chains over n iterations in a space of dimension d is first simulated and stored using MCMCcopies or its multicore or cluster versions; the entropy and Kullback divergence are then estimated from that object using EntropyMCMC or its multicore version.

• Simulation and Kullback estimation simultaneously: for each iteration t, the next step of all N chains is generated, then the entropy and Kullback divergence K(p^t,f) are estimated, and the past of the parallel chains is discarded so that memory requirements stay small; only entropy-related estimates are stored and returned. Functions for this strategy are EntropyParallel and its multicore and cluster versions.
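The two strategies can be sketched as follows. This is an illustrative sketch, not the package's documented example: the argument order and the helper objects used here (DrawInit, target_norm, target_norm_param, the proposal parameter list q_param) are assumptions to verify against the help pages of MCMCcopies and EntropyParallel.

```r
## Sketch of both strategies for a d-dimensional Gaussian target,
## using the RWHM algorithm defined by the package.
## Argument names/order are assumptions -- see ?MCMCcopies, ?EntropyParallel.
library(EntropyMCMC)

n <- 150     # number of iterations per chain
nmc <- 50    # number N of iid parallel chains
d <- 2       # dimension

## Dispersed initial positions for the N chains (helper assumed here)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", theta0 = 0, vscale = 10)
## Gaussian random-walk proposal parameters
q_param <- list(mean = rep(0, d), v = 0.1 * diag(d))

## Strategy 1: simulate the whole cube, then estimate the divergence
s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param, q_param)
e1 <- EntropyMCMC(s1)

## Strategy 2: simulate and estimate on the fly, discarding the past
e2 <- EntropyParallel(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param, q_param)

## Plot K(p^t, f) against t to compare the two runs
plot_Kblist(list(e1, e2))
```

Strategy 1 keeps the full n x N x d cube in memory, which is convenient for re-analysis; strategy 2 trades that for a small memory footprint when n or N is large.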

See the Examples section of plot_Kblist for an illustration of these two methods.

Doing the simulations outside from this package

A third, hybrid strategy is also available: the simulation of iid chains can be done using external code (in R, C or any language) and imported into the EntropyMCMC package (by defining an object of the appropriate class "plMCMC" and structure, see MCMCcopies).

Then the Kullback divergence criterion can be computed using EntropyMCMC or its multicore version, and convergence/comparison diagnostics can be displayed using the associated plot method.
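A minimal sketch of this hybrid import, with heavy caveats: the component names and the array dimension ordering below (Ptheta, target, f_param, q_param) are assumptions about the "plMCMC" structure, and must be checked against the value returned by MCMCcopies before use.

```r
## Sketch of the hybrid strategy: chains simulated outside the package
## (here a trivial iid Gaussian "sampler") wrapped into an object of
## class "plMCMC". Component names and the dim ordering of Ptheta are
## ASSUMPTIONS -- inspect str(MCMCcopies(...)) for the real structure.
n <- 100; nmc <- 20; d <- 2

## Externally simulated chains: n iterations x nmc chains x d dimensions
Ptheta <- array(rnorm(n * nmc * d), dim = c(n, nmc, d))

ext <- list(Ptheta  = Ptheta,
            target  = function(x, param) prod(dnorm(x)),  # toy target density
            f_param = NULL,
            q_param = NULL)
class(ext) <- "plMCMC"

## The object could then be passed to EntropyMCMC() to compute the
## Kullback criterion, and to the associated plot method.
```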

The required simulations can be done on a single core or on HPC resources (multicore computers, snow or cluster setups using the parallel or Rmpi packages). Note that the parallel package using socket clusters is not available on Windows machines.

### Author(s)

Didier Chauveau, Institut Denis Poisson, University of Orléans, CNRS, Orléans, France. https://www.idpoisson.fr/chauveau/

Maintainer: Didier Chauveau didier.chauveau@univ-orleans.fr

Contributor: Houssam Alrachid

### References

• Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004

• Chauveau, D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation. In JSM Proceedings, Statistical Computing Section, 2816–2827. Alexandria, VA: American Statistical Association.

• Chauveau, D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
