censoredLikelihoodBR {mvPot}    R Documentation
Censored log-likelihood function for the Brown–Resnick model.
Description
Compute the peaks-over-threshold censored negative log-likelihood function for the Brown–Resnick model.
Usage
censoredLikelihoodBR(
  obs,
  loc,
  vario,
  u,
  p = 499L,
  vec = NULL,
  nCores = 1L,
  cl = NULL,
  likelihood = "mgp",
  ntot = NULL,
  ...
)

censoredLikelihood(
  obs,
  loc,
  vario,
  u,
  p = 499L,
  vec = NULL,
  nCores = 1L,
  cl = NULL
)
Arguments
obs
    List of vectors for which at least one component exceeds a high threshold.
loc
    Matrix of coordinates, as given by expand.grid.
vario
    Semi-variogram function taking a vector of coordinates as input.
u
    Vector of thresholds under which to censor components.
p
    Number of samples used for quasi-Monte Carlo estimation. Must be a prime number.
vec
    Generating vector for the quasi-Monte Carlo procedure. For a given prime p, it can be computed with genVecQMC.
nCores
    Number of cores used for the computation.
cl
    Cluster instance as created by makeCluster of the parallel package; see the sketch after this table.
likelihood
    Vector of strings specifying the contribution. Either "mgp", "poisson" or "binom".
ntot
    Integer number of observations below and above the threshold, to be used with the Poisson or binomial likelihood.
...
    Additional arguments passed to the C++ routine.
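For nCores and cl, a minimal sketch of a socket-cluster set-up with the parallel package (illustrative only; the two-worker value is arbitrary):

#Hypothetical parallel set-up for 'nCores' and 'cl'
library(parallel)
cl <- makeCluster(2L)
#... call censoredLikelihoodBR(..., nCores = 2L, cl = cl) ...
stopCluster(cl)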
Details
The function computes the censored negative log-likelihood based on the representation developed by Wadsworth et al. (2014) and Engelke et al. (2015). Margins must have been standardized first, for instance to the unit Fréchet scale.
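A minimal sketch of one common rank-based standardization to the unit Fréchet scale (not part of the package; rawData and toUnitFrechet are hypothetical names):

#Hypothetical raw data: rows are time points, columns are the 16 locations
rawData <- matrix(rexp(1000 * 16), nrow = 1000)
#Rank-based probability integral transform, then unit Frechet quantile -1/log(u)
toUnitFrechet <- function(x) -1 / log(rank(x) / (length(x) + 1))
stdData <- apply(rawData, 2, toUnitFrechet)
#Each column of 'stdData' now has approximately unit Frechet margins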
Value
Negative censored log-likelihood for the set of observations obs and semi-variogram vario, with the attribute exponentMeasure for all of the likelihood types selected, in the order "mgp", "poisson", "binom".
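A minimal sketch of reading this attribute, reusing the objects built in the Examples section below:

#Reuses 'exceedances', 'loc', 'vario', 'thres', 'primeP' and 'vec' from the Examples
nll <- censoredLikelihoodBR(obs = exceedances, loc = loc, vario = vario,
                            u = thres, p = primeP, vec = vec, ntot = 1000)
attr(nll, "exponentMeasure")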
Author(s)
Raphael de Fondeville
References
Wadsworth, J. L. and J. A. Tawn (2014). Efficient inference for spatial extreme value processes associated to log-Gaussian random functions. Biometrika, 101(1), 1-15.
Asadi, P., A. C. Davison and S. Engelke (2015). Extremes on River Networks. Annals of Applied Statistics, 9(4), 2023-2050.
Examples
#Define semi-variogram function
vario <- function(h){
  0.5 * norm(h, type = "2")^1.5
}
#Define locations
loc <- expand.grid(1:4, 1:4)
#Simulate data
obs <- simulPareto(1000, loc, vario)
#Evaluate risk functional
maxima <- sapply(obs, max)
thres <- quantile(maxima, 0.9)
#Select exceedances
exceedances <- obs[maxima > thres]
#Compute generating vector
p <- 499
latticeRule <- genVecQMC(p, (nrow(loc) - 1))
primeP <- latticeRule$primeP
vec <- latticeRule$genVec
#Evaluate the censored negative log-likelihood
censoredLikelihoodBR(obs = exceedances, loc = loc, vario = vario,
                     u = thres, p = primeP, vec = vec, ntot = 1000)
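
#Illustrative follow-up (not from the package documentation): minimize the
#censored negative log-likelihood over the parameters of a power semi-variogram
#par[1] * ||h||^par[2]; starting values and box bounds are hypothetical.
nllObj <- function(par) {
  varioPar <- function(h) par[1] * norm(h, type = "2")^par[2]
  as.numeric(censoredLikelihoodBR(obs = exceedances, loc = loc, vario = varioPar,
                                  u = thres, p = primeP, vec = vec, ntot = 1000))
}
fit <- optim(par = c(0.5, 1.5), fn = nllObj, method = "L-BFGS-B",
             lower = c(0.01, 0.1), upper = c(10, 1.95))
fit$par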