influence.gp {kergp}    R Documentation
Diagnostics for a Gaussian Process Model, Based on Leave-One-Out
Description
Leave-one-out cross-validation diagnostics for a gp object.
Usage
## S3 method for class 'gp'
influence(model, type = "UK", trend.reestim = TRUE, ...)
Arguments
model
    An object of class "gp".
type
    Character string corresponding to the GP "kriging" family, to be chosen between simple kriging ("SK") and universal kriging ("UK").
trend.reestim
    Should the trend be re-estimated when removing an observation? Defaults to TRUE.
...
    Not used.
Details
Leave-one-out (LOO) consists of computing the prediction at a design point when the corresponding observation is removed from the learning set (and this, for all design points). A fast version of LOO based on Dubrule's formula is also implemented; it is limited to two cases:
(type == "SK") & !trend.reestim
and
(type == "UK") & trend.reestim.
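The sketch below illustrates the intended workflow on a toy one-dimensional example; it is not taken from the package, and the covTS kernel settings and gp() arguments are assumptions to be checked against the kergp documentation.

library(kergp)

## Toy deterministic data set (illustrative)
set.seed(123)
n <- 15
df <- data.frame(x = runif(n))
df$y <- df$x + sin(4 * pi * df$x)

## One-dimensional Matern 5/2 covariance; settings are illustrative only
myCov <- covTS(inputs = "x", kernel = "k1Matern5_2",
               dep = c(range = "input"), value = c(range = 0.3))

## Constant-trend GP fit, covariance parameters estimated by ML
fit <- gp(y ~ 1, data = df, inputs = "x", cov = myCov, estim = TRUE)

## Leave-one-out predictions, re-estimating the trend each time
## (one of the two cases covered by Dubrule's fast formula)
loo <- influence(fit, type = "UK", trend.reestim = TRUE)
str(loo)   ## list with components 'mean' and 'sd', each of length n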
Value
A list composed of the following elements, where n is the total number of observations.
mean
    Vector of length n. The i-th element is the kriging mean (including the trend) at the i-th design point when the corresponding observation is removed from the learning set.
sd
    Vector of length n. The i-th element is the kriging standard deviation at the i-th design point when the corresponding observation is removed from the learning set.
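Assuming the fit and loo objects from the sketch in the Details section, these components can be turned into standard diagnostics such as standardized LOO residuals and a Q2 predictivity coefficient; the code below is an illustrative sketch, not part of the package.

## Standardized LOO residuals and predictivity coefficient Q2
y <- df$y
resid.std <- (y - loo$mean) / loo$sd
Q2 <- 1 - sum((y - loo$mean)^2) / sum((y - mean(y))^2)

## Residuals roughly within [-2, 2] and Q2 close to 1 suggest a good fit
plot(loo$mean, resid.std,
     xlab = "LOO prediction mean", ylab = "standardized LOO residual")
abline(h = c(-2, 0, 2), lty = c(2, 1, 2))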
Warning
Only trend parameters are re-estimated when removing one
observation. When the number n
of observations is small, the
re-estimated values can be far away from those obtained with the
entire learning set.
Author(s)
O. Roustant, D. Ginsbourger.
References
F. Bachoc (2013), "Cross Validation and Maximum Likelihood estimations of hyper-parameters of Gaussian processes with model misspecification". Computational Statistics and Data Analysis, 66, 55-69.
N.A.C. Cressie (1993), Statistics for spatial data. Wiley series in probability and mathematical statistics.
O. Dubrule (1983), "Cross validation of Kriging in a unique neighborhood". Mathematical Geology, 15, 687-699.
J.D. Martin and T.W. Simpson (2005), "Use of kriging models to approximate deterministic computer models". AIAA Journal, 43 no. 4, 853-863.
M. Schonlau (1997), Computer experiments and global optimization. Ph.D. thesis, University of Waterloo.