pred_noisy_input {hetGP} | R Documentation
Gaussian process prediction at a noisy input x, with centered Gaussian noise of variance sigma_x. Several options are available, with different efficiency/accuracy tradeoffs.
Description
Gaussian process prediction at a noisy input x, with centered Gaussian noise of variance sigma_x. Several options are available, with different efficiency/accuracy tradeoffs.
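To fix notation, write m(.) and s^2(.) for the GP predictive mean and variance at a deterministic input, and let the actual input be X ~ N(x, sigma_x). The moments being approximated then follow from the laws of total expectation and variance (see the references below); this is only a notational sketch, not the package's internal parameterization:

\mu(x) = \mathrm{E}_{X \sim \mathcal{N}(x,\,\sigma_x)}\!\left[ m(X) \right],
\qquad
\sigma^2(x) = \mathrm{E}_{X}\!\left[ s^2(X) \right] + \mathrm{Var}_{X}\!\left[ m(X) \right].

As the names suggest, "exact" targets these moments directly (closed-form kernel expectations are available for the Gaussian kernel, for instance), while "simple" and "taylor" trade some accuracy for speed, the latter presumably via a Taylor expansion of the predictive moments around x as in the references below.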
Usage
pred_noisy_input(x, model, sigma_x, type = c("simple", "taylor", "exact"))
Arguments
x: design considered
model: GP model
sigma_x: input variance
type: available options include "simple", "taylor" and "exact" (see the sketch below and the Examples)
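As a quick illustration of the interface, here is a minimal sketch on toy data; it assumes the returned list exposes mean and sd2 fields, as used in the Examples below, and the toy design, noise level and input standard deviation of 0.05 are purely illustrative.

library(hetGP)
set.seed(1)
Xtoy <- matrix(seq(0, 1, length.out = 10), ncol = 1)  # toy design
Ztoy <- sin(2 * pi * Xtoy[, 1]) + rnorm(10, sd = 0.1) # toy observations
fit <- mleHomGP(Xtoy, Ztoy)                           # homoskedastic GP fit
x0 <- matrix(0.5)  # input of interest
sx2 <- 0.05^2      # input noise variance (sigma_x)
## compare the three approximations at the same noisy input
sapply(c("simple", "taylor", "exact"), function(tt)
  unlist(pred_noisy_input(x0, fit, sx2, type = tt)[c("mean", "sd2")]))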
Note
Beta version.
References
A. McHutchon and C.E. Rasmussen (2011), Gaussian process training with input noise, Advances in Neural Information Processing Systems, 1341-1349.
A. Girard, C.E. Rasmussen, J.Q. Candela and R. Murray-Smith (2003), Gaussian process priors with uncertain inputs: application to multiple-step ahead time series forecasting, Advances in Neural Information Processing Systems, 545-552.
Examples
################################################################################
### Illustration of prediction with input noise
################################################################################
library(hetGP)

## noise standard deviation function defined on [0,1]
noiseFun <- function(x, coef = 1.1, scale = 0.25){
  if(is.null(nrow(x))) x <- matrix(x, nrow = 1)
  return(scale * (coef + sin(x * 2 * pi)))
}

## data generating function combining mean and noise fields
ftest <- function(x, scale = 0.25){
  if(is.null(nrow(x))) x <- matrix(x, ncol = 1)
  return(f1d(x) + rnorm(nrow(x), mean = 0, sd = noiseFun(x, scale = scale)))
}
## dense test grid and replicated noisy evaluations (for reference)
ntest <- 101; xgrid <- seq(0, 1, length.out = ntest); Xgrid <- matrix(xgrid, ncol = 1)
set.seed(42)
Xpred <- Xgrid[rep(1:ntest, each = 100),, drop = FALSE]
Zpred <- matrix(ftest(Xpred), byrow = TRUE, nrow = ntest)

## training design: n unique locations, replicated up to N points in total
n <- 10
N <- 20
X <- matrix(seq(0, 1, length.out = n))
if(N > n) X <- rbind(X, X[sample(1:n, N - n, replace = TRUE),, drop = FALSE])
X <- X[order(X[,1]),, drop = FALSE]
Z <- apply(X, 1, ftest)
## left panel: data, true mean with noise bands, and homoskedastic GP fit
par(mfrow = c(1, 2))
plot(X, Z, ylim = c(-10, 15), xlim = c(-0.1, 1.1))
lines(xgrid, f1d(xgrid))
lines(xgrid, drop(f1d(xgrid)) + 2 * noiseFun(xgrid), lty = 3)
lines(xgrid, drop(f1d(xgrid)) - 2 * noiseFun(xgrid), lty = 3)

model <- mleHomGP(X, Z, known = list(beta0 = 0))
preds <- predict(model, Xgrid)
lines(xgrid, preds$mean, col = "red", lwd = 2)
## intervals on the latent mean (solid) and on new observations (dashed)
lines(xgrid, preds$mean - 2 * sqrt(preds$sd2), col = "blue")
lines(xgrid, preds$mean + 2 * sqrt(preds$sd2), col = "blue")
lines(xgrid, preds$mean - 2 * sqrt(preds$sd2 + preds$nugs), col = "blue", lty = 2)
lines(xgrid, preds$mean + 2 * sqrt(preds$sd2 + preds$nugs), col = "blue", lty = 2)
## noisy input of interest: X1 with input standard deviation sigmax
sigmax <- 0.1
X1 <- matrix(0.5)
lines(xgrid, dnorm(xgrid, X1, sigmax) - 10, col = "darkgreen") # input density (shifted down)

## Monte Carlo reference: sample noisy inputs, then outputs from the GP predictive
nmc <- 1000
XX <- matrix(rnorm(nmc, X1, sigmax))
pxx <- predict(model, XX)
YXX <- rnorm(nmc, mean = pxx$mean, sd = sqrt(pxx$sd2 + pxx$nugs))
points(XX, YXX, pch = '.')

## right panel: empirical distribution of the output at the noisy input
hh <- hist(YXX, breaks = 51, plot = FALSE)
dd <- density(YXX)
plot(hh$density, hh$mids, ylim = c(-10, 15))
lines(dd$y, dd$x)
## GP predictions at the noisy input with the three available approximations
pin1 <- pred_noisy_input(X1, model, sigmax^2, type = "exact")
pin2 <- pred_noisy_input(X1, model, sigmax^2, type = "taylor")
pin3 <- pred_noisy_input(X1, model, sigmax^2, type = "simple")
ygrid <- seq(-10, 15, length.out = ntest)
lines(dnorm(ygrid, pin1$mean, sqrt(pin1$sd2)), ygrid, lty = 2, col = "orange")
lines(dnorm(ygrid, pin2$mean, sqrt(pin2$sd2)), ygrid, lty = 2, col = "violet")
lines(dnorm(ygrid, pin3$mean, sqrt(pin3$sd2)), ygrid, lty = 2, col = "grey")
abline(h = mean(YXX), col = "red") # empirical mean of the MC reference
par(mfrow = c(1, 1))