ALM_RNAmf {RNAmf}    R Documentation

Find the next point by the ALM criterion

Description

The function acquires a new point by the active learning MacKay (ALM) criterion. It calculates the ALM criterion \frac{\sigma^{*2}_l(\bm{x})}{\sum^l_{j=1}C_j}, where \sigma^{*2}_l(\bm{x}) is the posterior predictive variance at fidelity level l and C_j is the simulation cost at level j. For details, see Heo and Sung (2023+, <arXiv:2309.11772>).

A new point is acquired from Xcand. If Xcand=NULL, a new point is acquired on the unit hypercube [0,1]^d.
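
The criterion above can be sketched directly: for each candidate point, the predictive variance at level l is divided by the cumulative cost of running levels 1 through l. Below is a minimal base-R illustration; alm_criterion and the variance values are hypothetical, not the package implementation.

```r
# Sketch of the ALM criterion: sigma^{*2}_l(x) / (C_1 + ... + C_l).
# 'sig2' is an n x L matrix of posterior predictive variances
# (one column per fidelity level); 'cost' gives the per-level costs.
alm_criterion <- function(sig2, cost) {
  sweep(sig2, 2, cumsum(cost), "/")  # divide column l by C_1 + ... + C_l
}

sig2 <- cbind(c(0.40, 0.10), c(0.90, 0.30))  # hypothetical variances
alm_criterion(sig2, cost = c(1, 3))          # level-2 column divided by 1 + 3 = 4
```

The level and point maximizing this ratio give the most informative next evaluation per unit of simulation cost.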

Usage

ALM_RNAmf(Xcand = NULL, fit, cost = NULL, optim = TRUE, parallel = FALSE, ncore = 1)

Arguments

Xcand

vector or matrix of the candidate set that can be added into the current design; only used when optim=FALSE. Xcand is the set of points where the ALM criterion is evaluated. If Xcand=NULL, 100 \times d points on [0,1]^d are generated by Latin hypercube design. Default is NULL.

fit

object of class RNAmf.

cost

vector of the costs for each level of fidelity. If cost=NULL, the costs at all levels are set to 1. cost should be a vector of positive values in ascending order. Default is NULL.

optim

logical indicating whether to optimize the ALM criterion via the gradient-based L-BFGS-B method of optim. If optim=TRUE, 5 \times d starting points are generated by Latin hypercube design for the optimization. If optim=FALSE, the ALM criterion is evaluated on Xcand. Default is TRUE.

parallel

logical indicating whether to compute the ALM criterion in parallel. If parallel=TRUE, parallel computation is used. Default is FALSE.

ncore

the number of cores for parallel computation. Only used if parallel=TRUE. Default is 1.
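
For intuition about the Latin hypercube designs mentioned for Xcand and optim, here is an illustrative base-R construction; simple_lhs is a hypothetical helper, not the generator the package uses internally.

```r
# Illustrative Latin hypercube sample on [0,1]^d: each column places
# exactly one point in each of the n equal-width strata, with the
# strata assigned in an independent random order per dimension.
simple_lhs <- function(n, d) {
  sapply(seq_len(d), function(j) (sample(n) - runif(n)) / n)
}

d <- 1
Xcand_default <- simple_lhs(100 * d, d)  # 100 * d candidate points, as in the default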

Value

A list of the next design point and related quantities. The components used in the examples below include ALM (the ALM criterion values at each candidate point for each fidelity level; ALM1 and ALM2 for a two-level fit), chosen (containing Xnext, the next point to be evaluated), and time (the computation time).

Examples


library(lhs)
library(doParallel)
library(foreach)

### simulation costs ###
cost <- c(1, 3)

### 1-d Perdikaris function in Perdikaris et al. (2017) ###
# low-fidelity function
f1 <- function(x) {
  sin(8 * pi * x)
}

# high-fidelity function
f2 <- function(x) {
  (x - sqrt(2)) * (sin(8 * pi * x))^2
}

### training data ###
n1 <- 13
n2 <- 8

### fix seed to reproduce the result ###
set.seed(1)

### generate initial nested design ###
X <- NestedX(c(n1, n2), 1)
X1 <- X[[1]]
X2 <- X[[2]]

### n1 and n2 might be changed from NestedX ###
### assign n1 and n2 again ###
n1 <- nrow(X1)
n2 <- nrow(X2)

y1 <- f1(X1)
y2 <- f2(X2)

### n=100 uniform test data ###
x <- seq(0, 1, length.out = 100)

### fit an RNAmf ###
fit.RNAmf <- RNAmf_two_level(X1, y1, X2, y2, kernel = "sqex")

### predict ###
pred <- predict(fit.RNAmf, x)
predy <- pred$mu
predsig2 <- pred$sig2

### active learning with optim=TRUE ###
alm.RNAmf.optim <- ALM_RNAmf(
  Xcand = x, fit.RNAmf, cost = cost,
  optim = TRUE, parallel = TRUE, ncore = 2
)
print(alm.RNAmf.optim$time) # computation time of optim=TRUE

### active learning with optim=FALSE ###
alm.RNAmf <- ALM_RNAmf(
  Xcand = x, fit.RNAmf, cost = cost,
  optim = FALSE, parallel = TRUE, ncore = 2
)
print(alm.RNAmf$time) # computation time of optim=FALSE

### visualize ALM ###
oldpar <- par(mfrow = c(1, 2))
plot(x, alm.RNAmf$ALM$ALM1,
  type = "l", lty = 2,
  xlab = "x", ylab = "ALM criterion at the low-fidelity level",
  ylim = c(min(c(alm.RNAmf$ALM$ALM1, alm.RNAmf$ALM$ALM2)),
           max(c(alm.RNAmf$ALM$ALM1, alm.RNAmf$ALM$ALM2)))
)
points(alm.RNAmf$chosen$Xnext,
  alm.RNAmf$ALM$ALM1[which(x == drop(alm.RNAmf$chosen$Xnext))],
  pch = 16, cex = 1, col = "red"
)
plot(x, alm.RNAmf$ALM$ALM2,
  type = "l", lty = 2,
  xlab = "x", ylab = "ALM criterion at the high-fidelity level",
  ylim = c(min(c(alm.RNAmf$ALM$ALM1, alm.RNAmf$ALM$ALM2)),
           max(c(alm.RNAmf$ALM$ALM1, alm.RNAmf$ALM$ALM2)))
)
par(oldpar)


[Package RNAmf version 0.1.2 Index]