update.dynaTree {dynaTree}    R Documentation

Updating a Dynamic Tree Model With New Data

Description

Updating an already-initialized dynamic tree model with new input/output pairs, primarily to facilitate sequential design and optimization applications

Usage

## S3 method for class 'dynaTree'
update(object, X, y, verb = round(length(y)/10), ...)

Arguments

object

a "dynaTree"-class object built by dynaTree

X

an augmenting design matrix of real-valued predictors with ncol(X) = object$m

y

an augmenting vector of real-valued responses or integer categories with length(y) = nrow(X)

verb

a positive scalar integer indicating the number of time steps (iterations) after which a progress statement is printed to the console; a value of verb = 0 is quiet

...

to comply with the generic update method – currently unused

Details

This function updates the dynaTree fit with new (X,y) pairs by the Particle Learning (PL) algorithm. The updated fit will be for data combined as rbind(object$X, X) and c(object$y, y).
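
As a small sketch of that combination (here X1, y1 and X2, y2 are hypothetical data batches with matching columns, not objects supplied by the package):

## fit the first batch, then fold the second batch in via PL
obj <- dynaTree(X=X1, y=y1, N=1000, model="linear")
obj <- update(obj, X2, y2)
## obj is now a fit for the combined data,
## i.e., rbind(X1, X2) and c(y1, y2)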

The primary use of this function is to facilitate sequential design by optimization and active learning. Typically one would use predict.dynaTree to estimate active learning statistics at candidate locations. These are used to pick new (X,y) locations to add to the design – the new fit being facilitated by this function; see the examples below.
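
A compressed sketch of such a loop (reusing obj, XX, and f1d as defined in the Examples below, with the ALM heuristic of adding the candidate with the largest predictive variance):

## sequential design: predict at candidates, choose, update
for(i in 1:10) {
  obj <- predict(obj, XX, quants=FALSE)  ## predictive stats at candidates
  m <- which.max(obj$var)                ## ALM: largest predictive variance
  xstar <- drop(obj$XX[m,])
  obj <- update(obj, xstar, f1d(xstar))  ## fold the new (x,y) pair into the fit
}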

Value

The returned list has the same structure as one returned by dynaTree – i.e., it is a "dynaTree"-class object.

Note

The object (object) must contain a pointer to a particle cloud (object$num) which has not been deleted by deletecloud. In particular, it cannot be an object returned from dynaTrees.
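
A defensive sketch (assuming the convention from the Examples below, where obj$num is set to NULL right after deletecloud):

## only update while the particle cloud is still attached
if(!is.null(obj$num)) obj <- update(obj, xstar, ystar)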

Author(s)

Robert B. Gramacy rbg@vt.edu,
Matt Taddy and Christoforos Anagnostopoulos

References

Taddy, M.A., Gramacy, R.B., and Polson, N. (2011). “Dynamic trees for learning and design.” Journal of the American Statistical Association, 106(493), pp. 109-123; arXiv:0912.1586

Anagnostopoulos, C. and Gramacy, R.B. (2013). “Information-Theoretic Data Discarding for Dynamic Trees on Data Streams.” Entropy, 15(12), 5510-5535; arXiv:1201.5568

Carvalho, C., Johannes, M., Lopes, H., and Polson, N. (2008). “Particle Learning and Smoothing”. Discussion Paper 2008-32, Duke University Dept. of Statistical Science.

https://bobby.gramacy.com/r_packages/dynaTree/

See Also

predict.dynaTree, dynaTree, plot.dynaTree, deletecloud, getBF

Examples

## simple function describing (x,y) data
f1d <- function(x, sd=0.1){
  return( sin(x) - dcauchy(x, 1.6, 0.15) + rnorm(length(x), 0, sd))
} 

## initial (x,y) data
X <- seq(0, 7, length=30)
y <- f1d(X)

## PL fit to initial data
obj <- dynaTree(X=X, y=y, N=1000, model="linear")

## a predictive grid
XX <- seq(0,7, length=100)
obj <- predict(obj, XX, quants=FALSE)

## follow the ALM algorithm and choose the next
## point with the highest predictive variance
m <- which.max(obj$var)
xstar <- drop(obj$XX[m,])
ystar <- f1d(xstar)

## plot the next chosen point
par(mfrow=c(2,1))
plot(obj, ylab="y", xlab="x", main="fitted surface")
points(xstar, ystar, col=3, pch=20)
plot(obj$XX, sqrt(obj$var), type="l", xlab="x",
     ylab="predictive sd", main="active learning")

## update the fit with (xstar, ystar)
obj <- update(obj, xstar, ystar)

## new predictive surface
obj <- predict(obj, XX, quants=FALSE)

## plotted
plot(obj, ylab="y", xlab="x", main="updated fitted surface")
plot(obj$XX, sqrt(obj$var), type="l", xlab="x",
     ylab="predictive sd", main="active learning")

## delete the cloud to prevent a memory leak
deletecloud(obj); obj$num <- NULL

## see demo("design") for more iterations and
## design under other active learning heuristics
## like ALC, and EI for optimization; also see
## demo("online") for an online learning example
