BayesTreePrior {BayesTreePrior}    R Documentation
Simulation of the tree prior.
Description
This is the main function to use for simulating from the prior. There are 4 cases:

Case #1: Unrealistic case where we assume that the number of variables and possible splits are infinite (therefore P(T) is not dependent on the design matrix X) and \beta = 0.

Case #2: Unrealistic case where we assume that the number of variables and possible splits are infinite (therefore P(T) is not dependent on the design matrix X).

Case #3: One variable with a finite number of observations (seems to be equivalent to the multiple-variables case when all variables are continuous).

Case #4: General case.
Case #1 will be used if no design matrix X or number of observations is given and \beta = 0. Case #2 will be used if no design matrix X or number of observations is given and \beta \neq 0. Case #3 will be used if no design matrix X is given but the number of observations is given. Case #4 will be used if the design matrix X is given. Note that case #4 is always slower, so if all your variables are continuous, it is advisable to enter the number of unique observations rather than the design matrix X, so that case #3 can be used.
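For context, the tree prior simulated here is the one of Chipman, George & McCulloch (1998; see References): a node at depth d is split with probability

\alpha (1 + d)^{-\beta},

so \alpha acts as the base probability of a split and \beta controls how quickly that probability decays with depth.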
Usage
BayesTreePrior(alpha, beta, X = NULL, n_obs = NULL, n_iter = 500,
minpart = 1, package = NULL, pvars = NULL, MIA = FALSE,
missingdummy = FALSE)
Arguments
alpha
    base parameter of the tree prior.

beta
    power parameter of the tree prior.

X
    data.frame of the design matrix (required for case #4).

n_obs
    number of unique observations (required for case #3).

n_iter
    number of trees to generate.

minpart
    the minimum number of observations required in one of the child nodes for a split to be allowed.

package
    an optional string that can take the following values: "BayesTree", "tgp" or "bartMachine". It forces the algorithm to use the default parameters of the specified package.

pvars
    vector of probabilities for the choice of the variable to split on (it will automatically be normalized so that it sums to 1). It must be twice as large as the number of variables when missingdummy is set to TRUE.

MIA
    set to TRUE if you want Missing Incorporated in Attributes (MIA) imputation to be used.

missingdummy
    set to TRUE if you want the NAs to be dummy coded.
Value
In case #1, it returns a list containing, in the following order: the expectation and the variance of the number of bottom nodes. In cases #2, #3 or #4, it returns a list containing, in the following order: the mean number of bottom nodes, the standard deviation of the number of bottom nodes, the mean of the depth, the standard deviation of the depth, and a data.frame of the vectors (b_i, d_i), where b_i is the number of bottom nodes and d_i is the depth of the ith generated tree (i = 1, \ldots, n_{iter}).
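For illustration only (the component names are not listed here, so this minimal sketch accesses the returned list by position, following the order described above):

res <- BayesTreePrior(0.95, 0.5, n_obs = 150)  # case #3
mean_bottom <- res[[1]]  # mean number of bottom nodes
sd_bottom   <- res[[2]]  # standard deviation of the number of bottom nodes
mean_depth  <- res[[3]]  # mean of the depth
sd_depth    <- res[[4]]  # standard deviation of the depth
trees       <- res[[5]]  # data.frame with one row (b_i, d_i) per generated tree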
References
Chipman, H. A., George, E. I., & McCulloch, R. E. (1998). Bayesian CART model search. Journal of the American Statistical Association, 93(443), 935-948.
Gramacy, R. B. (2007). tgp: an R package for Bayesian nonstationary, semiparametric nonlinear regression and design by treed Gaussian process models. Journal of Statistical Software, 19(9), 6.
Chipman, H. A., George, E. I., & McCulloch, R. E. (2010). BART: Bayesian additive regression trees. The Annals of Applied Statistics, 266-298.
Kapelner, A., & Bleich, J. (2013). bartMachine: A powerful tool for machine learning. stat, 1050, 8.
Twala, B. E. T. H., Jones, M. C., & Hand, D. J. (2008). Good methods for coping with missing data in decision trees. Pattern Recognition Letters, 29(7), 950-956.
Jolicoeur-Martineau, A. (expected 2016). Etude d'une loi a priori pour les arbres binaires de regression (Study on the prior distribution of binary regression trees) (Master's thesis). Université du Québec à Montréal (UQAM).
Examples
# Case 1: Unrealistic case where we assume that the number of var/obs is infinite and beta = 0
results1 <- BayesTreePrior(0.45, 0)
# Case 2: Unrealistic case where we assume that the number of var/obs is infinite
results2 <- BayesTreePrior(0.95, 0.5)
# Case 3: One variable with a finite number of observations
results3 <- BayesTreePrior(0.95, 0.5, n_obs = 150)
if (requireNamespace("MASS", quietly = TRUE)) {
  # Case 4: General case, without missing data
  x1 <- MASS::mcycle$times
  x2 <- MASS::mcycle$accel
  X <- cbind(x1, x2)
  results4_nomiss <- BayesTreePrior(0.95, 0.5, data.frame(X), minpart = 5, package = "tgp")
  # Case 4: General case, with missing data
  x1[sample(1:length(x1), 20)] <- NA
  x2[sample(1:length(x2), 20)] <- NA
  X <- cbind(x1, x2)
  results4_miss <- BayesTreePrior(0.95, 0.5, data.frame(X), minpart = 5, package = "tgp",
                                  MIA = TRUE, missingdummy = TRUE)
}
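# A minimal sketch of the pvars argument, using hypothetical weights: pvars gives the
# probabilities of choosing each variable to split on and is normalized internally,
# so c(3, 1) makes x1 three times as likely to be chosen as x2.
if (requireNamespace("MASS", quietly = TRUE)) {
  X <- data.frame(x1 = MASS::mcycle$times, x2 = MASS::mcycle$accel)
  results4_pvars <- BayesTreePrior(0.95, 0.5, X, minpart = 5, pvars = c(3, 1))
}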