fhc {SELF} | R Documentation |
Fast Hill-Climbing
Description
Learns a causal structure from data by fast hill-climbing.
Usage
fhc(D, G = NULL, min_increase = 0.01, score_type = "bic", file = "",
verbose = TRUE, save_model = FALSE, bw = "nrd0", booster = "gbtree",
gamma = 10, nrounds = 30, ...)
Arguments
D |
The input data: a data frame with one column per variable. |
G |
An initial graph for hill climbing. Default: empty graph. |
min_increase |
The minimum score increase required to accept an edge change; larger values give faster convergence. |
score_type |
The score used to learn the causal structure: one of "bic", "log", or "aic". Default: "bic". |
file |
Path to the output folder where the model is saved at each iteration. |
verbose |
Whether to show a progress bar for each iteration. |
save_model |
Whether to save metadata at each iteration so that progress can be restored and the model evaluated mid-run. |
bw |
The smoothing bandwidth passed to stats::density (kernel density estimation). Default: "nrd0". |
booster |
The regression method: one of "lm", "gbtree", or "gblinear". "lm" and "gblinear" are linear regression methods; "gbtree" is a nonlinear, tree-based method. Default: "gbtree". |
gamma |
Parameter passed to xgboost: the minimum loss reduction required to make a further partition on a leaf node of the tree. The larger gamma is, the more conservative the algorithm will be. |
nrounds |
The maximum number of trees for xgboost. Default: 30. |
... |
Other parameters passed to xgboost; see help(xgboost). |
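The bw argument is forwarded to stats::density, so it accepts the same bandwidth specifications that function does. As a standalone sketch of that underlying call (using base R only, not fhc itself):

```r
# Kernel density estimate with the default bandwidth rule "nrd0";
# fhc() forwards its `bw` argument to stats::density() like this.
d <- density(rnorm(1000), bw = "nrd0")
d$bw  # the numeric bandwidth that the "nrd0" rule selected
```

Any value accepted by stats::density (a number, or a rule such as "SJ") should therefore work here as well.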
Value
The adjacency matrix of the learned causal structure.
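Assuming the common convention that entry [i, j] == 1 encodes a directed edge from variable i to variable j (an assumption; verify against a known graph such as the chains in the examples below), the result can be inspected like this:

```r
# Hypothetical adjacency matrix for the chain x -> y -> z;
# the convention that A[i, j] == 1 means an edge i -> j is an assumption.
A <- matrix(c(0, 1, 0,
              0, 0, 1,
              0, 0, 0),
            nrow = 3, byrow = TRUE,
            dimnames = list(c("x", "y", "z"), c("x", "y", "z")))
which(A == 1, arr.ind = TRUE)  # lists the edges as (row, col) index pairs
```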
Examples
## Not run:
# Nonlinear data: x -> y -> z
set.seed(0)
x <- rnorm(4000)
y <- x^2 + runif(4000, -1, 1) * 0.1
z <- y^2 + runif(4000, -1, 1) * 0.1
data <- data.frame(x, y, z)
fhc(data, gamma = 10, booster = "gbtree")
# Linear data: x -> y -> z
set.seed(0)
x <- rnorm(4000)
y <- 3 * x + runif(4000, -1, 1) * 0.1
z <- 3 * y + runif(4000, -1, 1) * 0.1
data <- data.frame(x, y, z)
fhc(data, booster = "lm")
# Random graph with linear data
set.seed(0)
G <- randomGraph(dim = 10, indegree = 1.5)
data <- synthetic_data_linear(G = G, sample_num = 4000)
fitG <- fhc(data, booster = "lm")
indicators(fitG, G)
## End(Not run)