Technical description of Stage 1
Denote by D = (Y, X) the n x (p + 1) matrix corresponding to the data set of interest, where n is the number of individuals, Y is the binary dependent variable and X is the n x p matrix containing the predictor variables.
Stage 1. Iterative elimination process:
First step:
Denote by D1 the initial data set, D1 = D, and by p1 = p the initial number of predictor variables.
Build a Random Forest RF1 on D1, that is, using all predictor variables and the response (see subsection Random Forest parameters below).
Obtain the ranking of the predictor variables using the chosen measure of importance (see subsection Random Forest importance measures for details). Denote by r = (r1, ..., rp) the ranking vector of the variables X1, ..., Xp.
Compute the out-of-bag AUC (OOB-AUC) of the Random Forest RF1, namely OOB-AUC1 (see subsection Random Forest prediction and AUC computation for details).
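The OOB-AUC can be computed directly from the out-of-bag class-1 vote proportions via the Mann-Whitney statistic: it equals the probability that a randomly chosen case receives a higher predicted probability than a randomly chosen control. A minimal, library-free sketch (the function name `oob_auc` is ours, not from this description):

```python
def oob_auc(y, prob):
    """AUC of predicted probabilities `prob` against binary labels `y`,
    computed as the normalized Mann-Whitney U statistic (ties count 1/2)."""
    pos = [p for yi, p in zip(y, prob) if yi == 1]  # cases (Y = 1)
    neg = [p for yi, p in zip(y, prob) if yi == 0]  # controls (Y = 0)
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Example: two controls and two cases
oob_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])  # → 0.75
```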
Subsequent steps. Step j, j > 1:
Based on the initial ranking r, remove a fraction (by default 20%) of the least important variables from Xj-1 and denote the resulting matrix of predictors as Xj.
Denote by Dj the reduced data set: Dj = (Y, Xj).
Build a Random Forest on Dj, namely RFj.
Compute the OOB-AUC of the Random Forest RFj, namely OOB-AUCj.
Repeat step j until
the number of remaining variables is less than or equal to k0 (by default k0 = 1).
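The elimination loop above can be sketched as follows. Here `fit_rf` is a hypothetical stand-in for fitting a Random Forest and returning an importance ranking plus its OOB-AUC; it is stubbed so that only the control flow is shown, and a real implementation would call a Random Forest library at that point.

```python
def fit_rf(variables):
    """Hypothetical stub: returns the variables ranked from most to least
    important, and the forest's OOB-AUC (both placeholders here)."""
    return sorted(variables), 0.5

def stage1(variables, drop_frac=0.2, k0=1):
    """Backward elimination driven by the initial importance ranking r:
    at each step, drop the least important 20% of the remaining variables
    and refit, until at most k0 variables remain."""
    ranked, auc = fit_rf(variables)      # first step: rank once, keep r fixed
    history = [(len(ranked), auc)]       # (number of variables, OOB-AUC)
    current = ranked
    while len(current) > k0:
        n_drop = max(1, int(drop_frac * len(current)))
        current = current[:-n_drop]      # remove the least important ones
        _, auc = fit_rf(current)         # refit on the reduced set
        history.append((len(current), auc))
    return history

hist = stage1([f"X{i}" for i in range(100)])  # 100 → 80 → 64 → ... → 1
```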
Technical description of Stage 4
For ease of notation we illustrate this process in the case of a 5-fold cross-validation (CV) process that is repeated 20 times.
For m = 1, ..., M = 20 repeat a 5-fold CV process consisting of the following steps:
Divide the original data set into 5 disjoint subsets of approximately equal size: D1, ..., D5.
For j = 1, ..., J = 5
Perform the AUC-RF feature selection on the learning data set, D \ Dj (all individuals except those in Dj).
Let RF*j denote the optimal Random Forest (after feature elimination) and V*j the set of selected variables.
Use RF*j to predict the individuals in the test data set Dj (see subsection Random Forest prediction and AUC computation). This provides a vector of probabilities, Pj, corresponding to the proportion of trees yielding Y = 1.
Join the predictions of the 5 CV subsets, P1, ..., P5, and compute the AUC of these predictions, denoted by CV-AUCm.
Compute the mean CV-AUC = (CV-AUC1 + ... + CV-AUCM) / M.
For each variable Xk, compute its probability of selection as the proportion of the 5 x M = 100 feature-selection runs in which Xk was selected by the AUC-RF method.
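Given the selected variable sets from all cross-validation runs, the selection probabilities are simple proportions. A small sketch (the selected sets below are invented for illustration, and only 4 runs are shown instead of the 5 x 20 = 100 described above):

```python
def selection_probabilities(selected_sets, variables):
    """`selected_sets` holds the set of variables chosen in each CV run;
    a variable's selection probability is the fraction of runs choosing it."""
    n_runs = len(selected_sets)
    return {v: sum(v in s for s in selected_sets) / n_runs for v in variables}

# Toy example: 4 feature-selection runs
runs = [{"X1", "X2"}, {"X1"}, {"X1", "X3"}, {"X2"}]
probs = selection_probabilities(runs, ["X1", "X2", "X3"])
# probs == {"X1": 0.75, "X2": 0.5, "X3": 0.25}
```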
Random Forest parameters
AUC-RF uses Random Forest with the default parameters of the R package randomForest. The most relevant specifications are ntree = 500 (the number of trees in a forest is 500), mtry = sqrt(p) (the number of candidate variables considered at each node is the square root of the total number of variables in the current forest), replace = TRUE, nodesize = 1, maxnodes = NULL, importance = FALSE and norm.votes = TRUE (see the randomForest documentation for details). The out-of-bag process of AUC-RF is as in the standard use of Random Forest: bootstrap samples are drawn with replacement, so about one third of the cases are left out of each tree. These default values can be modified when the randomForest function is called.
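For readers outside R, the same defaults can be approximated with scikit-learn; this is an illustrative sketch under that assumption, not the authors' implementation, and the synthetic data comes from `make_classification`:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

# Synthetic binary-response data standing in for (Y, X)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

rf = RandomForestClassifier(
    n_estimators=500,     # ntree = 500
    max_features="sqrt",  # mtry = sqrt(p) for classification
    bootstrap=True,       # replace = TRUE (bootstrap with replacement)
    min_samples_leaf=1,   # nodesize = 1
    max_leaf_nodes=None,  # maxnodes = NULL
    oob_score=True,       # retain out-of-bag predictions
    random_state=0,
).fit(X, y)

# OOB-AUC: AUC of the out-of-bag predicted probabilities of class 1
oob_auc = roc_auc_score(y, rf.oob_decision_function_[:, 1])
```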
",-456789>?FGHIJKlmѱѕх}ujGhHkUjNhHkUjQFhHkUjNhHkUjWhHkUjZPhHkUh\*OJQJmH sH j
hHkUjhHkUjShHkUjhHkUjh7 hHkOJQJUh\*hHkOJQJmH sH h\*hHk5OJQJmH sH ."1 Y f X
VI
&FEƀ*fgdHkI
&FEƀ*fgdHk
^gdHkm$gdHkgdHkm$? 1 d f p q x y z { | } ٫ٛٶ٫كٶ٫{sٶ٫kjhHkUjhHkUj>hHkUjhHkUj2hHkUh\*OJQJj@hHkUj2hHkUh7 hHkOJQJh8 hHkOJQJh\*hHk6OJQJmH sH jhHkUjh7 hHkOJQJUh\*hHkOJQJmH sH j:HhHkU( 7
N
&'./0123novwxyz{ݶݶҮҞҎzݶݶh8 hHkH*OJQJjhHkUjNhHkUjEhHkUjhHkUjhHkUj\>hHkUh8 hHk6OJQJjhHkUj^hHkUh7 hHkOJQJh8 hHkOJQJjh7 hHkOJQJUjChHkU,X
4kfgdHkI
&FEƀ*fgdHkI
&FEƀ*fgdHkbcjklmno
ŽеŭХŝЕߍŅ}ujBhHkUjhHkUj9hHkUh\*OJQJjI
hHkUj
hHkUjc?
hHkUj hHkUj\ hHkUjD hHkUh7 hHkOJQJjh7 hHkOJQJUh8 hHkOJQJh8 hHk6OJQJh\*6OJQJ.*
k!I
&FEƀ*fgdHkI
&FEƀ*fgdHkI
&FEƀ*fgdHk
#
$
%
&
'
(
T
U
\
]
^
_
`
a
q
r
u
W\Һݦݚݚݚ݁rfrh\*OJQJmH sH h\*hHkOJQJmH sH h\*hHk5OJQJmH sH h\*6OJQJh8 hHk6OJQJh8 hHkH*OJQJjc
hHkUj9
hHkUjhHkUjJhHkUh7 hHkOJQJh8 hHkOJQJjh7 hHkOJQJUj>hHkU$*
u
a[[[gdHkm$I
&FEƀ*fgdHk
^gdHkm$I
&FEƀ*fgdHk
$%&'0123opuv}~ӸȰӨȠӘȐӈj,7hHkUjhHkUjhHkUj3hHkUjhHkUjhHkUjj5hHkUj
hHkUh7 hHkOJQJjh7 hHkOJQJUh8 h\*OJQJh8 hHkOJQJh\*OJQJ5\Ld7$8$H$^gdHkU
&Fd7$8$Eƀ*f.H$^gdHkM
&F8Eƀ*f^8gdHk56HR
&F7$8$Eƀ*foH$^gdHkd7$8$H$^gdHkU
&Fd7$8$Eƀ*f.H$^gdHk
?@DELMNOPQXYc"%./6ڪڃ{jhHkUjZhHkUh8 hHk6OJQJh8 h\*OJQJjm4hHkUjzhHkUh\*OJQJjLhHkUj6hHkUj<hHkUjZhHkUh7 hHkOJQJh8 hHkOJQJjh7 hHkOJQJU0[R
&F7$8$Eƀ*foH$^gdHkR
&F7$8$Eƀ*foH$^gdHk6789:;=>EFGHIJ?@A]ݺʯʣݓ݃{lh8 hHkCJOJQJaJjhHkUjDhHkUjpFhHkUj1hHkUjhHkUh8 hHkH*OJQJh8 h\*OJQJj4hHkUjJhHkUh8 hHkOJQJj`hHkUjh7 hHkOJQJUh7 hHkOJQJjp1hHkU%GU
&F8d7$8$Eƀ*fH$\$^8gdHkjd7$8$H$^jgdHkR
&Fj7$8$Eƀ*f.H$^jgdHkA\]>?gdHkgdHkm$U
&F8d7$8$Eƀ*fH$\$^8gdHk] !ô}}}}#h\*B*OJQJ^JmH phsH )h\*hHkB*OJQJ^JmH phsH jhHkUjYhHkUjh7 hHkOJQJUh\*OJQJmH sH h\*hHkH*OJQJmH sH "h\*hHk6OJQJ]mH sH h\*hHkOJQJmH sH .)=>?ͼh\*hh+mH sH #h\*hHk6OJQJ^JmH sH h\*hHkOJQJ^JmH sH )h\*hHkB*OJQJ^JmH phsH h\*OJQJ]mH sH h\*hHkOJQJ]mH sH ,1h. A!"#$n%SDd
Q
SA@?#"JQ>Q
12