betatree {betareg}        R Documentation
Beta Regression Trees
Description
Fit beta regression trees via model-based recursive partitioning.
Usage
betatree(formula, partition,
data, subset = NULL, na.action = na.omit, weights, offset, cluster,
link = "logit", link.phi = "log", control = betareg.control(),
...)
Arguments
formula
    symbolic description of the model of type y ~ x or y ~ x | z,
    where the first part specifies the regressors of the mean submodel
    and the (optional) second part those of the precision submodel.

partition
    symbolic description of the partitioning variables,
    e.g., ~ p1 + p2.

data, subset, na.action, weights, offset, cluster
    arguments controlling data/model processing passed to mob.

link
    character specification of the link function in the mean model
    (mu). Currently, "logit", "probit", "cloglog", "cauchit", "log",
    "loglog" are supported.

link.phi
    character specification of the link function in the precision
    model (phi). Currently, "identity", "log", "sqrt" are supported.

control
    a list of control arguments for the beta regression specified via
    betareg.control.

...
    further control arguments for the recursive partitioning passed
    to mob_control.
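To illustrate how these arguments fit together, the following is a minimal
sketch (the data frame d and the variables y, x, z, p1, p2 are invented for
illustration and are not part of the package or its examples):

library("betareg")

## simulate a small data set: a response in (0, 1), regressors for the
## mean (x) and precision (z) submodels, and candidate partitioning
## variables (p1, p2) -- all names are hypothetical placeholders
set.seed(1)
d <- data.frame(
  y  = rbeta(100, 5, 2),
  x  = rnorm(100),
  z  = rnorm(100),
  p1 = gl(2, 50),
  p2 = runif(100)
)

## fit a beta regression tree, passing link and control arguments
## explicitly; minsize is forwarded to the recursive partitioning
bt0 <- betatree(y ~ x | z, ~ p1 + p2, data = d,
  link = "logit", link.phi = "log", control = betareg.control(),
  minsize = 20)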
Details
Beta regression trees are an application of model-based recursive
partitioning (implemented in mob, see Zeileis et al. 2008) to beta
regression (implemented in betareg, see Cribari-Neto and Zeileis 2010).
See also Grün et al. (2012) for more details.

Various methods are provided for "betatree" objects, most of which
inherit their behavior from "mob" objects (e.g., print, summary,
coef, etc.). The plot method employs the node_bivplot
panel-generating function.
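For a fitted tree such as bt from the Examples below, typical use of these
methods might look as follows (an illustrative sketch, not additional
functionality):

print(bt)              ## text display of the fitted tree
coef(bt)               ## coefficients of the terminal-node models
summary(bt, node = 2)  ## beta regression summary within node 2
plot(bt)               ## tree display with node_bivplot panels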
Value
betatree() returns an object of S3 class "betatree" which inherits
from "modelparty".
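For example, the class structure of a fitted tree (bt as in the Examples
below) can be checked briefly:

class(bt)                    ## c("betatree", "modelparty", "party")
inherits(bt, "modelparty")   ## TRUE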
References
Cribari-Neto, F., and Zeileis, A. (2010). Beta Regression in R. Journal of Statistical Software, 34(2), 1–24. doi:10.18637/jss.v034.i02
Grün, B., Kosmidis, I., and Zeileis, A. (2012). Extended Beta Regression in R: Shaken, Stirred, Mixed, and Partitioned. Journal of Statistical Software, 48(11), 1–25. doi:10.18637/jss.v048.i11
Zeileis, A., Hothorn, T., and Hornik, K. (2008). Model-Based Recursive Partitioning. Journal of Computational and Graphical Statistics, 17(2), 492–514.
See Also

betareg, mob
Examples
options(digits = 4)
suppressWarnings(RNGversion("3.5.0"))
## data with two groups of dyslexic and non-dyslexic children
data("ReadingSkills", package = "betareg")
## additional random noise (not associated with reading scores)
set.seed(1071)
ReadingSkills$x1 <- rnorm(nrow(ReadingSkills))
ReadingSkills$x2 <- runif(nrow(ReadingSkills))
ReadingSkills$x3 <- factor(rnorm(nrow(ReadingSkills)) > 0)
## fit beta regression tree: in each node
## - accuracy's mean and precision depend on iq
## - partitioning is done by dyslexia and the noise variables x1, x2, x3
## only dyslexia is correctly selected for splitting
bt <- betatree(accuracy ~ iq | iq, ~ dyslexia + x1 + x2 + x3,
data = ReadingSkills, minsize = 10)
plot(bt)
## inspect result
coef(bt)
if(require("strucchange")) sctest(bt)
## IGNORE_RDIFF_BEGIN
summary(bt, node = 2)
summary(bt, node = 3)
## IGNORE_RDIFF_END
## add a numerical variable with relevant information for splitting
ReadingSkills$x4 <- rnorm(nrow(ReadingSkills), c(-1.5, 1.5)[ReadingSkills$dyslexia])
bt2 <- betatree(accuracy ~ iq | iq, ~ x1 + x2 + x3 + x4,
data = ReadingSkills, minsize = 10)
plot(bt2)
## inspect result
coef(bt2)
if(require("strucchange")) sctest(bt2)
## IGNORE_RDIFF_BEGIN
summary(bt2, node = 2)
summary(bt2, node = 3)
## IGNORE_RDIFF_END