mse {cv}    R Documentation
Cost Functions for Fitted Regression Models
Description
Compute cost functions (cross-validation criteria) for fitted regression models.
Usage
mse(y, yhat)
rmse(y, yhat)
medAbsErr(y, yhat)
BayesRule(y, yhat)
BayesRule2(y, yhat)
Arguments
y: the response.

yhat: the fitted values.
Details
Cost functions (cross-validation criteria) are meant to measure lack-of-fit. Several cost functions are provided; a sketch of the corresponding computations follows the list:

- mse() returns the mean-squared error of prediction for a numeric response variable y and predictions yhat; rmse() returns the root-mean-squared error, which is just the square root of mse().

- medAbsErr() returns the median absolute error of prediction for a numeric response y and predictions yhat.

- BayesRule() and BayesRule2() report the proportion of incorrect predictions for a dichotomous response variable y, assumed coded (or coercible to) 0 and 1. The yhat values are predicted probabilities and are rounded to 0 or 1. The distinction between BayesRule() and BayesRule2() is that the former checks that the y values are all either 0 or 1 and that the yhat values are all between 0 and 1, while the latter doesn't and is therefore faster.
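As a rough illustrative sketch (not the package's implementation, which also performs argument checking), these criteria correspond to the following base-R computations for numeric vectors y and yhat of equal length:

mean((y - yhat)^2)        # mse(y, yhat)
sqrt(mean((y - yhat)^2))  # rmse(y, yhat)
median(abs(y - yhat))     # medAbsErr(y, yhat)
mean(round(yhat) != y)    # BayesRule(y, yhat), with y coded 0/1 and
                          #   yhat a predicted probability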
Value
In general, cost functions should return a single numeric value measuring lack-of-fit. mse() returns the mean-squared error; rmse() returns the root-mean-squared error; medAbsErr() returns the median absolute error; and BayesRule() and BayesRule2() return the proportion of misclassified cases.
Functions
- mse(): Mean-square error.

- rmse(): Root-mean-square error.

- medAbsErr(): Median absolute error.

- BayesRule(): Bayes rule for a binary response.

- BayesRule2(): Bayes rule for a binary response (without bounds checking).
Examples
data("Duncan", package="carData")
m.lm <- lm(prestige ~ income + education, data=Duncan)
mse(Duncan$prestige, fitted(m.lm))
data("Mroz", package="carData")
m.glm <- glm(lfp ~ ., data=Mroz, family=binomial)
BayesRule(Mroz$lfp == "yes", fitted(m.glm))
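
# A hedged extension of the example above: rmse() and medAbsErr() take the
# same (y, yhat) arguments as mse(), so they can be applied to the same
# fitted linear model.
rmse(Duncan$prestige, fitted(m.lm))       # square root of mse()
medAbsErr(Duncan$prestige, fitted(m.lm))  # less sensitive to large residuals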