mse {cv}    R Documentation

Cost Functions for Fitted Regression Models

Description

Compute cost functions (cross-validation criteria) for fitted regression models.

Usage

mse(y, yhat)

rmse(y, yhat)

medAbsErr(y, yhat)

BayesRule(y, yhat)

BayesRule2(y, yhat)

Arguments

y

the response variable: numeric for mse(), rmse(), and medAbsErr(); dichotomous, coded (or coercible to) 0 and 1, for BayesRule() and BayesRule2().

yhat

the corresponding fitted or predicted values; for BayesRule() and BayesRule2(), predicted probabilities.

Details

Cost functions (cross-validation criteria) are meant to measure lack-of-fit. Several cost functions are provided:

  1. mse() returns the mean-squared error of prediction for a numeric response variable y and predictions yhat; rmse() returns the root-mean-squared error, which is simply the square root of mse().

  2. medAbsErr() returns the median absolute error of prediction for a numeric response y and predictions yhat.

  3. BayesRule() and BayesRule2() report the proportion of incorrect predictions for a dichotomous response variable y, assumed coded (or coercible to) 0 and 1. The yhat values are predicted probabilities and are rounded to 0 or 1. The distinction between BayesRule() and BayesRule2() is that the former checks that the y values are all either 0 or 1 and that the yhat values all lie between 0 and 1, while the latter skips these checks and is therefore faster.
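The descriptions above can be sketched as simple R functions. These are illustrative reimplementations, not the cv package's own code, which may differ in detail (for example, in how BayesRule() performs its checks):

```r
# Illustrative versions of the cost functions described above.
mse_ <- function(y, yhat) mean((y - yhat)^2)             # mean-squared error
rmse_ <- function(y, yhat) sqrt(mse_(y, yhat))           # square root of mse()
medAbsErr_ <- function(y, yhat) median(abs(y - yhat))    # median absolute error
BayesRule2_ <- function(y, yhat) mean(y != round(yhat))  # no sanity checks

# numeric response
y <- c(1, 2, 3)
yhat <- c(1.5, 2, 2)
mse_(y, yhat)        # (0.25 + 0 + 1)/3

# dichotomous response with predicted probabilities
yb <- c(0, 1, 1, 0)
p  <- c(0.2, 0.8, 0.4, 0.6)
BayesRule2_(yb, p)   # 0.5: two of four cases misclassified
```

Note that round() thresholds the probabilities at 0.5, so a yhat above 0.5 predicts class 1 and one below 0.5 predicts class 0.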

Value

In general, cost functions should return a single numeric value measuring lack-of-fit. mse() returns the mean-squared error; rmse() returns the root-mean-squared error; medAbsErr() returns the median absolute error; and BayesRule() and BayesRule2() return the proportion of misclassified cases.
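Because a cost function only needs to accept y and yhat and return a single numeric value, users can supply their own. As a hypothetical example (meanAbsErr is not a function provided by the package), mean absolute error follows the same contract:

```r
# A user-defined cost function: mean absolute error of prediction.
# Takes the response y and predictions yhat; returns one numeric value.
meanAbsErr <- function(y, yhat) mean(abs(y - yhat))

meanAbsErr(c(1, 2, 3), c(1.5, 2, 2))  # (0.5 + 0 + 1)/3 = 0.5
```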

See Also

cv, cv.merMod, cv.function.

Examples

# mean-squared error for a linear model fit to the Duncan data
data("Duncan", package="carData")
m.lm <- lm(prestige ~ income + education, data=Duncan)
mse(Duncan$prestige, fitted(m.lm))

# misclassification rate for a logistic regression fit to the Mroz data
data("Mroz", package="carData")
m.glm <- glm(lfp ~ ., data=Mroz, family=binomial)
BayesRule(Mroz$lfp == "yes", fitted(m.glm))

[Package cv version 2.0.0 Index]