calcNCE {logicDT}    R Documentation
Calculate the normalized cross entropy
Description
This function computes the normalized cross entropy (NCE), which is given by

\mathrm{NCE} = \frac{\frac{1}{N} \sum_{i=1}^{N} \left[ y_i \cdot \log(p_i) + (1-y_i) \cdot \log(1-p_i) \right]}{p \cdot \log(p) + (1-p) \cdot \log(1-p)}

where, for i \in \lbrace 1,\ldots,N \rbrace, y_i \in \lbrace 0,1 \rbrace are the true classes, p_i are the risk/probability predictions, and p = \frac{1}{N} \sum_{i=1}^{N} y_i is the total unrestricted empirical risk estimate.
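As a rough illustration (not the package's internal code), the formula can be reproduced directly in base R; the helper name nce_manual below is hypothetical:

# Sketch of the NCE formula in base R (hypothetical helper, not the package code)
nce_manual <- function(preds, y) {
  p_bar <- mean(y)                                              # unrestricted empirical risk estimate
  num <- mean(y * log(preds) + (1 - y) * log(1 - preds))        # model log-likelihood term
  denom <- p_bar * log(p_bar) + (1 - p_bar) * log(1 - p_bar)    # mean-model log-likelihood term
  num / denom
}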
Usage
calcNCE(preds, y)
Arguments
preds
    Numeric vector of risk estimates
y
    Vector of true binary outcomes
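A minimal usage sketch with simulated data (the outcome and prediction model below are made up purely for illustration):

library(logicDT)
set.seed(1)
y <- rbinom(100, 1, 0.3)                        # true binary outcomes
preds <- ifelse(y == 1, runif(100, 0.5, 0.95),  # hypothetical risk estimates that
                runif(100, 0.05, 0.5))          # roughly track the true classes
calcNCE(preds, y)                               # informative predictions typically yield an NCE below 1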
Details
Smaller values towards zero are generally preferred. An NCE of 1 or above indicates that the model yields predictions comparable to or worse than the naive mean model.
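Continuing the sketch above, predicting the constant mean(y) for every observation makes the numerator equal to the denominator, so the naive mean model attains an NCE of exactly 1:

calcNCE(rep(mean(y), length(y)), y)   # naive mean model: by the formula above this equals 1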
Value
The normalized cross entropy
References
He, X., Pan, J., Jin, O., Xu, T., Liu, B., Xu, T., Shi, Y., Atallah, A., Herbrich, R., Bowers, S., Candela, J. Q. (2014). Practical Lessons from Predicting Clicks on Ads at Facebook. Proceedings of the Eighth International Workshop on Data Mining for Online Advertising 1-9. doi: 10.1145/2648584.2648589