evaluate {stream}
Evaluate a Data Stream Mining Task
Description
Generic function to calculate evaluation measures for a data stream mining task DST on a data stream DSD object.
Usage
evaluate_static(object, dsd, measure, n, ...)
evaluate_stream(object, dsd, measure, n, horizon, ..., verbose = FALSE)
Arguments
object: The DST object that the evaluation measure is being requested from.

dsd: The DSD object used to create the test data.

measure: Evaluation measure(s) to use. If missing, all available measures are returned.

n: The number of data points used for the evaluation.

...: Further arguments are passed on to the specific implementation (e.g., see evaluate.DSC).

horizon: Evaluation is done using horizon many previous points (see the Details section).

verbose: Report progress?
Details
We define two generic evaluation functions (both are illustrated in the sketch below):

- evaluate_static() evaluates the current DST model on new data without updating the model.

- evaluate_stream() evaluates the DST model using prequential error estimation (see Gama, Sebastiao and Rodrigues, 2013). The data points in the horizon are first used to calculate the evaluation measure and then they are used to update the model. A horizon of 1 means that each point is evaluated and then used to update the model.

The available evaluation measures depend on the task. Tasks that can currently be evaluated:

- DSC via evaluate.DSC
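The following minimal sketch (not part of the original manual page) illustrates both functions for a clustering task; it assumes the DSD_Gaussians and DSC_DBSTREAM constructors and the measures shown are available in the installed version of stream.

library(stream)

## a stream of 3 Gaussian clusters in 2 dimensions (assumed setup)
dsd <- DSD_Gaussians(k = 3, d = 2)

## cluster the stream with DBSTREAM and train on 500 points
dsc <- DSC_DBSTREAM(r = 0.1)
update(dsc, dsd, n = 500)

## static evaluation: score the current model on 100 new points
## without updating it
evaluate_static(dsc, dsd, measure = c("numMicroClusters", "purity"), n = 100)

## prequential evaluation: evaluate each horizon of 100 points first,
## then use the same points to update the model
evaluate_stream(dsc, dsd, measure = c("numMicroClusters", "purity"),
  n = 1000, horizon = 100)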
Value
evaluate_static() and evaluate_stream() return an object of class stream_eval, a numeric vector of the values of the requested measures.
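For example, continuing the assumed clustering setup above and assuming the returned vector is named by measure (as for evaluate.DSC), individual measures can be extracted by name:

## assumed: result is a named numeric vector of the requested measures
res <- evaluate_static(dsc, dsd, measure = c("purity", "crand"), n = 100)
res["purity"]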
Author(s)
Michael Hahsler
References
Joao Gama, Raquel Sebastiao, Pedro Pereira Rodrigues (2013). On evaluating stream learning algorithms. Machine Learning, 90(3), 317-346.
See Also
Other DST: DSAggregate(), DSC(), DSClassifier(), DSOutlier(), DSRegressor(), DST(), DST_SlidingWindow(), DST_WriteStream(), predict(), stream_pipeline, update()

Other evaluation: animate_cluster(), evaluate.DSC