mlr_learners_regr.kknn {mlr3learners}    R Documentation

k-Nearest-Neighbor Regression Learner

Description

k-Nearest-Neighbor regression. Calls kknn::kknn() from package kknn.

Initial parameter values

Dictionary

This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("regr.kknn")
lrn("regr.kknn")

Meta Information

Parameters

Id           Type       Default  Levels                                                       Range
k            integer    7                                                                     [1, ∞)
distance     numeric    2                                                                     [0, ∞)
kernel       character  optimal  rectangular, triangular, epanechnikov, biweight, triweight,  -
                                 cos, inv, gaussian, rank, optimal
scale        logical    TRUE     TRUE, FALSE                                                  -
ykernel      untyped    NULL                                                                  -
store_model  logical    FALSE    TRUE, FALSE                                                  -
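
These hyperparameters can be set at construction time via lrn() or changed later through the learner's $param_set. A minimal sketch (the specific values are illustrative only):

library(mlr3)
library(mlr3learners)

# Set hyperparameters at construction time (values chosen for illustration)
learner = lrn("regr.kknn", k = 5, distance = 2, kernel = "gaussian")

# ... or update them afterwards via the parameter set
learner$param_set$values$k = 10
learner$param_set$values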

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrKKNN

Methods

Public methods

Inherited methods

Method new()

Creates a new instance of this R6 class.

Usage
LearnerRegrKKNN$new()
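
Constructing the learner through the R6 generator is equivalent to the sugar function shown above; a brief sketch:

library(mlr3learners)
learner = LearnerRegrKKNN$new()   # same learner as lrn("regr.kknn")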

Method clone()

The objects of this class are cloneable with this method.

Usage
LearnerRegrKKNN$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

Note

There is no separate training step for k-NN models: the training data is simply stored and then processed during the predict step. Therefore, $model returns a list holding the stored training information (the formula, data and parameters passed on to kknn::kknn()); the fitted kknn::kknn() object itself is only kept when the hyperparameter store_model is set to TRUE, and only after $predict() has been called.
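
A minimal sketch illustrating this behaviour, assuming (as described above) that the fitted kknn object only appears in $model after $predict() has been called with store_model = TRUE:

library(mlr3)
library(mlr3learners)

learner = lrn("regr.kknn", store_model = TRUE)
task = tsk("mtcars")

learner$train(task)
str(learner$model, max.level = 1)   # stored training information only

learner$predict(task)
str(learner$model, max.level = 1)   # should now also contain the kknn fit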

References

Hechenbichler, Klaus, Schliep, Klaus (2004). “Weighted k-nearest-neighbor techniques and ordinal classification.” Technical Report Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. doi:10.5282/ubm/epub.1769.

Samworth, Richard J (2012). “Optimal weighted nearest neighbour classifiers.” The Annals of Statistics, 40(5), 2733–2763. doi:10.1214/12-AOS1049.

Cover, Thomas, Hart, Peter (1967). “Nearest neighbor pattern classification.” IEEE Transactions on Information Theory, 13(1), 21–27. doi:10.1109/TIT.1967.1053964.

See Also

Other Learner: mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.multinom, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost

Examples

if (requireNamespace("kknn", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("regr.kknn")
print(learner)

# Define a Task
task = tsk("mtcars")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# print the model
print(learner$model)

# importance method
if("importance" %in% learner$properties) print(learner$importance)

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
}
