entropy.based {FSelector}    R Documentation

Entropy-based filters

Description

The algorithms find weights of discrete attributes based on their correlation with the continuous class attribute.

Usage

information.gain(formula, data, unit)
gain.ratio(formula, data, unit)
symmetrical.uncertainty(formula, data, unit)

Arguments

formula

A symbolic description of a model.

data

Data to process.

unit

Unit for computing entropy (passed to entropy). Default is "log".

Details

information.gain is

H(Class) + H(Attribute) - H(Class, Attribute).

gain.ratio is

\frac{H(Class) + H(Attribute) - H(Class, Attribute)}{H(Attribute)}

symmetrical.uncertainty is

2\frac{H(Class) + H(Attribute) - H(Class, Attribute)}{H(Attribute) + H(Class)}

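To make these formulas concrete, the following is a minimal base-R sketch (not part of FSelector) that computes all three measures by hand for a single attribute of iris. The equal-width discretization via cut() is only an illustrative assumption; the package discretizes continuous attributes itself, so these numbers need not match the output of information.gain() exactly.

  ## illustrative sketch: entropy-based measures for one discretized attribute
  data(iris)
  attr_disc <- cut(iris$Petal.Length, breaks = 3)   # hypothetical discretization
  class_var <- iris$Species

  ## empirical entropy of a discrete variable (natural log, i.e. unit = "log")
  H <- function(x) {
    p <- table(x) / length(x)
    p <- p[p > 0]
    -sum(p * log(p))
  }

  ## joint entropy H(Class, Attribute) from the joint distribution
  H_joint <- H(interaction(class_var, attr_disc, drop = TRUE))

  ig <- H(class_var) + H(attr_disc) - H_joint          # information.gain
  gr <- ig / H(attr_disc)                              # gain.ratio
  su <- 2 * ig / (H(attr_disc) + H(class_var))         # symmetrical.uncertainty
  c(information.gain = ig, gain.ratio = gr, symmetrical.uncertainty = su)
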
Value

A data.frame containing the worth of attributes in the first column and their names as row names.
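
A short illustrative sketch of working with this data.frame, relying only on the structure described above (worth in the first column, attribute names as row names):

  weights <- information.gain(Species ~ ., iris)
  weights["Petal.Width", 1]                   # worth of a single attribute
  rownames(weights)[order(-weights[, 1])]     # attributes ranked by worth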

Author(s)

Piotr Romanski, Lars Kotthoff

Examples

  data(iris)

  weights <- information.gain(Species~., iris)
  print(weights)
  subset <- cutoff.k(weights, 2)
  f <- as.simple.formula(subset, "Species")
  print(f)

  weights <- information.gain(Species~., iris, unit = "log2")
  print(weights)

  weights <- gain.ratio(Species~., iris)
  print(weights)
  subset <- cutoff.k(weights, 2)
  f <- as.simple.formula(subset, "Species")
  print(f)

  weights <- symmetrical.uncertainty(Species~., iris)
  print(weights)
  subset <- cutoff.biggest.diff(weights)
  f <- as.simple.formula(subset, "Species")
  print(f)

