artmap {RSNNS}	R Documentation
Create and train an artmap network
Description
An ARTMAP performs supervised learning. It consists of two coupled ART networks.
In theory, these could be ART1, ART2, or others; however, in SNNS, ARTMAP is
implemented for ART1 only, so this function is to be used with binary input.
As explained in the description of art1, ART aims at solving the stability/plasticity
dilemma. The advantage of ARTMAP is thus that it is a supervised learning mechanism
that guarantees stability.
Usage
artmap(x, ...)
## Default S3 method:
artmap(
  x,
  nInputsTrain,
  nInputsTargets,
  nUnitsRecLayerTrain,
  nUnitsRecLayerTargets,
  maxit = 1,
  nRowInputsTrain = 1,
  nRowInputsTargets = 1,
  nRowUnitsRecLayerTrain = 1,
  nRowUnitsRecLayerTargets = 1,
  initFunc = "ARTMAP_Weights",
  initFuncParams = c(1, 1, 1, 1, 0),
  learnFunc = "ARTMAP",
  learnFuncParams = c(0.8, 1, 1, 0, 0),
  updateFunc = "ARTMAP_Stable",
  updateFuncParams = c(0.8, 1, 1, 0, 0),
  shufflePatterns = TRUE,
  ...
)
Arguments
x: a matrix with training inputs and targets for the network
...: additional function parameters (currently not used)
nInputsTrain: the number of columns of the matrix that are training inputs
nInputsTargets: the number of columns that are target values
nUnitsRecLayerTrain: number of units in the recognition layer of the training data ART network
nUnitsRecLayerTargets: number of units in the recognition layer of the target data ART network
maxit: maximum number of iterations to perform
nRowInputsTrain: number of rows the training input units are to be organized in (only for visualization purposes of the net in the original SNNS software)
nRowInputsTargets: same, but for the target value input units
nRowUnitsRecLayerTrain: same, but for the recognition layer of the training data ART network
nRowUnitsRecLayerTargets: same, but for the recognition layer of the target data ART network
initFunc: the initialization function to use
initFuncParams: the parameters for the initialization function
learnFunc: the learning function to use
learnFuncParams: the parameters for the learning function
updateFunc: the update function to use
updateFuncParams: the parameters for the update function
shufflePatterns: should the patterns be shuffled?
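The layout of x is easiest to see with a small sketch. The following example uses made-up toy data (not shipped with the package) and arbitrary parameter values: binary input features and one-hot target vectors are combined column-wise, and nInputsTrain/nInputsTargets tell artmap where to split the columns.

library(RSNNS)

# illustrative toy data: 20 patterns, 8 binary input features, 3 possible classes
inputs  <- matrix(sample(0:1, 20 * 8, replace = TRUE), nrow = 20)  # binary inputs
targets <- diag(3)[sample(1:3, 20, replace = TRUE), ]              # one-hot targets

# inputs and targets form one pattern matrix; artmap splits it by column counts
x <- cbind(inputs, targets)
model <- artmap(x, nInputsTrain = 8, nInputsTargets = 3,
                nUnitsRecLayerTrain = 10, nUnitsRecLayerTargets = 3)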
Details
See also the details section of art1. The two ART1 networks are connected by a map field.
The input of the first ART1 network is the training input; the input of the second network consists of the target values,
i.e., the teacher signals. The two networks are often called ARTa and ARTb; here, we call them the training data network
and the target data network.
In analogy to the ART1 and ART2 implementations, one initialization function, one learning function, and two update functions are available that are suitable for ARTMAP. The parameters are essentially the same as for ART1, but for two networks. The learning function and the update functions have three parameters: the vigilance parameters of the two ART1 networks and an additional vigilance parameter for inter ART reset control. The initialization function has four parameters, two for each ART1 network.
A detailed description of the theory and the parameters is available in the SNNS documentation and the other referenced literature.
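As a sketch of how these parameter vectors are passed, the call below raises the vigilance of the training data network from the default 0.8 to 0.95. It assumes that the first three entries of learnFuncParams and updateFuncParams are the vigilance of the training data network, the vigilance of the target data network, and the inter ART reset vigilance, in that order (as suggested by the defaults in the Usage section); this ordering should be verified against the SNNS documentation.

library(RSNNS)
data(snnsData)
trainData <- snnsData$artmap_train.pat

# assumed parameter ordering: c(rho_train, rho_target, rho_interART, ...)
model <- artmap(trainData, nInputsTrain = 70, nInputsTargets = 5,
                nUnitsRecLayerTrain = 50, nUnitsRecLayerTargets = 26,
                learnFuncParams  = c(0.95, 1, 1, 0, 0),
                updateFuncParams = c(0.95, 1, 1, 0, 0))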
Value
an rsnns object. The fitted.values member of the object contains a list of two-dimensional activation patterns.
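For example, the returned object can be inspected with standard R tools (this sketch assumes a model fitted as in the Examples section below):

class(model)                              # the object inherits from class "rsnns"
str(model$fitted.values, max.level = 1)   # overview of the stored activation patterns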
References
Carpenter, G. A.; Grossberg, S. & Reynolds, J. H. (1991), 'ARTMAP: Supervised real-time learning and classification of nonstationary data by a self-organizing neural network', Neural Networks 4(5), 565–588.
Grossberg, S. (1988), Adaptive pattern classification and universal recoding. I.: parallel development and coding of neural feature detectors, MIT Press, Cambridge, MA, USA, chapter I, pp. 243–258.
Herrmann, K.-U. (1992), 'ART – Adaptive Resonance Theory – Architekturen, Implementierung und Anwendung', Master's thesis, IPVR, University of Stuttgart. (in German)
Zell, A. et al. (1998), 'SNNS Stuttgart Neural Network Simulator User Manual, Version 4.2', IPVR, University of Stuttgart and WSI, University of Tübingen. https://www.ra.cs.uni-tuebingen.de/SNNS/welcome.html
Zell, A. (1994), Simulation Neuronaler Netze, Addison-Wesley. (in German)
See Also
art1, art2
Examples
## Not run: demo(artmap_letters)
## Not run: demo(artmap_lettersSnnsR)

# load the example patterns shipped with RSNNS
data(snnsData)
trainData <- snnsData$artmap_train.pat
testData <- snnsData$artmap_test.pat

# the first 70 columns of the patterns are training inputs, the last 5 are targets
model <- artmap(trainData, nInputsTrain=70, nInputsTargets=5,
                nUnitsRecLayerTrain=50, nUnitsRecLayerTargets=26)

model$fitted.values
predict(model, testData)
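As a follow-up sketch (not part of the original example), the activations returned by predict can be reduced to class indices with base R. This assumes predict returns one column per target unit, with the winning unit carrying the highest activation per row.

pred <- predict(model, testData)
predictedClass <- max.col(pred)   # index of the most active target unit per pattern
table(predictedClass)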