Recurrent Self-Organizing Map

Researchers: Jukka Heikkonen, Timo Koskela, Markus Varsta, Jouko Lampinen and
José Millan from the Joint Research Centre of the European Commission Ispra site.


The Self-Organizing Map (SOM) by Teuvo Kohonen is one of the best known neural models with unsupervised learning. The SOM is a vector quantization algorithm that tries to preserve the topological relationships between the input vectors: input vectors that are close to each other in the input space tend to be represented by nodes that are close to each other on the map.
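As an illustration of the topology-preserving quantization described above, the following is a minimal sketch of SOM training (parameter names, map size, and the 1-D lattice are our choices for illustration, not taken from the work described here): each input pulls the best matching unit and its lattice neighbors toward it, weighted by a Gaussian neighborhood.

```python
import numpy as np

# Minimal SOM training sketch: a 1-D map of `n_units` codebook vectors is
# adapted to random 2-D inputs. Hyperparameters are illustrative assumptions.
rng = np.random.default_rng(0)
n_units, dim = 10, 2
weights = rng.random((n_units, dim))   # codebook (weight) vectors
grid = np.arange(n_units)              # unit positions on the map lattice

def som_step(x, weights, lr=0.1, sigma=1.0):
    # best matching unit: nearest codebook vector in the input space
    bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    # Gaussian neighborhood function on the map lattice
    h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
    # move every unit toward the input, weighted by its neighborhood value
    return weights + lr * h[:, None] * (x - weights), bmu

for _ in range(500):
    x = rng.random(dim)
    weights, _ = som_step(x, weights)
```

After training, nearby units on the lattice tend to hold nearby weight vectors, which is the topology preservation property mentioned above.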

While the SOM has found many uses, especially in visualization and dimension reduction, modifications of the SOM for processing temporally related data have been rare. In this work a simple extension of the SOM algorithm for processing temporal sequences was developed and tested. The developed algorithm is a modification of the Temporal Kohonen Map (TKM). In the TKM the contribution of earlier input vectors to each node's activity is defined by a recursive difference equation that expresses the current node activity as a function of the previous node activations and the new input. In the proposed Recurrent SOM (RSOM) algorithm the scalar node activities of the TKM are replaced by difference vectors, defined by a recursive difference equation involving the new input shown to the map, the previous difference vectors, and the weight vectors of the units. Unlike in the TKM, which relies on the normal SOM learning rule, the difference vectors are used both for finding the best matching unit and for updating the weights when training the map.
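The recursion described above can be sketched as follows. This is an illustrative reading of the RSOM idea, not the authors' implementation: each unit keeps a leaky-integrated difference vector y_i(t) = (1 - alpha) * y_i(t-1) + alpha * (x(t) - w_i), the best matching unit minimizes ||y_i(t)||, and the same difference vectors drive the weight update (the leak rate `alpha`, learning rate, and map size are assumed values).

```python
import numpy as np

# Sketch of an RSOM training step under the assumptions stated above.
rng = np.random.default_rng(0)
n_units, dim, alpha = 9, 2, 0.5
weights = rng.random((n_units, dim))
y = np.zeros((n_units, dim))           # recurrent difference vectors
grid = np.arange(n_units)              # 1-D map lattice (illustrative)

def rsom_step(x, weights, y, lr=0.05, sigma=1.0):
    # recursive difference equation: new input mixed with past activity
    y = (1 - alpha) * y + alpha * (x - weights)
    # best matching unit minimizes the norm of its difference vector
    bmu = int(np.argmin(np.linalg.norm(y, axis=1)))
    h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
    # unlike the TKM, the weight update also uses the difference vectors
    weights = weights + lr * h[:, None] * y
    return weights, y, bmu

for _ in range(1000):
    x = rng.random(dim)
    weights, y, _ = rsom_step(x, weights, y)
```

With alpha = 1 the recursion forgets the past entirely and the step reduces to the ordinary SOM update; smaller alpha values make the best-match decision depend on longer input histories.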

In addition to being tested in a synthetic environment (see Figure 10), the RSOM algorithm has been evaluated in a comparative study of neural time series prediction algorithms. Depending on the data, the RSOM with suitable feedback was able to slightly outperform the normal SOM, but the results are not yet conclusive.


  
Figure 10: Example of the RSOM: the input consisted of a sequence of two-dimensional vectors [x1 x2]^T, where the values of x1 and x2 were drawn from the set {1,3,5}. When trained with a random sequence of such vectors, the map learned to distinguish all 81 possible ordered pairs of states. In the figure the units are labeled so that the first digit indicates the first vector of the pair and the second digit the latter vector. The units are drawn at the locations of their weight vectors.

