Next: The Adaptive Brain Interfaces Up: Computational Information Processing Previous: Recurrent Self-Organizing Map

Modeling and Prediction of Temporal Processes

Researchers: Timo Koskela, Markus Varsta, Jukka Heikkonen, and Kimmo Kaski

Temporal sequence processing (TSP) is a research area with applications in diverse fields ranging from weather forecasting and time series prediction to speech recognition and remote sensing. The general goal is to construct a model that can predict the future of the measured process of interest. Modeling techniques can be divided, for example, into those using linear models (e.g. autoregressive (AR) models) and those using nonlinear models (e.g. neural networks). A distinction can also be made between global and local models. In the global modeling approach, a single model is used to characterize the measured data. The local modeling approach is based on dividing the data set into smaller subsets, each of which is modeled with a simple local model. The local data sets are usually created by some clustering or quantization algorithm, such as k-means or the Self-Organizing Map (SOM).

In time series prediction, the input to the model is usually formed with a windowing technique that splits the series into input vectors, while the output of the model corresponds to the desired prediction. Typically the input vectors contain past samples of the series up to a certain length. In this procedure the temporal context between consecutive vectors is lost. One way to avoid this is to include in the model a memory that can store the contextual information that exists between consecutive input vectors.
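The windowing procedure described above can be sketched as follows; the function name and parameters are illustrative, not from the original work:

```python
import numpy as np

def make_windows(series, window, horizon=1):
    """Split a 1-D series into input vectors of past samples
    and the corresponding prediction targets."""
    X, y = [], []
    for t in range(len(series) - window - horizon + 1):
        X.append(series[t:t + window])               # past samples
        y.append(series[t + window + horizon - 1])   # value to predict
    return np.array(X), np.array(y)

series = np.sin(np.linspace(0, 20, 200))
X, y = make_windows(series, window=5)
print(X.shape, y.shape)  # (195, 5) (195,)
```

Note that consecutive rows of X overlap heavily, yet the model sees each row in isolation; this is exactly the loss of temporal context that a memory mechanism is meant to recover.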

In our approach, the Recurrent Self-Organizing Map (RSOM) is used to store temporal context from the input vectors. Figure 11 shows a schematic picture of an RSOM unit, which acts as a recurrent filter and stores context in the form of its difference vector y(n). RSOM is in effect used for "temporal quantization", i.e. clustering of temporal sequences of input vectors. In prediction tasks, RSOM clusters the input vectors into local data sets, each corresponding to a certain map unit. Local models, each associated with a certain unit of the map, are then estimated from the obtained data sets. Figure 12 shows the procedure for building the model and evaluating its prediction ability with testing data [22]. The time series is divided into training and testing data, and cross-validation is used for model selection. The best model according to cross-validation is trained again with the whole training data and then used to predict the test data set, which has not been presented to the model before.

Figure 11:   RSOM unit, which acts as a recurrent filter.
Figure 12:   Construction of the local models.

Convergence properties of the RSOM learning algorithm were considered in [69]. It was shown that the Temporal Kohonen Map (TKM) algorithm does not converge to a solution while RSOM does. In [21] RSOM was applied to a synthetic temporal sequence classification case and to EEG-based detection of epileptic activity. In [22,20] RSOM with local linear models was used in four different time series prediction cases. The results in these cases suggest that RSOM can be used efficiently in temporal sequence processing.

Figure 13 shows one of the time series cases, a laser time series consisting of measurements of the intensity of an infrared laser. This series is a good example of a highly nonlinear, stationary and chaotic series. RSOM with local linear models, a multilayer perceptron (MLP) and autoregressive (AR) models were used for one-step prediction. Figure 14 shows the predictions made with the RSOM and MLP models compared with the actual series. The RSOM model with local linear models gives good prediction accuracy, whereas a global linear model (e.g. AR) cannot be used for this task. Due to the stationarity of the series, the MLP model also reaches very good prediction accuracy.

Figure 13:   Laser time series.
Figure 14:   Predictions with the RSOM and MLP models.
