Noisy Time Series Prediction using Symbolic Representation and
Recurrent Neural Network Grammatical Inference
Date
1998-10-15
Authors
Lawrence, Steve
Tsoi, Ah Chung
Giles, C. Lee
Abstract
Financial forecasting is an example of a signal processing problem
which is challenging due to small sample sizes, high noise,
non-stationarity, and non-linearity. Neural networks have been very
successful in a number of signal processing applications. We discuss
fundamental limitations and inherent difficulties that arise when neural
networks are used to process high-noise, small-sample-size signals.
We introduce a new intelligent signal processing method that
addresses these difficulties. The method converts the series into a symbolic
representation with a self-organizing map and performs grammatical inference
with recurrent neural networks. We apply the method to the prediction
of daily foreign exchange rates, addressing difficulties with
non-stationarity, overfitting, and unequal a priori class
probabilities, and we find significant predictability in comprehensive
experiments covering 5 different foreign exchange rates. The method
correctly predicts the direction of change for the next day with an
error rate of 47.1%. The error rate drops to around 40% when
examples on which the system has low confidence in its
prediction are rejected. The symbolic representation aids the extraction of
symbolic knowledge from the recurrent neural networks in the form of
deterministic finite state automata. These automata explain the
operation of the system and are often relatively simple. Rules related
to well-known behavior such as trend following and mean reversion are
extracted.
Also cross-referenced as UMIACS-TR-96-27
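As a rough illustration of the first stage of the pipeline described in the abstract, the Python sketch below quantizes a series of daily returns into a small symbol alphabet with a one-dimensional self-organizing map. The abstract does not specify the preprocessing, SOM size, or training schedule, so every parameter choice, function name, and the synthetic price series here is an assumption made for illustration only; the resulting symbol sequence is the kind of input a recurrent network would then be trained on for grammatical inference.

import numpy as np

def train_som_1d(data, n_units=3, n_iter=2000, seed=0):
    # Train a tiny one-dimensional self-organizing map on scalar inputs.
    # The sorted codebook partitions the input range into n_units regions,
    # one symbol per region.
    rng = np.random.default_rng(seed)
    weights = rng.choice(data, size=n_units).astype(float)   # init from random samples
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        lr = 0.5 * (1.0 - t / n_iter)                        # decaying learning rate
        sigma = max(1e-3, (n_units / 2.0) * (1.0 - t / n_iter))  # shrinking neighbourhood
        bmu = int(np.argmin(np.abs(weights - x)))            # best matching unit
        dist = np.abs(np.arange(n_units) - bmu)              # grid distance to the BMU
        h = np.exp(-dist ** 2 / (2.0 * sigma ** 2))          # neighbourhood function
        weights += lr * h * (x - weights)                    # pull units toward the sample
    return np.sort(weights)

def to_symbols(data, codebook):
    # Map each value to the index of its nearest codebook vector (its symbol).
    return np.argmin(np.abs(data[:, None] - codebook[None, :]), axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic stand-in for a daily exchange-rate series (illustration only).
    prices = 1.0 + np.cumsum(rng.normal(0.0, 0.005, size=500))
    returns = np.diff(prices) / prices[:-1]
    codebook = train_som_1d(returns, n_units=3)
    symbols = to_symbols(returns, codebook)   # e.g. 0 = down, 1 = flat, 2 = up
    print("codebook:", codebook)
    print("first 20 symbols:", symbols[:20])

In the full method, the symbol sequence would be fed to a recurrent network trained to predict the next symbol (the direction of change), from which a deterministic finite state automaton can later be extracted; those stages are not shown in this sketch.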