Fuzzy Finite-state Automata Can Be Deterministically Encoded into Recurrent Neural Networks
Date
1998-10-15
Authors
Omlin, Christian W.
Thornber, Karvel K.
Giles, C. Lee
Abstract
There has been increased interest in combining
fuzzy systems with neural networks, because fuzzy neural
systems merge the advantages of both paradigms.
On the one hand, parameters in fuzzy systems have clear physical meanings
and rule-based and linguistic information can be incorporated into
adaptive fuzzy systems in a systematic way. On the other hand, there
exist powerful algorithms for training various neural network
models. However, most of the proposed combined architectures
can only process static input-output relationships; they cannot
handle temporal input sequences of arbitrary length.
Fuzzy finite-state automata (FFAs) can model dynamical processes
whose current state depends on the current input and previous states.
Unlike deterministic finite-state automata (DFAs), FFAs are not
in one particular state at a time; rather, each state
is occupied to some degree defined by a membership function.
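To make the state semantics concrete, the following is a minimal sketch of one FFA transition step. The max-min composition rule, the names, and the toy transition table are illustrative assumptions; the paper's formal definition may differ.

    from typing import Dict, Tuple

    State = str
    Symbol = str
    # delta[(state, symbol)] -> {next_state: transition membership in [0, 1]}
    Delta = Dict[Tuple[State, Symbol], Dict[State, float]]

    def ffa_step(memberships: Dict[State, float], symbol: Symbol,
                 delta: Delta) -> Dict[State, float]:
        """Advance one input symbol: the new degree of each state is the max
        over predecessors of min(source degree, transition degree)."""
        new: Dict[State, float] = {}
        for state, degree in memberships.items():
            for nxt, weight in delta.get((state, symbol), {}).items():
                new[nxt] = max(new.get(nxt, 0.0), min(degree, weight))
        return new

    # Toy example: start fully in q0 and read "aa".
    delta: Delta = {("q0", "a"): {"q0": 0.3, "q1": 0.9},
                    ("q1", "a"): {"q1": 0.5}}
    m = {"q0": 1.0}
    for sym in "aa":
        m = ffa_step(m, sym, delta)
    print(m)  # {'q0': 0.3, 'q1': 0.5}

After each symbol, several states can be occupied simultaneously to different degrees, which is exactly the behavior a DFA cannot express.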
Based on previous work on encoding DFAs
in discrete-time, second-order recurrent neural networks,
we propose an algorithm that constructs an augmented recurrent
neural network that encodes an FFA and
recognizes a given fuzzy regular language with arbitrary accuracy.
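In the prior DFA work, such networks update their state neurons as S_j(t+1) = g(sum_{i,k} W_jik S_i(t) I_k(t) + b_j), where g is a sigmoid and I is a one-hot input vector. The sketch below shows one such second-order update; the specific weight values (a single strength H) and the toy wiring are illustrative assumptions, not the paper's actual construction.

    import numpy as np

    def second_order_step(S, I, W, b):
        """One update of a discrete-time, second-order recurrent network:
        the pre-activation is bilinear in the state vector S and the
        one-hot input vector I."""
        pre = np.einsum("jik,i,k->j", W, S, I) + b
        return 1.0 / (1.0 + np.exp(-pre))  # logistic sigmoid

    # Toy network: 2 state neurons, 2 input symbols, weight strength H.
    H = 8.0
    n_neurons, n_symbols = 2, 2
    W = np.zeros((n_neurons, n_neurons, n_symbols))
    W[1, 0, 0] = H                  # symbol 0: active neuron 0 turns neuron 1 on
    W[0, 0, 1] = H                  # symbol 1: active neuron 0 stays on
    b = np.full(n_neurons, -H / 2)  # bias keeps undriven neurons near 0

    S = np.array([1.0, 0.0])        # start state: neuron 0 "on"
    for sym in (0, 1):
        I = np.eye(n_symbols)[sym]  # one-hot encoding of the input symbol
        S = second_order_step(S, I, W, b)
    print(S)                        # both neurons near 0 after reading "0 1"

The strength H plays the role of the synaptic weight strength whose effect on recognition performance is examined below.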
We then empirically verify the encoding methodology by
measuring the string recognition performance of recurrent neural networks
that encode large, randomly generated FFAs. In particular, we examine how the
networks' performance varies as a function of synaptic weight strength.
(Also cross-referenced as UMIACS-TR-96-12)