Stable Encoding of Large Finite-State Automata in Recurrent Neural Networks with Sigmoid Discriminants


Files

CS-TR-3337.ps (277.33 KB)
No. of downloads: 213
CS-TR-3337.pdf (247.81 KB)
No. of downloads: 874

Date

1998-10-15

Abstract

We propose an algorithm for encoding deterministic finite-state automata (DFAs) in second-order recurrent neural networks with a sigmoidal discriminant function, and we prove that the languages accepted by the constructed network and the DFA are identical. The desired finite-state network dynamics are achieved by programming only a small subset of all weights. A worst-case analysis reveals a relationship between the weight strength and the maximum allowed network size that guarantees finite-state behavior of the constructed network.
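
The construction described above can be sketched in a few lines of Python. The following is a minimal illustration of the general idea under stated assumptions, not the report's exact construction: one sigmoid neuron per DFA state, a one-hot input per alphabet symbol, a single programmed weight strength H, and biases of -H/2. The names `encode_dfa` and `run` and the 3-state example DFA are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode_dfa(delta, n_states, n_symbols, H=8.0):
    """Program a sparse set of second-order weights W[i, j, k]: reading
    symbol k while state neuron j is high drives neuron delta[j][k] high
    and drives neuron j low (unless the transition is a self-loop)."""
    W = np.zeros((n_states, n_states, n_symbols))
    b = np.full(n_states, -H / 2.0)   # biases keep unaddressed neurons low
    for j in range(n_states):
        for k in range(n_symbols):
            i = delta[j][k]
            W[i, j, k] += H           # excite the target state neuron
            if i != j:
                W[j, j, k] -= H       # inhibit the source state neuron
    return W, b

def run(W, b, start, symbols):
    """Iterate S_i <- sigmoid(b_i + sum_j W[i, j, k] * S_j) over the input."""
    S = np.zeros(W.shape[0])
    S[start] = 1.0
    for k in symbols:
        S = sigmoid(b + W[:, :, k] @ S)
    return S

# Hypothetical 3-state DFA over {0, 1}: delta[state][symbol] -> next state.
delta = [[1, 0], [2, 1], [0, 2]]
accepting = {0}
W, b = encode_dfa(delta, n_states=3, n_symbols=2)
S = run(W, b, start=0, symbols=[0, 1, 1, 0])
print("network accepts:", any(S[q] > 0.5 for q in accepting))
```

Because only the weights touched by actual DFA transitions are programmed, the number of nonzero weights grows with the number of transitions rather than with the full weight tensor.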

We illustrate the method by encoding random DFAs with 10, 100, and 1,000 states. While the theory predicts that the weight strength scales with DFA size, we find the weight strength to be almost constant across all experiments. These results can be explained by noting that the generated DFAs represent average cases. We empirically demonstrate the existence of extreme DFAs for which the weight strength does scale with DFA size. (Also cross-referenced as UMIACS-TR-94-101)
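
Building on the sketch above (it reuses `encode_dfa`, `run`, and `sigmoid`), an experiment in the spirit of the one described here can be imitated by drawing random transition tables and checking, on sampled strings, whether the network's most active state neuron tracks the exact DFA state for a fixed weight strength. The DFA sizes mirror the abstract, but the particular H, string lengths, and agreement test are illustrative assumptions rather than the report's protocol.

```python
import random
import numpy as np

def random_dfa(n_states, n_symbols=2, seed=0):
    """Draw a uniformly random transition table delta[state][symbol]."""
    rng = random.Random(seed)
    return [[rng.randrange(n_states) for _ in range(n_symbols)]
            for _ in range(n_states)]

def agrees(delta, W, b, trials=50, length=30, seed=1):
    """Check that the most active neuron matches the DFA state on random strings."""
    rng = random.Random(seed)
    n_symbols = W.shape[2]
    for _ in range(trials):
        symbols = [rng.randrange(n_symbols) for _ in range(length)]
        q = 0
        for k in symbols:
            q = delta[q][k]                 # exact DFA state
        S = run(W, b, start=0, symbols=symbols)
        if int(np.argmax(S)) != q:          # network's "current state"
            return False
    return True

for n in (10, 100, 1000):
    delta = random_dfa(n)
    W, b = encode_dfa(delta, n_states=n, n_symbols=2, H=10.0)
    print(f"{n:5d} states, H = 10: agree = {agrees(delta, W, b)}")
```

Keeping H fixed across the three sizes is what makes the run an (informal) analogue of the average-case observation in the abstract; a worst-case construction would instead require H to grow with the DFA size.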
