Author
Omlin, Christian W. P.
Other Contributors
Giles, C. Lee; Rogers, Edwin H.; Ji, Chuanyi; Krishnamoorthy, M. S.; Walker, Ellen L.;
Date Issued
1994-12
Subject
Computer Science
Degree
PhD;
Terms of Use
This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute, Troy, NY. Copyright of original work retained by author.;
Abstract
In recent years, there has been a renewed interest in artificial neural networks. The discovery of new learning algorithms has made them promising tools in diverse applications such as pattern recognition, signal processing, knowledge acquisition for expert systems, prediction of protein structures, and dynamical system modeling. Recurrent neural networks, which are able to store state information over indefinite time spans, are particularly well-suited for modeling dynamical systems such as stock markets, speech, and physical systems.

In this thesis, we examine how symbolic knowledge is represented in recurrent networks trained to recognize regular languages. We demonstrate methods for inserting and extracting the learned languages in the form of deterministic finite-state automata (DFAs). We focus our attention on the quality of the extracted grammatical rules and on how partial prior knowledge can be effectively utilized to improve training performance. We prove that DFAs can be naturally embedded in a special recurrent network architecture with higher-order weights such that the finite-state encoding remains stable indefinitely. We investigate means for constructing neural DFAs which are tolerant to faults in their internal structure.;
Description
December 1994; School of Science
Department
Dept. of Computer Science;
Publisher
Rensselaer Polytechnic Institute, Troy, NY
Relationships
Rensselaer Theses and Dissertations Online Collection;
Access
Restricted to current Rensselaer faculty, staff and students. Access inquiries may be directed to the Rensselaer Libraries.;