ERIC Number: ED377709
Record Type: Non-Journal
Publication Date: 1994
Pages: 13
Abstractor: N/A
Reference Count: N/A
Recurrent Artificial Neural Networks and Finite State Natural Language Processing.
Moisl, Hermann
It is argued that pessimistic assessments of the adequacy of artificial neural networks (ANNs) for natural language processing (NLP), made on the grounds that ANNs have a finite state architecture, are unjustified, and that their adequacy in this regard is an empirical issue. First, arguments are presented countering standard objections to finite state NLP, on the grounds that these objections confuse the explanatory aims of linguistic theory with the essentially technological aims of NLP. A finite state NLP model that maps strings onto meaning representations is then proposed, and preliminary string-processing and cluster-analysis results from a computer simulation are presented, along with a brief discussion of some problems and further developments. It is concluded that it remains to be seen whether the model can be developed for general NLP, but that it exemplifies the kind of radical departure from linguistics-based NLP that becomes possible once the supposed theoretical obstacles to finite state NLP are removed. In particular, the model departs from linguistics-based NLP in making no use of any syntactic or compositional structure beyond the purely sequential, and amounts to a table-lookup mapping from strings to meaning representations. Contains 33 references. (MSE)
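The abstract's core idea, a recurrent network with a fixed-size hidden state (hence finite-state in character) mapping strings onto meaning representations, can be illustrated with a minimal sketch. This is not Moisl's actual model, which the record does not specify; it is a toy Elman-style simple recurrent network with invented vocabulary, dimensions, and untrained random weights, shown only to make the string-to-meaning-vector mapping concrete.

```python
import numpy as np

# Illustrative sketch only (not the paper's model): an Elman-style simple
# recurrent network. The hidden state has a fixed, finite size, so the
# network is finite-state in character; equal input strings always drive
# it to equal states, so the string -> meaning mapping behaves like a
# table lookup. Vocabulary, sizes, and weights are all hypothetical.

rng = np.random.default_rng(0)

VOCAB = {"the": 0, "dog": 1, "barks": 2}   # hypothetical toy vocabulary
HIDDEN = 4                                  # fixed hidden-state dimension
MEANING = 3                                 # meaning-representation dimension

W_in = rng.normal(size=(HIDDEN, len(VOCAB)))   # input -> hidden weights
W_rec = rng.normal(size=(HIDDEN, HIDDEN))      # hidden -> hidden (recurrent)
W_out = rng.normal(size=(MEANING, HIDDEN))     # hidden -> meaning output

def one_hot(token):
    """Encode a vocabulary token as a one-hot input vector."""
    v = np.zeros(len(VOCAB))
    v[VOCAB[token]] = 1.0
    return v

def meaning_of(string):
    """Process the token string sequentially; the final hidden state
    determines the meaning-representation vector. No syntactic or
    compositional structure beyond sequence order is used."""
    h = np.zeros(HIDDEN)
    for token in string:
        h = np.tanh(W_in @ one_hot(token) + W_rec @ h)
    return W_out @ h

m = meaning_of(["the", "dog", "barks"])
print(m.shape)
```

Because the state space is finite (a fixed-dimension vector), the mapping is purely sequential: each token updates the state, and the meaning vector is read off at the end, with no parse tree or constituent structure involved.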
Publication Type: Reports - Evaluative; Opinion Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A