Peer reviewed
ERIC Number: EJ799382
Record Type: Journal
Publication Date: 2008-Jun
Pages: 14
Abstractor: As Provided
ISBN: N/A
ISSN: 0364-0213
EISSN: N/A
Large-Scale Modeling of Wordform Learning and Representation
Sibley, Daragh E.; Kello, Christopher T.; Plaut, David C.; Elman, Jeffrey L.
Cognitive Science, v32 n4 p741-754 Jun 2008
The forms of words as they appear in text and speech are central to theories and models of lexical processing. Nonetheless, current methods for simulating their learning and representation fail to approach the scale and heterogeneity of real wordform lexicons. A connectionist architecture termed the "sequence encoder" is used to learn nearly 75,000 wordform representations through exposure to strings of stress-marked phonemes or letters. First, the mechanisms and efficacy of the sequence encoder are demonstrated and shown to overcome problems with traditional slot-based codes. Then, two large-scale simulations are reported that learned to represent lexicons of either phonological or orthographic wordforms. In doing so, the models learned the statistics of their lexicons, as shown by better processing of well-formed pseudowords than of ill-formed (scrambled) pseudowords, and by accounting for variance in well-formedness ratings. The authors discuss how the sequence encoder may be integrated into broader models of lexical processing. (Contains 4 figures.)
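To make the abstract's architecture concrete: a sequence encoder of this kind compresses a variable-length symbol string into a fixed-width vector and then reproduces the string from that vector, sidestepping slot-based codes that assign each symbol a fixed position. The following is a minimal sketch of such an autoencoder, not the authors' implementation; the use of PyTorch GRUs, the layer sizes, and all names are illustrative assumptions.

# Minimal sketch of a sequence-encoder-style autoencoder (illustrative only;
# not the architecture reported in the article).
import torch
import torch.nn as nn

class SequenceEncoder(nn.Module):
    def __init__(self, n_symbols: int, hidden: int = 200):
        super().__init__()
        self.embed = nn.Embedding(n_symbols, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_symbols)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # Encode: the encoder's final hidden state serves as the
        # fixed-width wordform representation.
        _, h = self.encoder(self.embed(seq))
        # Decode: feed the representation at every step and predict
        # the symbol at each position of the string.
        steps = seq.size(1)
        rep = h.transpose(0, 1).expand(-1, steps, -1).contiguous()
        out, _ = self.decoder(rep, h)
        return self.readout(out)  # logits over symbols per position

# Usage: reconstruction loss over a batch of symbol-ID strings
# (e.g., stress-marked phonemes mapped to integers).
model = SequenceEncoder(n_symbols=50)
batch = torch.randint(0, 50, (8, 6))  # 8 strings of 6 symbol IDs
logits = model(batch)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 50), batch.reshape(-1))
loss.backward()

Under this setup, a lexicon's statistics are absorbed into the encoder weights, so well-formed pseudowords would reconstruct with lower error than scrambled ones, which is the kind of effect the abstract describes.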
Lawrence Erlbaum. Available from: Taylor & Francis, Ltd., 325 Chestnut Street, Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site: http://www.tandf.co.uk/journals/default.html
Publication Type: Journal Articles; Reports - Descriptive
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: United States
Grant or Contract Numbers: N/A