ERIC Number: EJ1139379
Record Type: Journal
Publication Date: 2017-Apr
Abstractor: As Provided
Reference Count: 49
Semantic Coherence Facilitates Distributional Learning
Ouyang, Long; Boroditsky, Lera; Frank, Michael C.
Cognitive Science, v41 suppl 4 p855-884 Apr 2017
Computational models have shown that purely statistical knowledge about words' linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that "postman" and "mailman" are semantically similar because they have quantitatively similar patterns of association with other words (e.g., they both tend to occur with words like "deliver," "truck," "package"). In contrast to these computational results, artificial language learning experiments suggest that distributional statistics alone do not facilitate learning of linguistic categories. However, experiments in this paradigm expose participants to entirely novel words, whereas real language learners encounter input that contains some known words that are semantically organized. In three experiments, we show that (a) the presence of familiar semantic reference points facilitates distributional learning and (b) this effect crucially depends both on the presence of known words and the adherence of these known words to some semantic organization.
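The distributional inference described in the abstract (e.g., that "postman" and "mailman" are similar because they co-occur with the same context words) can be sketched with a toy co-occurrence model. This is an illustrative example only, not the authors' models: the corpus, window size, and cosine-similarity measure are assumptions chosen for clarity.

```python
from collections import Counter
from math import sqrt

# Toy corpus (assumed for illustration): "postman" and "mailman"
# appear in quantitatively similar contexts; "dog" does not.
corpus = [
    "the postman will deliver the package",
    "the mailman will deliver the package",
    "the postman drove the truck",
    "the mailman drove the truck",
    "the dog chased the cat",
]

def context_vector(word, sentences, window=2):
    """Count words co-occurring with `word` within `window` tokens."""
    counts = Counter()
    for sent in sentences:
        tokens = sent.split()
        for i, tok in enumerate(tokens):
            if tok == word:
                lo, hi = max(0, i - window), i + window + 1
                counts.update(t for t in tokens[lo:hi] if t != word)
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

v_post = context_vector("postman", corpus)
v_mail = context_vector("mailman", corpus)
v_dog = context_vector("dog", corpus)

# Words with overlapping contexts score higher than words without.
print(cosine(v_post, v_mail) > cosine(v_post, v_dog))
```

On this toy input, "postman" and "mailman" share context words like "deliver" and "truck," so their similarity exceeds that of the unrelated pair; real distributional models apply the same idea at corpus scale.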
Descriptors: Semiotics, Computational Linguistics, Syntax, Semantics, Correlation, Language Research, Linguistic Input, Vocabulary Development, Inferences, Models, Artificial Languages, Language Acquisition, Learning Processes, Connected Discourse
Wiley-Blackwell. 350 Main Street, Malden, MA 02148. Tel: 800-835-6770; Tel: 781-388-8598; Fax: 781-388-8232; e-mail: email@example.com; Web site: http://www.wiley.com/WileyCDA
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Sponsor: National Science Foundation (NSF)
Authoring Institution: N/A
Grant or Contract Numbers: BCS1058119