Peer reviewed
ERIC Number: EJ746261
Record Type: Journal
Publication Date: 2006-Nov
Pages: 14
Abstractor: Author
ISBN: N/A
ISSN: ISSN-0749-596X
EISSN: N/A
Available Date: N/A
Differentiating the Differentiation Models: A Comparison of the Retrieving Effectively from Memory Model (REM) and the Subjective Likelihood Model (SLiM)
Criss, Amy H.; McClelland, James L.
Journal of Memory and Language, v55 n4 p447-460 Nov 2006
The subjective likelihood model [SLiM; McClelland, J. L., & Chappell, M. (1998). Familiarity breeds differentiation: a subjective-likelihood approach to the effects of experience in recognition memory. "Psychological Review," 105(4), 734-760.] and the retrieving effectively from memory model [REM; Shiffrin, R. M., & Steyvers, M. (1997). A model for recognition memory: REM--Retrieving effectively from memory. "Psychonomic Bulletin & Review," 4, 145-166.] are often considered indistinguishable models. Indeed, both share core assumptions, including a Bayesian decision process and differentiation during encoding. We give a brief tutorial on each model and conduct simulations showing cases where they diverge. The first two simulations show that for foils that are similar to a studied item, REM predicts higher false alarm rates than SLiM. Thus REM cannot account for certain associative recognition data without using emergent features to represent pairs; without this assumption, rearranged pairs have too strong an effect. In contrast, SLiM does not require this assumption. The third simulation shows that SLiM predicts a reversal in the low-frequency hit rate advantage as a function of study time. This prediction is tested and confirmed in an experiment.
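The Bayesian decision process the abstract refers to can be illustrated with a toy version of REM's recognition rule (following Shiffrin & Steyvers, 1997): items are vectors of geometrically distributed feature values, study produces noisy traces, and a test probe is judged "old" when the averaged likelihood ratio (odds) across traces exceeds 1. This is a minimal illustrative sketch, not the authors' simulation code; the parameter values (`g`, `c`, `u`, vector length) are assumptions chosen only for demonstration.

```python
import random

def geom(g, rng):
    # Draw a feature value from a geometric distribution on {1, 2, ...}.
    v = 1
    while rng.random() > g:
        v += 1
    return v

def make_item(n_feat, g, rng):
    return [geom(g, rng) for _ in range(n_feat)]

def study(item, c, u, g, rng):
    # Encode a noisy trace: each feature is stored with probability u;
    # a stored feature is correct with probability c, else a random value.
    # 0 marks an empty (unstored) slot.
    trace = []
    for f in item:
        if rng.random() < u:
            trace.append(f if rng.random() < c else geom(g, rng))
        else:
            trace.append(0)
    return trace

def likelihood_ratio(probe, trace, c, g):
    # P(trace | probe is the studied item) / P(trace | probe is new),
    # accumulated feature by feature; empty slots are uninformative.
    lam = 1.0
    for p, t in zip(probe, trace):
        if t == 0:
            continue
        base = g * (1 - g) ** (t - 1)   # chance probability of value t
        if p == t:
            lam *= (c + (1 - c) * base) / base
        else:
            lam *= (1 - c)
    return lam

def odds(probe, traces, c, g):
    # REM's familiarity: mean likelihood ratio over all stored traces.
    return sum(likelihood_ratio(probe, t, c, g) for t in traces) / len(traces)

def simulate(n_items=20, n_feat=20, g=0.4, c=0.7, u=0.75, trials=200, seed=0):
    # Estimate hit and false-alarm rates for the "respond old if odds > 1" rule.
    rng = random.Random(seed)
    hits = fas = 0
    for _ in range(trials):
        items = [make_item(n_feat, g, rng) for _ in range(n_items)]
        traces = [study(it, c, u, g, rng) for it in items]
        foil = make_item(n_feat, g, rng)       # unrelated new item
        if odds(items[0], traces, c, g) > 1.0:
            hits += 1
        if odds(foil, traces, c, g) > 1.0:
            fas += 1
    return hits / trials, fas / trials
```

Under these assumed parameters the studied probe yields odds well above 1 far more often than the unrelated foil, reproducing the basic old/new discrimination the models are built on; the paper's divergent predictions concern foils similar to studied items, which this sketch does not model.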
Elsevier. 6277 Sea Harbor Drive, Orlando, FL 32887-4800. Tel: 877-839-7126; Tel: 407-345-4020; Fax: 407-363-1354; e-mail: usjcs@elsevier.com; Web site: http://www.elsevier.com.
Publication Type: Journal Articles; Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A