ERIC Number: ED483385
Record Type: Non-Journal
Publication Date: 2004-Jan
Specifying and Refining a Measurement Model for a Simulation-Based Assessment. CSE Report 619.
Levy, Roy; Mislevy, Robert J.
US Department of Education
The challenges of modeling students' performance in simulation-based assessments include accounting for multiple aspects of knowledge and skill that arise in different situations, and for the conditional dependencies among aspects of performance in a complex assessment. This paper describes a Bayesian approach to specifying and estimating cognitive models in such situations, in terms of both the statistical machinery and actual instrument development. The method taps the knowledge of experts to provide initial estimates for the probabilistic relationships among the variables in a multivariate latent variable model, then refines these estimates using Markov chain Monte Carlo (MCMC) procedures. This process is illustrated in the context of NetPASS, a complex simulation-based assessment in the domain of computer networking. The paper describes a parameterization of the relationships in NetPASS via an ordered polytomous item response model and details the updating of the model with observed data via Bayesian statistical procedures, with estimation ultimately carried out by Markov chain Monte Carlo methods.
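The kind of analysis the abstract describes can be illustrated with a minimal sketch: an ordered polytomous (graded-response) item is fitted by random-walk Metropolis MCMC, with priors standing in for expert-elicited initial estimates. All specifics here (the item parameters, priors, sample sizes, and proposal scales) are illustrative assumptions, not details taken from the NetPASS report.

```python
import math
import random

random.seed(7)


def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))


def grm_probs(theta, a, b):
    """Category probabilities for a graded response item.

    P(X >= k) = logistic(a * (theta - b[k-1])); category probabilities
    are differences of adjacent cumulative probabilities.
    """
    star = [1.0] + [logistic(a * (theta - bk)) for bk in b] + [0.0]
    return [star[k] - star[k + 1] for k in range(len(b) + 1)]


# Simulate responses from a 3-category item with known "true" parameters
# (hypothetical values chosen for illustration only).
true_a, true_b = 1.2, [-0.5, 0.8]
thetas = [random.gauss(0.0, 1.0) for _ in range(300)]


def draw(p):
    u, c = random.random(), 0.0
    for k, pk in enumerate(p):
        c += pk
        if u <= c:
            return k
    return len(p) - 1


data = [draw(grm_probs(t, true_a, true_b)) for t in thetas]


def log_post(a, b):
    """Log posterior: informative priors (the 'expert' layer) + likelihood."""
    if a <= 0.0 or b[0] >= b[1]:
        return -math.inf
    lp = -0.5 * math.log(a) ** 2          # lognormal(0, 1) prior on discrimination
    lp += sum(-0.5 * bk * bk for bk in b)  # N(0, 1) priors on thresholds
    for t, x in zip(thetas, data):
        lp += math.log(grm_probs(t, a, b)[x] + 1e-300)
    return lp


# Random-walk Metropolis over (a, b1, b2): propose a small jitter,
# accept with probability min(1, posterior ratio).
a, b = 1.0, [-1.0, 1.0]
cur = log_post(a, b)
samples = []
for it in range(4000):
    a_prop = a + random.gauss(0.0, 0.1)
    b_prop = [bk + random.gauss(0.0, 0.1) for bk in b]
    prop = log_post(a_prop, b_prop)
    if math.log(max(random.random(), 1e-300)) < prop - cur:
        a, b, cur = a_prop, b_prop, prop
    if it >= 2000:                         # discard burn-in draws
        samples.append(a)

post_mean_a = sum(samples) / len(samples)
print("posterior mean discrimination:", round(post_mean_a, 2))
```

The prior here plays the role the abstract assigns to expert judgment: it centers the chain on the initial estimates, and the observed response data pull the posterior toward the data-supported values. The full NetPASS model replaces this single item with a multivariate latent variable structure over many observables.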
Descriptors: Probability, Markov Processes, Monte Carlo Methods, Bayesian Statistics, Evaluation Methods, Models, Statistical Inference, Cognitive Development, Scores, Student Evaluation, Computer Simulation
Center for the Study of Evaluation (CSE), National Center for Research on Evaluation, Standards, and Student Testing (CRESST), Graduate School of Education & Information Studies, University of California, Los Angeles, Los Angeles, CA 90095-1522. Tel: 310-206-1532.
Publication Type: Reports - Evaluative
Education Level: N/A
Sponsor: Institute of Education Sciences (ED), Washington, DC.
Authoring Institution: Center for Research on Evaluation, Standards, and Student Testing, Los Angeles, CA.