ERIC Number: ED512656
Record Type: Non-Journal
Publication Date: 2010-Aug
Pages: 17
Abstractor: As Provided
Reference Count: 10
ISBN: N/A
ISSN: N/A
Automatic Assessment of Complex Task Performance in Games and Simulations. CRESST Report 775
Iseli, Markus R.; Koenig, Alan D.; Lee, John J.; Wainess, Richard
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
Assessment of complex task performance is crucial to evaluating personnel in critical job functions such as Navy damage control operations aboard ships. Games and simulations can be instrumental in this process, as they can present a broad range of complex scenarios without harm to people or property. However, "automatic" performance assessment of complex tasks is challenging, because it requires modeling how experts reason about a series of observed in-game actions. Human expert scoring can also be limiting, as it depends on subjective observations of a player's in-game performance, which in turn are used to interpret the player's mastery of the associated cognitive constructs. We introduce a computational framework that supports automatic performance assessment of complex tasks or action sequences, as well as the modeling of real-world, simulated, or cognitive processes, by representing player actions, simulation states and events, conditional simulation state transitions, and cognitive construct dependencies with a dynamic Bayesian network. This novel approach combines a state-space model with the probabilistic framework of Bayesian statistics, allowing us to draw probabilistic inferences about a player's decision-making abilities. We then present a comparison of human expert scoring and dynamic Bayesian network scoring. The computational framework presented in this report can reduce or eliminate the need for human raters and decrease scoring time, potentially reducing costs. In addition, it can facilitate the efficient aggregation, standardization, and reporting of scores. (Contains 3 tables and 5 figures.)
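The core idea in the abstract — inferring a player's latent abilities from observed in-game actions with a dynamic Bayesian network — can be illustrated with a minimal sketch. The two-state mastery variable, the transition and emission probabilities, and the action labels below are all hypothetical illustrations, not values from the report; the sketch performs step-by-step forward filtering (the simplest DBN inference) over a sequence of observed actions.

```python
# Hypothetical two-state dynamic Bayesian network: a latent "mastery" state
# evolves over time and emits observable in-game actions. Forward filtering
# yields P(mastery | actions observed so far) after each step.

# P(mastery_t | mastery_{t-1}): rows = previous state, cols = next state
TRANSITION = {
    "novice": {"novice": 0.8, "master": 0.2},
    "master": {"novice": 0.1, "master": 0.9},
}

# P(observed action | mastery state) -- illustrative emission model
EMISSION = {
    "novice": {"correct": 0.4, "incorrect": 0.6},
    "master": {"correct": 0.9, "incorrect": 0.1},
}

def filter_mastery(actions, prior=None):
    """Return the posterior over mastery after each observed action."""
    belief = dict(prior or {"novice": 0.5, "master": 0.5})
    history = []
    for action in actions:
        # Predict: propagate the belief through the transition model.
        predicted = {
            s: sum(belief[p] * TRANSITION[p][s] for p in belief)
            for s in belief
        }
        # Update: weight each state by the likelihood of the observation.
        unnorm = {s: predicted[s] * EMISSION[s][action] for s in predicted}
        total = sum(unnorm.values())
        belief = {s: v / total for s, v in unnorm.items()}
        history.append(dict(belief))
    return history

if __name__ == "__main__":
    trace = filter_mastery(["correct", "correct", "incorrect", "correct"])
    for t, b in enumerate(trace, 1):
        print(f"step {t}: P(master) = {b['master']:.3f}")
```

In the full framework described by the report, the network would also condition on simulation states, events, and dependencies among cognitive constructs; this sketch keeps only the temporal filtering step that makes the scoring automatic.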
National Center for Research on Evaluation, Standards, and Student Testing (CRESST). 300 Charles E Young Drive N, GSE&IS Building 3rd Floor, Mailbox 951522, Los Angeles, CA 90095-1522. Tel: 310-206-1532; Fax: 310-825-3883; Web site: http://www.cresst.org
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Office of Naval Research, Arlington, VA.
Authoring Institution: National Center for Research on Evaluation, Standards, and Student Testing