ERIC Number: EJ1036018
Record Type: Journal
Publication Date: 2013-Dec
Abstractor: As Provided
Reference Count: 30
Multiple Choice Questions Can Be Designed or Revised to Challenge Learners' Critical Thinking
Tractenberg, Rochelle E.; Gushta, Matthew M.; Mulroney, Susan E.; Weissinger, Peggy A.
Advances in Health Sciences Education, v18 n5 p945-961 Dec 2013
Multiple choice (MC) questions from a graduate physiology course were evaluated by cognitive-psychology (but not physiology) experts, and analyzed statistically, in order to test the independence of content expertise and cognitive complexity ratings of MC items. Integrating higher order thinking into MC exams is important but widely known to be challenging, perhaps especially when content experts must think like novices. Expertise in the domain (content) may actually impede the creation of higher-complexity items. Three cognitive psychology experts independently rated the cognitive complexity of 252 multiple-choice physiology items using a six-level cognitive complexity matrix synthesized from the literature. Rasch modeling estimated item difficulties. The complexity ratings and difficulty estimates were then analyzed together to determine the relative contributions (and independence) of complexity and difficulty to the likelihood of a correct answer on each item. Cognitive complexity was found to be statistically independent of the difficulty estimates for 88% of items. Using the complexity matrix, modifications were identified that would increase some items' complexity by one level without affecting their difficulty. Cognitive complexity can effectively be rated by non-content experts. The six-level complexity matrix, if applied by faculty peer groups trained in cognitive complexity but without domain-specific expertise, could lead to improvements in the complexity targeted during item writing and revision. Targeting higher order thinking with MC questions "can" be achieved without changing item difficulties or other test characteristics, but this may be less likely if the content expert is left to assess items within their own domain of expertise.
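The abstract pairs expert complexity ratings with Rasch-estimated item difficulties. As context for readers unfamiliar with the model, the following is a minimal sketch of the dichotomous Rasch model the study's difficulty estimates are based on; the function names and the crude difficulty estimator (which assumes a reference ability of zero) are illustrative, not the authors' actual estimation procedure.

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability that a person with
    ability theta answers an item of difficulty b correctly,
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_difficulty(prop_correct):
    """Crude difficulty estimate from an item's proportion correct,
    assuming the group's mean ability is fixed at 0:
    b = ln((1 - p) / p). Easier items (high p) get lower b.
    Illustrative only; real Rasch software estimates abilities
    and difficulties jointly."""
    return math.log((1.0 - prop_correct) / prop_correct)
```

Under this model, difficulty is a property of the item's response data alone, which is why the study can ask whether a separately rated quantity (cognitive complexity) varies independently of it.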
Descriptors: Multiple Choice Tests, Critical Thinking, Graduate Study, Physiology, Science Instruction, Statistical Analysis, Expertise, Cognitive Psychology, Test Construction, Cognitive Processes, Models, Item Response Theory
Springer. 233 Spring Street, New York, NY 10013. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-348-4505; e-mail: email@example.com; Web site: http://www.springerlink.com
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Authoring Institution: N/A