ERIC Number: ED358114
Record Type: Non-Journal
Publication Date: 1993-Apr
Pages: 16
Abstractor: N/A
Reference Count: N/A
Scoring Rubrics for Performance Tests: Lessons Learned from Job Performance Assessment in the Military.
Wise, Lauress
Industrial and organizational psychologists for the Department of Defense have been working for the past 10 years to develop high-fidelity measures of job performance for use in validating job selection procedures and standards. Information on developing and scoring performance exercises in the Job Performance Measurement (JPM) Project is presented, and lessons that might be useful in education are extracted. In many ways, the task of the industrial psychologist is easier than that of the educator because of broader agreement about how the task should be performed and closer alignment between training and expected performance. Tasks identified by each Armed Service were analyzed, and scoring rules were developed. The following lessons seem especially pertinent to educational assessment: (1) careful specification of the domains assessed is essential for evaluating the adequacy of any sample selected; (2) scoring elements that assess adherence to processes that are taught will have better diagnostic value (and possibly greater validity) than those that merely reflect the quality of output; (3) scoring procedures must be anchored to observable criteria; and (4) generalizability theory provides a useful framework for evaluating alternative scoring rubrics. One table lists the JPM occupational specialties, and two figures illustrate the discussion. An attachment summarizes the lessons to be learned. (SLD)
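To make lesson (4) concrete, the following is a minimal sketch of how generalizability theory can be applied to a scoring design: a fully crossed persons-by-raters G-study, with variance components estimated by random-effects ANOVA and a relative G-coefficient computed for the mean over the raters. The function name and the sample score matrix are illustrative assumptions, not data or methods from the JPM Project or the paper itself.

```python
# Hypothetical G-study sketch for a fully crossed persons x raters design.
# Variance components are estimated from the classic random-effects ANOVA
# expected mean squares; the relative G-coefficient indicates how well
# rubric scores generalize across raters.

def g_coefficient(scores):
    """scores[p][r] = rating given to person p by rater r."""
    n_p, n_r = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n_p * n_r)
    person_means = [sum(row) / n_r for row in scores]
    rater_means = [sum(scores[p][r] for p in range(n_p)) / n_p
                   for r in range(n_r)]

    # Sums of squares for persons, raters, and residual (interaction + error).
    ss_p = n_r * sum((m - grand) ** 2 for m in person_means)
    ss_r = n_p * sum((m - grand) ** 2 for m in rater_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_res = ss_total - ss_p - ss_r

    ms_p = ss_p / (n_p - 1)
    ms_res = ss_res / ((n_p - 1) * (n_r - 1))

    var_person = max((ms_p - ms_res) / n_r, 0.0)  # universe-score variance
    var_residual = ms_res                         # person-by-rater + error
    # Relative G-coefficient for a score averaged over n_r raters.
    return var_person / (var_person + var_residual / n_r)

ratings = [  # 4 examinees scored by 3 raters (made-up illustration data)
    [4, 5, 5],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
]
print(round(g_coefficient(ratings), 3))
```

Under this sketch, comparing alternative rubrics amounts to comparing the G-coefficients their score matrices yield: a rubric whose residual (rater-dependent) variance is small relative to true between-examinee variance will generalize well even with few raters.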
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Defense Manpower Data Center, Monterey, CA.