ERIC Number: ED545953
Record Type: Non-Journal
Publication Date: 2003-Apr
Pages: 47
Abstractor: ERIC
Reference Count: 5
NAEP Validity Studies: An Investigation of Why Students Do Not Respond to Questions. Working Paper No. 2003-12
Jakwerth, Pamela R.; Stancavage, Frances B.; Reed, Ellen D.
National Center for Education Statistics
Over the past decade, developers of the National Assessment of Educational Progress (NAEP) have substantially changed the mix of item types on the NAEP assessments, decreasing the number of multiple-choice questions and increasing the number of questions requiring short or extended constructed responses. These changes have been motivated largely by efforts to encompass the more complex learning outcomes being codified by new curriculum and assessment standards in a number of subject areas. That is, NAEP has attempted to align with widely endorsed recommendations for a greater instructional focus on higher-order thinking skills, as well as for assessments that better allow students to demonstrate such skills. With the inclusion of short and extended constructed-response questions on the NAEP assessments, however, researchers have begun to notice unacceptably high student nonresponse rates (Koretz et al. 1993). As a result, NAEP reports, analyses, and subsequent conclusions may be confounded by the fact that large numbers of students are not answering some of the questions. Additionally, nonresponse rates seem to vary with student characteristics such as gender and race, which may further affect the validity of NAEP conclusions. In this study, the authors explored potential reasons for students' omission of responses to assessment questions. Understanding why students fail to answer certain questions may help inform the proper treatment of missing data during the estimation of item parameters and achievement distributions. It may also help test developers identify strategies for increasing response rates for particular types of questions or for particular groups of students. The study was exploratory, small in scope, and qualitative in nature.
The general approach was to visit schools where the 1998 eighth-grade national NAEP assessments in reading and civics were being conducted and, following the assessment sessions, to interview samples of students about their test-taking behaviors and their reasons for not answering particular questions. In their interviews the authors also attempted to determine whether the students could have correctly answered the questions they had left blank. Five research questions guided the study: (1) What reasons do students give for not answering questions on the eighth-grade reading and civics NAEP assessments? (2) Are students leaving particular types of questions unanswered on these two assessments more often than other types? (3) How valid is the assumption that students have attempted, and then passed over, questions left blank in the middle of an item block, but have not attempted questions left blank at the end of an item block? (4) How valid is the assumption that if students skip items in the middle of an item block, they do not know the answers? (5) What modifications can be made to NAEP assessments to decrease the number of questions left unanswered? [Commissioned by the NAEP Validity Studies (NVS) Panel (formed by the American Institutes for Research under contract with the National Center for Education Statistics), March 1999.]
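Research questions 3 and 4 turn on the distinction between blanks in the middle of an item block (assumed to be "omitted", i.e., attempted and passed over) and a trailing run of blanks (assumed to be "not reached"). A minimal sketch of that classification rule, assuming a simple list representation of a student's item block in which `None` marks an unanswered item (the data layout is illustrative, not NAEP's actual format):

```python
def classify_blanks(responses):
    """Label each blank item as 'omitted' or 'not reached'.

    responses: a student's answers in block order; None marks a blank.
    Blanks before the last answered item are treated as omitted
    (attempted and skipped); blanks after it as not reached.
    """
    # Locate the last answered item; everything blank after it
    # is assumed never to have been attempted.
    last_answered = -1
    for i, r in enumerate(responses):
        if r is not None:
            last_answered = i

    labels = {}
    for i, r in enumerate(responses):
        if r is None:
            labels[i] = "omitted" if i < last_answered else "not reached"
    return labels

# Item 1 is skipped mid-block; items 3 and 4 trail the last answer.
print(classify_blanks(["A", None, "C", None, None]))
# → {1: 'omitted', 3: 'not reached', 4: 'not reached'}
```

The study's interviews probe exactly whether these two labels match what students actually did, since scoring typically treats the two categories of missing data differently.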
National Center for Education Statistics. Available from: ED Pubs. P.O. Box 1398, Jessup, MD 20794-1398. Tel: 877-433-7827.
Publication Type: Reports - Research
Education Level: Grade 8; Junior High Schools; Middle Schools; Elementary Education; Secondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: National Center for Education Statistics (ED)
Identifiers - Location: California; Virginia
Identifiers - Assessments and Surveys: National Assessment of Educational Progress
IES Funded: Yes