50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues its long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC here (PDF).

Showing 1 to 15 of 95 results
Peer reviewed
Direct link
Levy, Roy – Educational Assessment, 2013
This article characterizes the advances, opportunities, and challenges for psychometrics of simulation-based assessments through a lens that views assessment as evidentiary reasoning. Simulation-based tasks offer the prospect for student experiences that differ from traditional assessment. Such tasks may be used to support evidentiary arguments…
Descriptors: Simulation, Student Evaluation, Psychometrics, Evidence
Peer reviewed
Direct link
Correnti, Richard; Matsumura, Lindsay Clare; Hamilton, Laura S.; Wang, Elaine – Educational Assessment, 2012
Guided by evidence that teachers contribute to student achievement outcomes, researchers have been reexamining how to study instruction and the classroom opportunities teachers create for students. We describe our experience measuring students' opportunities to develop analytic, text-based writing skills. Utilizing multiple methods of data…
Descriptors: Writing Skills, Skill Development, Educational Opportunities, Educational Quality
Peer reviewed
Direct link
Taut, Sandy; Santelices, Maria Veronica; Stecher, Brian – Educational Assessment, 2012
The task of validating a teacher assessment and improvement system is similar whether the system operates in the United States or in another country. Chile has a national teacher evaluation system (NTES) that is standards based, uses multiple instruments, and is intended to serve both formative and summative purposes. For the past 6 years the…
Descriptors: Evidence, Foreign Countries, Teacher Evaluation, Standards
Peer reviewed
Direct link
Riggan, Matthew; Olah, Leslie Nabors – Educational Assessment, 2011
Promising research on the teaching and learning impact of classroom-embedded formative assessment has spawned interest in a broader array of assessment tools and practices, including interim assessment. Although researchers have begun to explore the impact of interim assessments in the classroom, like other assessment tools and practices, they…
Descriptors: Homework, Student Evaluation, Observation, Formative Evaluation
Peer reviewed
Direct link
Taylor, Catherine S.; Lee, Yoonsun – Educational Assessment, 2011
This article presents a study of ethnic Differential Item Functioning (DIF) for 4th-, 7th-, and 10th-grade reading items on a state criterion-referenced achievement test. The tests, administered 1997 to 2001, were composed of multiple-choice and constructed-response items. Item performance by focal groups (i.e., students from Asian/Pacific Island,…
Descriptors: Test Bias, Test Items, Pacific Islanders, American Indians
Peer reviewed
Direct link
Anderson, Daniel; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald – Educational Assessment, 2011
Students with disabilities participate in two major measurement systems. The Individuals with Disabilities Education Act emphasizes working within a Response to Intervention (RTI) framework to identify and monitor the progress of low-performing students. Persistent low-performing students also may be eligible for some form of an alternate…
Descriptors: Curriculum Based Assessment, Alternative Assessment, Learning Disabilities, Legislation
Peer reviewed
Direct link
Furtak, Erin Marie; Hardy, Ilonca; Beinbrech, Christina; Shavelson, Richard J.; Shemwell, Jonathan T. – Educational Assessment, 2010
This article adapts the Evidence-Based Reasoning (EBR) Framework (Brown, Furtak, Timms, Nagashima, & Wilson, this issue) to create a coding system for assessing argumentation in science classroom discourse. The instrument, "Evidence-Based Reasoning in Science Classroom Discourse", is intended to provide a means for measuring the quality of EBR in…
Descriptors: Science Education, Logical Thinking, Thinking Skills, Evidence
Peer reviewed
Direct link
Brown, Nathaniel J. S.; Nagashima, Sam O.; Fu, Alice; Timms, Michael; Wilson, Mark – Educational Assessment, 2010
The Evidence-Based Reasoning Assessment System (EBRAS) brings together advances in modeling scientific reasoning and assessment design to guide the development of written assessment items that target, disentangle, and elicit evidence of the multiple proficiencies underlying scientific argumentation. In this study, the EBRAS was used to assess the…
Descriptors: Measures (Individuals), Evidence, Persuasive Discourse, Logical Thinking
Peer reviewed
Direct link
Brown, Nathaniel J. S.; Furtak, Erin Marie; Timms, Michael; Nagashima, Sam O.; Wilson, Mark – Educational Assessment, 2010
Recent science education reforms have emphasized the importance of students engaging with and reasoning from evidence to develop scientific explanations. A number of studies have created frameworks based on Toulmin's (1958/2003) argument pattern, whereas others have developed systems for assessing the quality of students' reasoning to support…
Descriptors: Science Education, Logical Thinking, Thinking Skills, Evidence
Peer reviewed
Direct link
Karelitz, Tzur M.; Parrish, Deborah Montgomery; Yamada, Hiroyuki; Wilson, Mark – Educational Assessment, 2010
Assessment systems that track children's progress across time need to be sensitive to the variegated nature of development. Although instruments are commonly designed to assess behaviors within a specific age range, some children advance slower or faster than others and, as a result, often show behaviors from a younger or older age group. This…
Descriptors: Age Groups, Inferences, Test Validity, Test Reliability
Peer reviewed
Direct link
Wise, Steven L.; DeMars, Christine E. – Educational Assessment, 2010
Educational program assessment studies often use data from low-stakes tests to provide evidence of program quality. The validity of scores from such tests, however, is potentially threatened by examinee noneffort. This study investigated the extent to which one type of noneffort--rapid-guessing behavior--distorted the results from three types of…
Descriptors: Validity, Program Evaluation, Guessing (Tests), Motivation
Peer reviewed
Direct link
Petridou, Alexandra; Williams, Julian – Educational Assessment, 2010
The person-fit literature assumes that aberrant response patterns could be a sign of person mismeasurement, but this assumption has rarely, if ever, been empirically investigated before. We explore the validity of test responses and measures of 10-year-old examinees whose response patterns on a commercial standardized paper-and-pencil mathematics…
Descriptors: Validity, Measurement, Response Style (Tests), Scores
Peer reviewed
Direct link
Kim, Do-Hong; Huynh, Huynh – Educational Assessment, 2010
This study investigated whether scores obtained from the online and paper-and-pencil administrations of the statewide end-of-course English test were equivalent for students with and without disabilities. Score comparability was evaluated by examining equivalence of factor structure (measurement invariance) and differential item and bundle…
Descriptors: Computer Assisted Testing, Language Tests, English, Scores
Peer reviewed
Direct link
Young, John W.; Steinberg, Jonathan; Cline, Fred; Stone, Elizabeth; Martiniello, Maria; Ling, Guangming; Cho, Yeonsuk – Educational Assessment, 2010
To date, assessment validity research on non-native English speaking students in the United States has focused exclusively on those who are presently English language learners (ELLs). However, little, if any, research has been conducted on two other sizable groups of language minority students: (a) bilingual or multilingual students who were…
Descriptors: Test Validity, English (Second Language), Multilingualism, Bilingualism
Peer reviewed
Direct link
Liu, Ou Lydia; Lee, Hee-Sun; Linn, Marcia C. – Educational Assessment, 2010
To improve student science achievement in the United States we need inquiry-based instruction that promotes coherent understanding and assessments that are aligned with the instruction. Instead, current textbooks often offer fragmented ideas and most assessments only tap recall of details. In this study we implemented 10 inquiry-based science…
Descriptors: Inquiry, Active Learning, Science Achievement, Science Instruction