Showing 1 to 15 of 203 results
Herman, Joan L.; La Torre, Deborah; Epstein, Scott; Wang, Jia – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2016
This report presents the results of expert panels' item-by-item analysis of the 2015 PISA Reading Literacy and Mathematics Literacy assessments and compares study findings on PISA's representation of deeper learning with those of other related studies. Results indicate that about 11% to 14% of PISA's total raw score value for reading and…
Descriptors: Achievement Tests, International Assessment, Foreign Countries, Secondary School Students
Chung, Gregory K. W. K.; Delacruz, Girlie C.; Dionne, Gary B.; Baker, Eva L.; Lee, John J.; Osmundson, Ellen – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2016
This report addresses a renewed interest in individualized instruction, driven in part by advances in technology and assessment as well as a persistent desire to increase the access, efficiency, and cost-effectiveness of training and education. Using computer-based instruction, we delivered extremely efficient instruction targeted to low knowledge…
Descriptors: Grade 6, Algebra, Grade 8, Individualized Instruction
Herman, Joan; Epstein, Scott; Leon, Seth; Matrundola, Deborah La Torre; Reber, Sarah; Choi, Kilchan – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2015
Kentucky has been a leader in the movement toward more rigorous college and career ready standards to support its students' success in the 21st century. Kentucky was the first state to adopt new college and career ready standards (CCRS), termed the Kentucky Core Academic Standards. Many of Kentucky's districts have moved proactively and…
Descriptors: Academic Standards, State Standards, College Readiness, Career Readiness
Herman, Joan L.; Matrundola, Deborah La Torre; Epstein, Scott; Leon, Seth; Dai, Yunyun; Reber, Sarah; Choi, Kilchan – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2015
With support from the Bill and Melinda Gates Foundation, researchers and experts in mathematics education developed the Mathematics Design Collaborative (MDC) as a strategy to support the transition to Common Core State Standards in math. MDC provides short formative assessment lessons known as Classroom Challenges for use in middle and high…
Descriptors: Grade 9, Algebra, Mathematics Instruction, Secondary School Mathematics
Herman, Joan L.; La Torre Matrundola, Deborah; Wang, Jia – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2015
This study examines the extent to which deeper learning is expected to be present in the new college and career ready (CCR) standards. This is done by examining the distribution of items and tasks at high levels of cognitive demand (DOK3 and DOK4) in the summative test blueprints developed by the Partnership for Assessment of Readiness for College…
Descriptors: Summative Evaluation, Raw Scores, Problem Solving, Thinking Skills
Herman, Joan L.; Epstein, Scott; Leon, Seth; Dai, Yunyun; La Torre Matrundola, Deborah; Reber, Sarah; Choi, Kilchan – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2015
The Bill and Melinda Gates Foundation invested in the Literacy Design Collaborative (LDC) as one strategy to support teachers' and students' transition to the Common Core State Standards (CCSS) in English language arts. This report provides an early look at the implementation of LDC in sixth-grade Advanced Reading classes in a large Florida…
Descriptors: Instructional Design, Common Core State Standards, Language Arts, Grade 6
Falk, Carl F.; Cai, Li – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2015
We present a logistic function of a monotonic polynomial with a lower asymptote, allowing additional flexibility beyond the three-parameter logistic model. We develop a maximum marginal likelihood based approach to estimate the item parameters. The new item response model is demonstrated on math assessment data from a state, and a computationally…
Descriptors: Guessing (Tests), Item Response Theory, Mathematics Instruction, Mathematics Tests
Levy, Roy – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2014
Digital games offer an appealing environment for assessing student proficiencies, including skills and misconceptions in a diagnostic setting. This paper proposes a dynamic Bayesian network modeling approach for observations of student performance from an educational video game. A Bayesian approach to model construction, calibration, and use in…
Descriptors: Video Games, Educational Games, Bayesian Statistics, Observation
Wang, Jia; Schweig, Jonathan D.; Herman, Joan L. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2014
Magnet schools are one of the largest sectors of choice schools in the United States. In this study, we explored whether there is heterogeneity in magnet school effects on student achievement by examining the effectiveness of 24 recently funded magnet schools in 5 school districts across 4 states. We used a two-step analysis: First, separate…
Descriptors: Meta Analysis, Success, School Effectiveness, Magnet Schools
Kerr, Deirdre – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2014
Educational video games provide an opportunity for students to interact with and explore complex representations of academic content and allow for the examination of problem-solving strategies and mistakes that can be difficult to capture in more traditional environments. However, data from such games are notoriously difficult to analyze. This…
Descriptors: Identification, Misconceptions, Scoring Rubrics, Educational Games
Chung, Gregory K. W. K.; Choi, Kilchan; Baker, Eva L.; Cai, Li – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2014
A large-scale randomized controlled trial tested the effects of researcher-developed learning games on a transfer measure of fractions knowledge. The measure contained items similar to standardized assessments. Thirty treatment and 29 control classrooms (~1500 students, 9 districts, 26 schools) participated in the study. Students in treatment…
Descriptors: Video Games, Educational Games, Mathematics Instruction, Mathematics
Hansen, Mark; Cai, Li; Monroe, Scott; Li, Zhen – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2014
It is a well-known problem in testing the fit of models to multinomial data that the full underlying contingency table will inevitably be sparse for tests of reasonable length and for realistic sample sizes. Under such conditions, full-information test statistics such as Pearson's X[superscript 2] and the likelihood ratio statistic…
Descriptors: Goodness of Fit, Item Response Theory, Classification, Maximum Likelihood Statistics
Monroe, Scott; Cai, Li; Choi, Kilchan – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2014
This research concerns a new proposal for calculating student growth percentiles (SGP; Betebenner, 2009). In Betebenner (2009), quantile regression (QR) is used to estimate the SGPs. However, measurement error in the score estimates, which always exists in practice, leads to bias in the QR-based estimates (Shang, 2012). One way to address this…
Descriptors: Item Response Theory, Achievement Gains, Regression (Statistics), Error of Measurement
Cai, Li; Monroe, Scott – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2014
We propose a new limited-information goodness of fit test statistic C[subscript 2] for ordinal IRT models. The construction of the new statistic lies formally between the M[subscript 2] statistic of Maydeu-Olivares and Joe (2006), which utilizes first and second order marginal probabilities, and the M*[subscript 2] statistic of Cai and Hansen…
Descriptors: Item Response Theory, Models, Goodness of Fit, Probability
Kerr, Deirdre; Mousavi, Hamid; Iseli, Markus R. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2013
The Common Core assessments emphasize short essay constructed-response items over multiple-choice items because they are more precise measures of understanding. However, such items are too costly and time-consuming to be used in national assessments unless a way is found to score them automatically. Current automatic essay scoring techniques are…
Descriptors: Automation, Scoring, Essay Tests, Natural Language Processing