Showing 1 to 15 of 36 results
Peer reviewed
Keele, Luke; Lenard, Matthew; Page, Lindsay – Journal of Research on Educational Effectiveness, 2024
In education settings, treatments are often non-randomly assigned to clusters, such as schools or classrooms, while outcomes are measured for students. This research design is called the clustered observational study (COS). We examine the consequences of common support violations in the COS context. Common support violations occur when the…
Descriptors: Intervention, Cluster Grouping, Observation, Catholic Schools
Peer reviewed
Li, Wei; Dong, Nianbo; Maynard, Rebecca; Spybrook, Jessaca; Kelcey, Ben – Journal of Research on Educational Effectiveness, 2023
Cluster randomized trials (CRTs) are commonly used to evaluate educational interventions, particularly their effectiveness. Recently there has been greater emphasis on using these trials to explore cost-effectiveness. However, methods for establishing the power of cluster randomized cost-effectiveness trials (CRCETs) are limited. This study…
Descriptors: Research Design, Statistical Analysis, Randomized Controlled Trials, Cost Effectiveness
Peer reviewed
Shen, Zuchao; Kelcey, Benjamin – Journal of Research on Educational Effectiveness, 2022
Optimal sampling frameworks attempt to identify the most efficient sampling plans to achieve an adequate statistical power. Although such calculations are theoretical in nature, they are critical to the judicious and wise use of funding because they serve as important starting points that guide practical discussions around sampling tradeoffs and…
Descriptors: Sampling, Research Design, Randomized Controlled Trials, Statistical Analysis
Peer reviewed
Deke, John; Wei, Thomas; Kautz, Tim – Journal of Research on Educational Effectiveness, 2021
Evaluators of education interventions are increasingly designing studies to detect impacts much smaller than the 0.20 standard deviations that Cohen characterized as "small." While the need to detect smaller impacts is based on compelling arguments that such impacts are substantively meaningful, the drive to detect smaller impacts may…
Descriptors: Intervention, Program Evaluation, Sample Size, Randomized Controlled Trials
Peer reviewed
Stallasch, Sophie E.; Lüdtke, Oliver; Artelt, Cordula; Brunner, Martin – Journal of Research on Educational Effectiveness, 2021
To plan cluster-randomized trials with sufficient statistical power to detect intervention effects on student achievement, researchers need multilevel design parameters, including measures of between-classroom and between-school differences and the amounts of variance explained by covariates at the student, classroom, and school level. Previous…
Descriptors: Foreign Countries, Randomized Controlled Trials, Intervention, Educational Research
Peer reviewed
Xu, Menglin; Logan, Jessica A. R. – Journal of Research on Educational Effectiveness, 2021
Planned missing data designs allow researchers to have highly-powered studies by testing only a fraction of the traditional sample size. In two-method measurement planned missingness designs, researchers assess only part of the sample on a high-quality expensive measure, while the entire sample is given a more inexpensive, but biased measure. The…
Descriptors: Longitudinal Studies, Research Design, Research Problems, Structural Equation Models
Peer reviewed
Chan, Wendy; Oh, Jimin; Luo, Peihao – Journal of Research on Educational Effectiveness, 2021
Findings from experimental studies have increasingly been used to inform policy in school settings. Thus far, the populations in many of these studies are typically defined in a cross-sectional context; namely, the populations are defined in the same academic year in which the study took place or the population is defined at a fixed time point.…
Descriptors: Generalization, Research Design, Demography, Case Studies
Peer reviewed
Westine, Carl D.; Unlu, Fatih; Taylor, Joseph; Spybrook, Jessaca; Zhang, Qi; Anderson, Brent – Journal of Research on Educational Effectiveness, 2020
Experimental research in education and training programs typically involves administering treatment to whole groups of individuals. As such, researchers rely on the estimation of design parameter values to conduct power analyses to efficiently plan their studies to detect desired effects. In this study, we present design parameter estimates from a…
Descriptors: Outcome Measures, Science Education, Mathematics Education, Intervention
Peer reviewed
Kowalski, Susan M.; Taylor, Joseph A.; Askinas, Karen M.; Wang, Qian; Zhang, Qi; Maddix, William P.; Tipton, Elizabeth – Journal of Research on Educational Effectiveness, 2020
Developing and maintaining a high-quality science teaching corps has become increasingly urgent with standards that require students to move beyond mastering facts to reasoning and arguing from evidence. "Effective" professional development (PD) for science teachers enhances teacher outcomes and, in turn, enhances primary and secondary…
Descriptors: Effect Size, Faculty Development, Science Teachers, Program Effectiveness
Peer reviewed
Bloom, Howard; Bell, Andrew; Reiman, Kayla – Journal of Research on Educational Effectiveness, 2020
This article assesses the likely generalizability of educational treatment-effect estimates from regression discontinuity designs (RDDs) when treatment assignment is based on academic pretest scores. Our assessment uses data on outcome and pretest measures from six educational experiments, ranging from preschool through high school, to estimate…
Descriptors: Data Use, Randomized Controlled Trials, Research Design, Regression (Statistics)
Peer reviewed
Kelcey, Ben; Spybrook, Jessaca; Dong, Nianbo; Bai, Fangxing – Journal of Research on Educational Effectiveness, 2020
Professional development for teachers is regarded as one of the principal pathways through which we can understand and cultivate effective teaching and improve student outcomes. A critical component of studies that seek to improve teaching through professional development is the detailed assessment of the intermediate teacher development processes…
Descriptors: Faculty Development, Educational Research, Randomized Controlled Trials, Research Design
Peer reviewed
Wolf, Rebecca; Morrison, Jennifer; Inns, Amanda; Slavin, Robert; Risman, Kelsey – Journal of Research on Educational Effectiveness, 2020
Rigorous evidence of program effectiveness has become increasingly important with the 2015 passage of the Every Student Succeeds Act (ESSA). One question that has not yet been fully explored is whether program evaluations carried out or commissioned by developers produce larger effect sizes than evaluations conducted by independent third parties.…
Descriptors: Program Evaluation, Program Effectiveness, Effect Size, Sample Size
Peer reviewed
Gersten, Russell; Haymond, Kelly; Newman-Gonchar, Rebecca; Dimino, Joseph; Jayanthi, Madhavi – Journal of Research on Educational Effectiveness, 2020
This meta-analysis systematically reviewed the most up-to-date literature to determine the effectiveness of reading interventions on measures of word and pseudoword reading, reading comprehension, and passage fluency, and to determine the role intervention and study variables play in moderating the impacts for students at risk for reading…
Descriptors: Intervention, Reading Instruction, Elementary School Students, Reading Comprehension
Peer reviewed
Spybrook, Jessaca; Anderson, Dustin; Maynard, Rebecca – Journal of Research on Educational Effectiveness, 2019
The Society for Research on Educational Effectiveness, with funding from the Institute of Education Sciences (IES), has been working to develop and implement a registry for education studies, the Registry of Efficacy and Effectiveness Studies (REES) (https://www.sree.org/pages/registry.php). REES aims to increase transparency, rigor, and…
Descriptors: Educational Research, Databases, Research Design, Database Design
Peer reviewed
Hedges, Larry V. – Journal of Research on Educational Effectiveness, 2018
The scientific rigor of education research has improved dramatically since the year 2000. Much of the credit for this improvement is deserved by Institute of Education Sciences (IES) policies that helped create a demand for rigorous research; increased human capital capacity to carry out such work; provided funding for the work itself; and…
Descriptors: Educational Research, Generalization, Intervention, Human Capital