50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues its long tradition of innovation and enhancement.

Learn more about the history of ERIC here.

Showing all 15 results
Peer reviewed
Direct link
Smith, William C. – Practical Assessment, Research & Evaluation, 2014
The ability of regression discontinuity (RD) designs to provide an unbiased treatment effect while avoiding the ethical concerns that plague randomized controlled trials (RCTs) makes them a valuable and useful approach in education evaluation. RD is the only explicitly recognized quasi-experimental approach identified by the Institute of Education…
Descriptors: Computation, Regression (Statistics), Statistical Bias, Quasiexperimental Design
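The RD logic the abstract describes can be sketched in a few lines: units on either side of an assignment cutoff are fit separately, and the jump in the outcome at the cutoff estimates the treatment effect. A minimal simulation, not taken from the article; the variable names, bandwidth, and true effect are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
x = rng.uniform(-1, 1, n)            # running variable (e.g., a pretest score)
treated = (x >= 0).astype(float)     # sharp RD: deterministic cutoff assignment
tau = 0.4                            # assumed true treatment effect at the cutoff
y = 1.0 + 0.8 * x + tau * treated + rng.normal(0, 0.5, n)

# Fit separate linear trends within a bandwidth on each side of the
# cutoff; the difference in intercepts at x = 0 estimates the effect.
h = 0.5
left = (x < 0) & (x > -h)
right = (x >= 0) & (x < h)
slope_l, intercept_l = np.polyfit(x[left], y[left], 1)
slope_r, intercept_r = np.polyfit(x[right], y[right], 1)
tau_hat = intercept_r - intercept_l  # estimated jump at the cutoff
```

With a large simulated sample, `tau_hat` lands near the assumed effect of 0.4; real applications also require bandwidth selection and checks that units cannot manipulate the running variable.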
Peer reviewed
Direct link
Huang, Francis L. – Practical Assessment, Research & Evaluation, 2014
Clustered data (e.g., students within schools) are often analyzed in educational research where data are naturally nested. As a consequence, multilevel modeling (MLM) has commonly been used to study the contextual or group-level (e.g., school) effects on individual outcomes. The current study investigates the use of an alternative procedure to…
Descriptors: Hierarchical Linear Modeling, Regression (Statistics), Educational Research, Sampling
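One widely used alternative to MLM for clustered data of this kind is ordinary least squares with cluster-robust ("sandwich") standard errors. The sketch below simulates students nested in schools and compares naive and cluster-robust standard errors; the simulated data, the CR0 estimator, and all names are illustrative assumptions, not the article's own analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate students nested in schools: a school-level effect induces clustering.
n_schools, n_per = 20, 30
school = np.repeat(np.arange(n_schools), n_per)
u = rng.normal(0, 1.0, n_schools)[school]   # school-level random effect
x = rng.normal(size=school.size)            # student-level predictor
y = 2.0 + 0.5 * x + u + rng.normal(size=school.size)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS point estimates
resid = y - X @ beta

# Cluster-robust (CR0 sandwich) covariance: sum per-school score outer products.
XtX_inv = np.linalg.inv(X.T @ X)
meat = np.zeros((2, 2))
for g in range(n_schools):
    idx = school == g
    s = X[idx].T @ resid[idx]
    meat += np.outer(s, s)
se_cluster = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

# Naive OLS standard errors ignore the nesting and are too small here.
sigma2 = resid @ resid / (len(y) - 2)
se_naive = np.sqrt(np.diag(sigma2 * XtX_inv))
```

In this setup the cluster-robust intercept standard error is several times the naive one, illustrating why ignoring nesting understates uncertainty.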
Peer reviewed
Direct link
Adelson, Jill L. – Practical Assessment, Research & Evaluation, 2013
Often it is infeasible or unethical to use random assignment in educational settings to study important constructs and questions. Hence, educational research often uses observational data, such as large-scale secondary data sets and state and school district data, and quasi-experimental designs. One method of reducing selection bias in estimations…
Descriptors: Educational Research, Data, Statistical Bias, Probability
Peer reviewed
Direct link
McMillan, James H.; Venable, Jessica C.; Varier, Divya – Practical Assessment, Research & Evaluation, 2013
Kingston and Nash (2011) recently presented a meta-analysis of studies showing that the effect of formative assessment on K-12 student achievement may not be as robust as widely believed. This investigation analyzes the methodology used in the Kingston and Nash meta-analysis and provides further analyses of the studies it included. These…
Descriptors: Formative Evaluation, Academic Achievement, Elementary Secondary Education, Educational Research
Peer reviewed
Direct link
Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L. – Practical Assessment, Research & Evaluation, 2013
The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…
Descriptors: Factor Analysis, Educational Research, Best Practices, Sample Size
Peer reviewed
Direct link
Frey, Bruce B.; Schmitt, Vicki L.; Allen, Justin P. – Practical Assessment, Research & Evaluation, 2012
A commonly advocated best practice for classroom assessment is to make the assessments authentic. Authentic is often taken to mean mirroring real-world tasks or expectations. There is no consensus, however, on the actual definition of the term or the characteristics of an authentic classroom assessment. Sometimes, the realistic component…
Descriptors: Performance Based Assessment, Educational Research, Elementary Secondary Education, Preschool Children
Peer reviewed
Direct link
McMillan, James H.; Foley, Jennifer – Practical Assessment, Research & Evaluation, 2011
This study shows the extent to which effect size is reported and discussed in four major journals. A series of judgments about different aspects of effect size was made for 417 articles from four journals. Results suggest that while the reporting of simple effect size indices is more prevalent, substantive discussions of the meaning of…
Descriptors: Effect Size, Journal Articles, Periodicals, Educational Research
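For context on the indices being counted, the most commonly reported simple effect size is the standardized mean difference; Cohen's d with a pooled standard deviation can be computed as follows (a generic illustration, not code from the study):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: standardized mean difference with pooled SD."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((v - m1) ** 2 for v in group1) / (n1 - 1)  # sample variances
    v2 = sum((v - m2) ** 2 for v in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

d = cohens_d([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])  # ≈ -0.632
```

Reporting such an index is the easy part; the study's point is that interpreting what a given d means substantively is what journals tend to omit.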
Peer reviewed
Direct link
Konstantopoulos, Spyros – Practical Assessment, Research & Evaluation, 2009
Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…
Descriptors: Social Science Research, Effect Size, Computation, Tables (Data)
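A simple way to see why nested designs need their own power computations is the design effect, which inflates the variance of a mean by 1 + (m − 1) × ICC for clusters of size m. The sketch below uses a normal approximation to compare power with and without clustering; this is a back-of-the-envelope approximation under assumed numbers, not the article's multilevel method:

```python
import math
from statistics import NormalDist

def power_cluster(delta, n_clusters, cluster_size, icc, alpha=0.05):
    """Approximate power for a two-arm cluster-randomized comparison.

    delta: standardized effect size (Cohen's d); icc: intraclass correlation.
    The design effect 1 + (m - 1) * icc inflates the variance of the
    difference in means (normal approximation, equal allocation).
    """
    deff = 1 + (cluster_size - 1) * icc
    n_per_arm = n_clusters * cluster_size / 2        # individuals per arm
    se = math.sqrt(2 * deff / n_per_arm)             # SE of standardized diff
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    return 1 - NormalDist().cdf(z_alpha - delta / se)

# Assuming simple random sampling (icc = 0) badly overstates power
# relative to the same sample drawn as intact school classes.
p_srs = power_cluster(0.3, n_clusters=40, cluster_size=25, icc=0.0)
p_clustered = power_cluster(0.3, n_clusters=40, cluster_size=25, icc=0.2)
```

With these assumed inputs, power drops from near certainty to roughly a coin flip once a modest ICC is acknowledged, which is exactly why Cohen-style tables built for simple random samples mislead in nested designs.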
Peer reviewed
Direct link
DiStefano, Christine; Zhu, Min; Mindrila, Diana – Practical Assessment, Research & Evaluation, 2009
Following an exploratory factor analysis, factor scores may be computed and used in subsequent analyses. Factor scores are composite variables which provide information about an individual's placement on the factor(s). This article discusses popular methods to create factor scores under two different classes: refined and non-refined. Strengths and…
Descriptors: Factor Structure, Factor Analysis, Researchers, Scores
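The two classes the abstract names can be illustrated concretely: a non-refined score is an unweighted sum of the items loading on a factor, while a refined score (e.g., the regression/Thurstone method) weights items by the loadings and the item correlations. A minimal one-factor simulation; the loadings and data are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
f = rng.normal(size=n)                    # latent factor
load = np.array([0.8, 0.7, 0.6, 0.5])     # assumed standardized loadings
X = np.outer(f, load) + rng.normal(size=(n, 4)) * np.sqrt(1 - load**2)

Z = (X - X.mean(0)) / X.std(0)            # standardize the indicators

# Non-refined score: unweighted sum of the standardized items.
sum_score = Z.sum(axis=1)

# Refined (regression/Thurstone) score: Z @ R^{-1} @ loadings.
R = np.corrcoef(Z, rowvar=False)
reg_score = Z @ np.linalg.inv(R) @ load

# Both track the true factor; the regression score weights
# higher-loading items more heavily.
r_sum = np.corrcoef(sum_score, f)[0, 1]
r_reg = np.corrcoef(reg_score, f)[0, 1]
```

In practice the loadings would come from the exploratory factor analysis itself; the trade-off the article discusses is that refined scores are more precise while non-refined sums are simpler and more stable across samples.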
Peer reviewed
Download full text (PDF on ERIC)
Rudner, Lawrence M., Ed.; Schafer, William D., Ed. – Practical Assessment, Research & Evaluation, 2001
This document consists of papers published in the electronic journal "Practical Assessment, Research & Evaluation" during 2000-2001: (1) "Advantages of Hierarchical Linear Modeling" (Jason W. Osborne); (2) "Prediction in Multiple Regression" (Jason W. Osborne); (3) "Scoring Rubrics: What, When, and How?" (Barbara M. Moskal); (4) "Organizational…
Descriptors: Educational Assessment, Educational Research, Elementary Secondary Education, Evaluation Methods
Peer reviewed
Schafer, William D. – Practical Assessment, Research & Evaluation, 2001
Suggests the routine use of replications in field studies, pointing out that it is usually possible to synthesize replications quantitatively using meta-analysis. Makes the case that this is especially attractive for investigators whose research paradigm choices are limited in the field environment. (SLD)
Descriptors: Educational Research, Field Studies, Meta Analysis, Research Methodology
Peer reviewed
MacColl, Gail S.; White, Kathleen D. – Practical Assessment, Research & Evaluation, 1999
Describes some of the problems in communicating educational research findings to a general audience and provides helpful information on how researchers can best present data on educational practices that work and those that don't. The needs of the audience should be the primary focus in such reports. (SLD)
Descriptors: Audience Awareness, Communication (Thought Transfer), Educational Research, Information Dissemination
Rudner, Lawrence M., Ed.; Schafer, William D., Ed. – Practical Assessment, Research and Evaluation, 2001
This document consists of articles 23 through 26 published in the electronic journal "Practical Assessment, Research & Evaluation" in 2001: (23) "Effects of Removing the Time Limit on First and Second Language Intelligence Test Performance" (Jennifer Mullane and Stuart J. McKelvie); (24) "Consequences of (Mis)use of the Texas Assessment of…
Descriptors: Educational Research, Elementary Secondary Education, Essays, High Stakes Tests
Rudner, Lawrence M., Ed.; Schafer, William D., Ed. – Practical Assessment, Research and Evaluation, 2000
This document consists of articles 1 through 14 of volume 6 of "Practical Assessment, Research & Evaluation": (1) "Seven Myths about Literacy in the United States" (Jeff McQuillan); (2) "Implementing Performance Assessment in the Classroom" (Amy Brualdi); (3) "Some Evaluation Questions" (William Shadish); (4) "Item Banking" (Lawrence Rudner); (5)…
Descriptors: Educational Research, Elementary Secondary Education, Evaluation, Item Banks
Rudner, Lawrence M., Ed.; Schafer, William D., Ed. – Practical Assessment, Research and Evaluation, 2000
This document consists of the first 10 articles of volume 8 of the electronic journal "Practical Assessment, Research & Evaluation" published in 2002-2003: (1) "Using Electronic Surveys: Advice from Survey Professionals" (David M. Shannon, Todd E. Johnson, Shelby Searcy, and Alan Lott); (2) "Four Assumptions of Multiple Regression That Researchers…
Descriptors: Bilingual Education, Bilingual Students, Cheating, Educational Research