50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues its long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC (PDF).

Showing 1 to 15 of 20 results
Peer reviewed
Willoughby, Michael; Holochwost, Steven J.; Blanton, Zane E.; Blair, Clancy B. – Measurement: Interdisciplinary Research and Perspectives, 2014
The primary objective of this article was to critically evaluate the routine use of confirmatory factor analysis (CFA) for representing an individual's performance across a battery of executive function tasks. A conceptual review and statistical reanalysis of N = 10 studies that used CFA methods of EF tasks was undertaken. Despite evidence of…
Descriptors: Executive Function, Cognitive Measurement, Factor Analysis, Statistical Analysis
Peer reviewed
Eid, Michael; Koch, Tobias – Measurement: Interdisciplinary Research and Perspectives, 2014
Higher-order factor analysis is a widely used approach for analyzing the structure of a multidimensional test. Whenever first-order factors are correlated, researchers are tempted to apply a higher-order factor model. But is this reasonable? What do the higher-order factors measure? What is their meaning? Willoughby, Holochwost, Blanton, and Blair…
Descriptors: Factor Analysis, Measurement, Theories, Executive Function
Peer reviewed
Wiebe, Sandra A.; McFall, G. Peggy – Measurement: Interdisciplinary Research and Perspectives, 2014
Since Miyake and his colleagues (2000) published their seminal paper on the use of confirmatory factor analysis (CFA) to parse executive function (EF), CFA methods have become ubiquitous in EF research. In their interesting and thoughtful Focus article, "Executive Function: Formative Versus Reflective Measurement," Willoughby and…
Descriptors: Executive Function, Cognitive Measurement, Factor Analysis, Individual Development
Peer reviewed
Peterson, Eric; Welsh, Marilyn C. – Measurement: Interdisciplinary Research and Perspectives, 2014
Research into executive functioning (EF) has indeed grown exponentially across the past few decades, but as the Willoughby et al. critique makes clear, there remain fundamental questions to be resolved. The crux of their argument is built upon an examination of the confirmatory factor analysis (CFA) approach to understanding executive processes.…
Descriptors: Executive Function, Measurement, Factor Analysis, Reliability
Peer reviewed
Edwards, Michael C. – Measurement: Interdisciplinary Research and Perspectives, 2013
This author has had the privilege of knowing Professor Maydeu-Olivares for almost a decade, and although their paths crossed only occasionally, such instances were always enjoyable and enlightening. Edwards states that Maydeu-Olivares' target article for this issue ("Goodness-of-Fit Assessment of Item Response Theory Models") provides…
Descriptors: Goodness of Fit, Item Response Theory, Models, Factor Analysis
Peer reviewed
Maydeu-Olivares, Alberto – Measurement: Interdisciplinary Research and Perspectives, 2013
The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…
Descriptors: Goodness of Fit, Item Response Theory, Models, Maximum Likelihood Statistics
Peer reviewed
Haig, Brian D. – Measurement: Interdisciplinary Research and Perspectives, 2012
Lee Cronbach once expressed the view that all roads lead to construct validity. In looking to clarify the consensus definition of validity, and its place in assessment, Newton is also led to the troublesome idea of construct validity. To be sure, he addresses other validity issues, but in this commentary, I will restrict my attention to construct…
Descriptors: Validity, Educational Assessment, Construct Validity, Definitions
Peer reviewed
Raykov, Tenko – Measurement: Interdisciplinary Research and Perspectives, 2010
Mroch, Suh, Kane, & Ripkey (2009); Suh, Mroch, Kane, & Ripkey (2009); and Kane, Mroch, Suh, & Ripkey (2009) provided elucidating discussions on critical properties of linear equating methods under the nonequivalent groups with anchor test (NEAT) design. In this popular equating design, two test forms are administered to different groups of…
Descriptors: Equated Scores, Test Items, Factor Analysis, Models
Peer reviewed
Henson, Robert A. – Measurement: Interdisciplinary Research and Perspectives, 2009
The paper by Drs. Rupp and Templin provides a much needed step toward the general application of diagnostic classification models (DCMs). The authors have provided a summary of many of the concepts that one must consider to properly apply a DCM (which ranges from model selection and estimation, to assessing the appropriateness of the model using…
Descriptors: Classification, Psychometrics, Evaluation, Models
Peer reviewed
Maris, Gunter; Bechger, Timo – Measurement: Interdisciplinary Research and Perspectives, 2009
Rupp and Templin (2008) do a good job at describing the ever-expanding landscape of diagnostic classification models (DCMs). In many ways, their review article clearly points to some of the questions that need to be answered before DCMs can become part of the psychometric practitioner's toolkit. Apart from the issues mentioned in this article that…
Descriptors: Factor Analysis, Classification, Psychometrics, Item Response Theory
Peer reviewed
Widaman, Keith F.; Grimm, Kevin J. – Measurement: Interdisciplinary Research and Perspectives, 2009
Nesselroade, Gerstorf, Hardy, and Ram developed a new and interesting way to enforce invariance at the second-order level in P-technique models, while allowing first-order structure to stray from invariance. We discuss our concerns with this approach under the headings of falsifiability, the nature of manifest variables included in models, and…
Descriptors: Factor Structure, Models, Factor Analysis, Comparative Analysis
Peer reviewed
Nesselroade, John R.; Ram, Nilam; Gerstorf, Denis; Hardy, Sam A. – Measurement: Interdisciplinary Research and Perspectives, 2009
This article presents the authors' response which consists of three main parts. The first involves recapping the general thrust of their focus article. The second part consists of some general points that they hope will clarify issues raised by the commentators that were not made as clearly as they should have been in the focus article. The third…
Descriptors: Structural Equation Models, Evaluation, Factor Analysis, Statistics
Peer reviewed
Rupp, Andre A.; Templin, Jonathan L. – Measurement: Interdisciplinary Research and Perspectives, 2008
"Diagnostic classification models" (DCM) are frequently promoted by psychometricians as important modelling alternatives for analyzing response data in situations where multivariate classifications of respondents are made on the basis of multiple postulated latent skills. In this review paper, a definitional boundary of the space of DCM is…
Descriptors: Factor Analysis, Classification, Item Response Theory, Psychometrics
Peer reviewed
Schilling, Stephen G. – Measurement: Interdisciplinary Research and Perspectives, 2007
In this paper the author examines the role of item response theory (IRT), particularly multidimensional item response theory (MIRT) in test validation from a validity argument perspective. The author provides justification for several structural assumptions and interpretations, taking care to describe the role he believes they should play in any…
Descriptors: Mathematics Education, Test Validity, Factor Analysis, Psychometrics
Peer reviewed
Nesselroade, John R.; Gerstorf, Denis; Hardy, Sam A.; Ram, Nilam – Measurement: Interdisciplinary Research and Perspectives, 2007
Ideally, the unit of analysis in psychology is the individual. However, many psychological methods do not cope well, either at the level of construct definition or at the level of measurement, with individuality in behavior. There is little leeway for constructs to be both idiosyncratically tailored to the individual, and still identified as…
Descriptors: Psychology, Behavior, Factor Analysis, Individual Differences
Previous Page | Next Page »
Pages: 1 | 2