50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues a long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC here (PDF).

Showing 1 to 15 of 531 results
Peer reviewed
Direct link
Attali, Yigal – Educational and Psychological Measurement, 2014
This article presents a comparative judgment approach for holistically scored constructed response tasks. In this approach, the grader rank orders (rather than rates) the quality of a small set of responses. A prior automated evaluation of responses guides both set formation and scaling of rankings. Sets are formed to have similar prior scores and…
Descriptors: Responses, Item Response Theory, Scores, Rating Scales
Peer reviewed
Direct link
Plieninger, Hansjörg; Meiser, Thorsten – Educational and Psychological Measurement, 2014
Response styles, the tendency to respond to Likert-type items irrespective of content, are a widely known threat to the reliability and validity of self-report measures. However, it is still debated how to measure and control for response styles such as extreme responding. Recently, multiprocess item response theory models have been proposed that…
Descriptors: Validity, Item Response Theory, Rating Scales, Models
Peer reviewed
Direct link
Wiley, Edward W.; Shavelson, Richard J.; Kurpius, Amy A. – Educational and Psychological Measurement, 2014
The name "SAT" has become synonymous with college admissions testing; it has been dubbed "the gold standard." Numerous studies on its reliability and predictive validity show that the SAT predicts college performance beyond high school grade point average. Surprisingly, studies of the factorial structure of the current version…
Descriptors: College Readiness, College Admission, College Entrance Examinations, Factor Analysis
Peer reviewed
Direct link
Rutkowski, Leslie; Svetina, Dubravka – Educational and Psychological Measurement, 2014
In the field of international educational surveys, equivalence of achievement scale scores across countries has received substantial attention in the academic literature; however, only a relatively recent emphasis on scale score equivalence in nonachievement education surveys has emerged. Given the current state of research in multiple-group…
Descriptors: International Programs, Educational Assessment, Surveys, Measurement
Peer reviewed
Direct link
Mylonas, Kostas; Furnham, Adrian; Divale, William; Leblebici, Cigdem; Gondim, Sonia; Moniz, Angela; Grad, Hector; Alvaro, Jose Luis; Cretu, Romeo Zeno; Filus, Ania; Boski, Pawel – Educational and Psychological Measurement, 2014
Several sources of bias can plague research data and individual assessment. When cultural groups are considered, across or even within countries, it is essential that the constructs assessed and evaluated are as free as possible from any source of bias, and specifically from bias caused by culturally specific characteristics. Employing the…
Descriptors: Test Bias, Measures (Individuals), Unemployment, Adults
Peer reviewed
Direct link
McArdle, John J.; Hamagami, Fumiaki; Bautista, Randy; Onoye, Jane; Hishinuma, Earl S.; Prescott, Carol A.; Takeshita, Junji; Zonderman, Alan B.; Johnson, Ronald C. – Educational and Psychological Measurement, 2014
In this study, we reanalyzed the classic Hawai'i Family Study of Cognition (HFSC) data using contemporary multilevel modeling techniques. We used the HFSC baseline data ("N" = 6,579) and reexamined the factorial structure of 16 cognitive variables using confirmatory (restricted) measurement models in an explicit sequence. These…
Descriptors: Factor Analysis, Hierarchical Linear Modeling, Data Analysis, Structural Equation Models
Peer reviewed
Direct link
von Eye, Alexander; Wiedermann, Wolfgang – Educational and Psychological Measurement, 2014
Approaches to determining direction of dependence in nonexperimental data are based on the relation between moments higher than second order on one side and correlation and regression models on the other. These approaches have experienced rapid development and are being applied in contexts such as research on partner violence, attention deficit…
Descriptors: Statistical Analysis, Factor Analysis, Structural Equation Models, Correlation
Peer reviewed
Direct link
Hayduk, Leslie – Educational and Psychological Measurement, 2014
Researchers using factor analysis tend to dismiss the significant ill fit of factor models by presuming that if their factor model is close-to-fitting, it is probably close to being properly causally specified. Close fit may indeed result from a model being close to properly causally specified, but close-fitting factor models can also be seriously…
Descriptors: Factor Analysis, Goodness of Fit, Factor Structure, Structural Equation Models
Peer reviewed
Direct link
Jin, Ying; Myers, Nicholas D.; Ahn, Soyeon; Penfield, Randall D. – Educational and Psychological Measurement, 2013
The Rasch model, a member of a larger group of models within item response theory, is widely used in empirical studies. Detection of uniform differential item functioning (DIF) within the Rasch model typically employs null hypothesis testing with a concomitant consideration of effect size (e.g., signed area [SA]). Parametric equivalence between…
Descriptors: Test Bias, Effect Size, Item Response Theory, Comparative Analysis
Peer reviewed
Direct link
Raykov, Tenko; Marcoulides, George A.; Millsap, Roger E. – Educational and Psychological Measurement, 2013
A multiple testing method for examining factorial invariance for latent constructs evaluated by multiple indicators in distinct populations is outlined. The procedure is based on the false discovery rate concept and multiple individual restriction tests and resolves general limitations of a popular factorial invariance testing approach. The…
Descriptors: Testing, Statistical Analysis, Factor Analysis, Statistical Significance
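[The abstract above describes a multiple-testing procedure built on the false discovery rate (FDR). The canonical FDR control is the Benjamini–Hochberg step-up procedure; the sketch below illustrates that generic procedure only, not the authors' specific restriction-test method, and the example p-values are invented.]

```python
def benjamini_hochberg(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: return the indices of
    hypotheses rejected while controlling the false discovery rate at
    level q. A generic FDR illustration, not the exact factorial
    invariance procedure proposed in the article."""
    m = len(p_values)
    # Sort p-values ascending, remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    cutoff = 0
    for rank, i in enumerate(order, start=1):
        # Compare each p-value to its step-up threshold rank*q/m.
        if p_values[i] <= rank * q / m:
            cutoff = rank
    # Reject every hypothesis up to the largest rank passing its threshold.
    return sorted(order[:cutoff])

# Example: six hypothetical invariance restriction tests.
print(benjamini_hochberg([0.001, 0.21, 0.04, 0.012, 0.65, 0.03]))
```
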
Peer reviewed
Direct link
Harrell-Williams, Leigh M.; Wolfe, Edward W. – Educational and Psychological Measurement, 2013
Most research on confirmatory factor analysis using information-based fit indices (Akaike information criterion [AIC], Bayesian information criteria [BIC], bias-corrected AIC [AICc], and consistent AIC [CAIC]) has used a structural equation modeling framework. Minimal research has been done concerning application of these indices to item response…
Descriptors: Correlation, Goodness of Fit, Test Length, Item Response Theory
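[The four information-based fit indices named in the abstract above have standard textbook definitions; a minimal sketch computing them from a model's maximized log-likelihood follows. The function name and example numbers are illustrative.]

```python
import math

def information_criteria(log_lik, k, n):
    """Information-based fit indices from the maximized log-likelihood
    (log_lik), number of free parameters (k), and sample size (n).
    Standard formulas; lower values indicate better penalized fit."""
    aic = -2 * log_lik + 2 * k                    # Akaike information criterion
    bic = -2 * log_lik + k * math.log(n)          # Bayesian information criterion
    aicc = aic + (2 * k * (k + 1)) / (n - k - 1)  # bias-corrected (small-sample) AIC
    caic = -2 * log_lik + k * (math.log(n) + 1)   # consistent AIC
    return {"AIC": aic, "BIC": bic, "AICc": aicc, "CAIC": caic}

# Hypothetical model: log-likelihood -512.3, 10 parameters, n = 300.
print(information_criteria(log_lik=-512.3, k=10, n=300))
```
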
Peer reviewed
Direct link
Wolf, Erika J.; Harrington, Kelly M.; Clark, Shaunna L.; Miller, Mark W. – Educational and Psychological Measurement, 2013
Determining sample size requirements for structural equation modeling (SEM) is a challenge often faced by investigators, peer reviewers, and grant writers. Recent years have seen a large increase in SEMs in the behavioral science literature, but consideration of sample size requirements for applied SEMs often relies on outdated rules-of-thumb.…
Descriptors: Sample Size, Structural Equation Models, Statistical Analysis, Statistical Bias
Peer reviewed
Direct link
Liu, Yan; Zumbo, Bruno D.; Wu, Amery D. – Educational and Psychological Measurement, 2012
Previous studies have rarely examined the impact of outliers on the decisions about the number of factors to extract in an exploratory factor analysis. The few studies that have investigated this issue have arrived at contradictory conclusions regarding whether outliers inflated or deflated the number of factors extracted. By systematically…
Descriptors: Factor Analysis, Data, Simulation, Monte Carlo Methods
Peer reviewed
Direct link
Dunn, Karee E.; Lo, Wen-Juo; Mulvenon, Sean W.; Sutcliffe, Rachel – Educational and Psychological Measurement, 2012
The Motivated Strategies for Learning Questionnaire (MSLQ) has dominated self-regulated learning research since the early 1990s. In this study, the two MSLQ subscales specifically designed to assess self-regulation--Metacognitive Self-Regulation subscale and Effort Regulation subscale--were examined. Results indicated that the structure of the two…
Descriptors: Questionnaires, Self Control, Learning Strategies, Metacognition
Peer reviewed
Direct link
Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng – Educational and Psychological Measurement, 2012
Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…
Descriptors: Reliability, Factor Analysis, Psychometrics, Item Response Theory
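[The abstract above contrasts unifactor-model reliability coefficients: an unweighted sum score-based omega and maximal reliability. Their textbook formulas can be sketched as below; the function and variable names are illustrative, and the example loadings are invented.]

```python
def omega_and_maximal_reliability(loadings, uniquenesses):
    """Unifactor-model reliability from standardized factor loadings
    and unique (error) variances. omega is the reliability of the
    unweighted sum score: (sum of loadings)^2 divided by that quantity
    plus the summed unique variances. Maximal reliability is the
    reliability of the optimally weighted composite."""
    s = sum(loadings)
    omega = s * s / (s * s + sum(uniquenesses))
    r = sum(l * l / u for l, u in zip(loadings, uniquenesses))
    rho_max = r / (1 + r)
    return omega, rho_max

# Example: four congeneric indicators with standardized loadings.
lam = [0.8, 0.7, 0.6, 0.5]
theta = [1 - l * l for l in lam]  # unique variances for standardized items
omega, rho = omega_and_maximal_reliability(lam, theta)
print(round(omega, 3), round(rho, 3))
```

Maximal reliability is never lower than omega, since the optimally weighted composite can only improve on equal weighting.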