Showing all 15 results
Peer reviewed
Dimitrov, Dimiter M.; Atanasov, Dimitar V. – Educational and Psychological Measurement, 2021
This study presents a latent (item response theory-like) framework for a recently developed classical approach to test scoring, equating, and item analysis, referred to as the "D"-scoring method. Specifically, (a) person and item parameters are estimated under an item response function model on the "D"-scale (from 0 to 1) using…
Descriptors: Scoring, Equated Scores, Item Analysis, Item Response Theory
Peer reviewed
Dimitrov, Dimiter M. – Educational and Psychological Measurement, 2020
This study presents new models for item response functions (IRFs) in the framework of the D-scoring method (DSM), which is gaining attention in the field of educational and psychological measurement and large-scale assessments. In a previous work on DSM, the IRFs of binary items were estimated using a logistic regression model (LRM). However, the LRM…
Descriptors: Item Response Theory, Scoring, True Scores, Scaling
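As background for the logistic-regression approach mentioned in the abstract above, here is a minimal sketch (not the cited study's implementation) of fitting a logistic item response function for a single binary item, using a simulated D-scale score as the predictor; the data, variable names, and parameter values are illustrative assumptions.

```python
# Minimal sketch: estimate one binary item's IRF with logistic regression.
# All data below are simulated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated D-scale scores on [0, 1] for 500 examinees (assumption).
d_scores = rng.uniform(0.0, 1.0, size=500)

# Simulated binary responses to one item, drawn from a logistic model
# with illustrative difficulty/discrimination values.
true_prob = 1.0 / (1.0 + np.exp(-8.0 * (d_scores - 0.5)))
responses = rng.binomial(1, true_prob)

# Fit the logistic regression; the fitted curve serves as the estimated IRF.
lrm = LogisticRegression()
lrm.fit(d_scores.reshape(-1, 1), responses)

# Estimated probability of a correct response across the D-scale.
grid = np.linspace(0.0, 1.0, 11).reshape(-1, 1)
irf = lrm.predict_proba(grid)[:, 1]
print(np.round(irf, 3))
```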
Peer reviewed
Schubert, Daniel S. P.; Fiske, Donald W. – Educational and Psychological Measurement, 1973
The conclusion of the present experiment was that prior response increases item consistency, but only when the prior experience involves the same item with the same instructions. (Authors)
Descriptors: Data Analysis, Experience, Item Analysis, Personality Measures
Peer reviewed
Lord, Frederic M. – Educational and Psychological Measurement, 1971
A number of empirical studies are suggested to answer certain questions in connection with flexilevel tests. (MS)
Descriptors: Comparative Analysis, Difficulty Level, Guessing (Tests), Item Analysis
Peer reviewed
Kohr, Richard L. – Educational and Psychological Measurement, 1971
Descriptors: Attitude Measures, Computer Programs, Item Analysis, Rating Scales
Peer reviewed
Feldman, David H.; Markwalder, Winston – Educational and Psychological Measurement, 1971
Descriptors: Cognitive Development, Cognitive Measurement, Developmental Psychology, Item Analysis
Peer reviewed
Clark, William H.; Margolis, Bruce L. – Educational and Psychological Measurement, 1971
Descriptors: Biographical Inventories, Data Analysis, Item Analysis, Scoring
Peer reviewed
Sax, Gilbert – Educational and Psychological Measurement, 1996
Using various Latin square and incomplete Latin square formats, the Fields test formats provide a novel way of presenting tests to students on machine-scorable answer sheets that can be item analyzed. Items can be constructed to help students acquire knowledge or to measure the attainment of course objectives. (SLD)
Descriptors: Answer Sheets, Item Analysis, Measures (Individuals), Scoring
Peer reviewed
Luecht, Richard M. – Educational and Psychological Measurement, 1987
Test Pac, a test scoring and analysis computer program for moderate-sized sample designs using dichotomous response items, performs comprehensive item analyses and multiple reliability estimates. It also performs single-facet generalizability analysis of variance, single-parameter item response theory analyses, test score reporting, and computer…
Descriptors: Computer Assisted Testing, Computer Software, Computer Software Reviews, Item Analysis
Peer reviewed
Plake, Barbara S.; Melican, Gerald J. – Educational and Psychological Measurement, 1989
The impact of overall test length and difficulty on expert judgments of item performance under the Nedelsky method was studied. Five university-level instructors predicting the performance of minimally competent candidates on a mathematics examination were fairly consistent in their assessments regardless of the length or difficulty of the test.…
Descriptors: Difficulty Level, Estimation (Mathematics), Evaluators, Higher Education
Peer reviewed
Maisiak, Richard; And Others – Educational and Psychological Measurement, 1979
The Test Analysis Program (TAP) is a comprehensive, flexible computer system designed to score and to analyze objective educational tests. The goals of the designers were to construct a program which would be user-oriented, flexible, and clear in structure and in output. (Author/JKS)
Descriptors: Computer Programs, Educational Testing, Item Analysis, Objective Tests
Peer reviewed
Bligh, Thomas J.; Noe, Michael J. – Educational and Psychological Measurement, 1977
A computer program for scoring written simulation tests provides individual scores and basic item analysis data. The program is written in Fortran IV and can accommodate up to thirty-five hundred options and up to ten thousand examinees. (Author/JKS)
Descriptors: Computer Oriented Programs, Item Analysis, Medical Education, Problem Solving
Peer reviewed
Andrulis, Richard S.; And Others – Educational and Psychological Measurement, 1978
The effects of repeaters (testees included in both administrations of two forms of a test) on the test equating process are examined. It is shown that repeaters do affect test equating and tend to lower the cutoff point for passing the test. (JKS)
Descriptors: Cutting Scores, Equated Scores, Item Analysis, Scoring
Peer reviewed
Lewis, Robert A. – Educational and Psychological Measurement, 1976
Describes an improved version of the Menne program, ITEMA, which combines several analysis procedures with option capabilities. (RC)
Descriptors: Computer Programs, Item Analysis, Scoring
Peer reviewed
Carroll, C. Dennis – Educational and Psychological Measurement, 1976
A computer program for item evaluation, reliability estimation, and test scoring is described. The program contains a variable format procedure allowing flexible input of responses. Achievement tests and affective scales may be analyzed. (Author)
Descriptors: Achievement Tests, Affective Measures, Computer Programs, Item Analysis