Publication Date
| Date range | Results |
| --- | --- |
| In 2015 | 0 |
| Since 2014 | 1 |
| Since 2011 (last 5 years) | 4 |
| Since 2006 (last 10 years) | 6 |
| Since 1996 (last 20 years) | 6 |
Descriptor
| Descriptor | Results |
| --- | --- |
| Test Validity | 6 |
| Evaluation Methods | 3 |
| Test Construction | 3 |
| Test Items | 3 |
| Academic Accommodations… | 2 |
| Barriers | 2 |
| Content Validity | 2 |
| Difficulty Level | 2 |
| Disabilities | 2 |
| Educational Assessment | 2 |
Source
| Source | Results |
| --- | --- |
| Journal of Applied Testing… | 6 |
Author
| Author | Results |
| --- | --- |
| Burke, Matthew | 1 |
| Camara, Wayne | 1 |
| Cline, Frederick | 1 |
| Cook, Linda | 1 |
| Davis-Becker, Susan L. | 1 |
| Devore, Richard | 1 |
| Jiao, Hong | 1 |
| Li, Ying | 1 |
| Ling, Guangming | 1 |
| Lissitz, Robert W. | 1 |
Publication Type
| Publication type | Results |
| --- | --- |
| Journal Articles | 6 |
| Reports - Descriptive | 2 |
| Reports - Evaluative | 2 |
| Reports - Research | 2 |
Education Level
| Education level | Results |
| --- | --- |
| Elementary Secondary Education | 3 |
| Elementary Education | 1 |
| Grade 4 | 1 |
| Grade 8 | 1 |
Showing all 6 results
Smith, Russell W.; Davis-Becker, Susan L.; O'Leary, Lisa S. – Journal of Applied Testing Technology, 2014
This article describes a hybrid standard setting method that combines characteristics of the Angoff (1971) and Bookmark (Mitzel, Lewis, Patz & Green, 2001) methods. The proposed approach utilizes strengths of each method while addressing weaknesses. An ordered item booklet, with items sorted based on item difficulty, is used in combination…
Descriptors: Standard Setting, Difficulty Level, Test Items, Rating Scales
Burke, Matthew; Devore, Richard; Stopek, Josh – Journal of Applied Testing Technology, 2013
This paper describes efforts to bring principled assessment design to a large-scale, high-stakes licensure examination by employing the frameworks of Assessment Engineering (AE), the Revised Bloom's Taxonomy (RBT), and Cognitive Task Analysis (CTA). The Uniform CPA Examination is practice-oriented and focuses on the skills of accounting. In…
Descriptors: Licensing Examinations (Professions), Accounting, Engineering, Test Construction
Luecht, Richard M. – Journal of Applied Testing Technology, 2013
Assessment engineering is a new way to design and implement scalable, sustainable and ideally lower-cost solutions to the complexities of designing and developing tests. It represents a merger of sorts between cognitive task modeling and engineering design principles--a merger that requires some new thinking about the nature of score scales, item…
Descriptors: Engineering, Test Construction, Test Items, Models
Li, Ying; Jiao, Hong; Lissitz, Robert W. – Journal of Applied Testing Technology, 2012
This study investigated the application of multidimensional item response theory (IRT) models to validate test structure and dimensionality. Multiple content areas or domains within a single subject often exist in large-scale achievement tests. Such areas or domains may cause multidimensionality or local item dependence, which both violate the…
Descriptors: Achievement Tests, Science Tests, Item Response Theory, Measures (Individuals)
Steinberg, Jonathan; Cline, Frederick; Ling, Guangming; Cook, Linda; Tognatta, Namrata – Journal of Applied Testing Technology, 2009
This study examines the appropriateness of a large-scale state standards-based English-Language Arts (ELA) assessment for students who are deaf or hard of hearing by comparing the internal test structures for these students to students without disabilities. The Grade 4 and 8 ELA assessments were analyzed via a series of parcel-level exploratory…
Descriptors: Test Bias, Language Arts, State Standards, Partial Hearing
Camara, Wayne – Journal of Applied Testing Technology, 2009
The five papers in this special issue of the "Journal of Applied Testing Technology" address fundamental issues of validity when tests are modified or accommodations are provided to English Language Learners (ELL) or students with disabilities. Three papers employed differential item functioning (DIF) and factor analysis and found the underlying…
Descriptors: Second Language Learning, Factor Analysis, English (Second Language), Cognitive Ability