Peer reviewed
ERIC Number: EJ1168457
Record Type: Journal
Publication Date: 2017
Pages: 11
Abstractor: As Provided
ISSN: EISSN-2331-186X
Using Reliability and Item Analysis to Evaluate a Teacher-Developed Test in Educational Measurement and Evaluation
Quaigrain, Kennedy; Arhin, Ato Kwamina
Cogent Education, v4 n1 Article 1301013 2017
Item analysis is essential for improving items that will be reused in later tests; it can also be used to eliminate misleading items from a test. The study focused on item and test quality and explored the relationship of the difficulty index (p-value) and discrimination index (DI) with distractor efficiency (DE). The study was conducted among 247 first-year students pursuing a Diploma in Education at Cape Coast Polytechnic. Fifty multiple-choice questions were administered as an end-of-semester examination in an Educational Measurement course. The internal consistency reliability of the test was 0.77 using the Kuder-Richardson 20 coefficient (KR-20). The mean score was 29.23 with a standard deviation of 6.36. The mean difficulty index (p-value) and DI were 58.46% (SD 21.23%) and 0.22 (SD 0.17), respectively. DI was found to be maximal for p-values in the 40-60% range. Mean DE was 55.04% (SD 24.09%). Items having average difficulty and high discriminating power with functional distractors should be integrated into future tests to improve the quality of the assessment. Using DI, it was observed that 30 (60%) of the test items fell into the reasonably good or acceptable value ranges.
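The statistics named in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' analysis: it computes the difficulty index p, the discrimination index DI (using the conventional top/bottom 27% split, an assumption here), and the KR-20 reliability coefficient on a small synthetic matrix of 0/1 item scores.

```python
def item_analysis(scores):
    """scores: list of per-student lists of 0/1 item scores.

    Returns (p, di, kr20): per-item difficulty indices, per-item
    discrimination indices, and the KR-20 reliability coefficient.
    """
    n_students = len(scores)
    n_items = len(scores[0])

    # Difficulty index p: proportion of students answering each item correctly.
    p = [sum(s[i] for s in scores) / n_students for i in range(n_items)]

    # Discrimination index DI: upper-group minus lower-group proportion
    # correct, using the top and bottom 27% of students by total score
    # (a common convention, assumed here rather than taken from the study).
    ranked = sorted(scores, key=sum, reverse=True)
    g = max(1, round(0.27 * n_students))
    upper, lower = ranked[:g], ranked[-g:]
    di = [(sum(s[i] for s in upper) - sum(s[i] for s in lower)) / g
          for i in range(n_items)]

    # KR-20: k/(k-1) * (1 - sum(p_i * q_i) / variance of total scores),
    # using the population variance of the total scores.
    totals = [sum(s) for s in scores]
    mean_t = sum(totals) / n_students
    var_t = sum((t - mean_t) ** 2 for t in totals) / n_students
    pq = sum(pi * (1 - pi) for pi in p)
    kr20 = (n_items / (n_items - 1)) * (1 - pq / var_t)
    return p, di, kr20

# Illustrative data: 5 students, 4 items (1 = correct, 0 = incorrect).
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
]
p, di, kr20 = item_analysis(scores)
```

On real data, items would then be screened against the usual cut-offs (e.g. DI at or above roughly 0.2 for acceptable discrimination, p between 30% and 70% for average difficulty), which is how the abstract's "reasonably good or acceptable" counts are derived.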
Cogent OA. Available from: Taylor & Francis, Ltd. 530 Walnut Street Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site:
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Ghana
Grant or Contract Numbers: N/A