ERIC Number: EJ1222807
Record Type: Journal
Publication Date: 2019
Pages: 20
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-2056-4880
EISSN: N/A
Using Learning Analytics to Evaluate the Quality of Multiple-Choice Questions: A Perspective with Classical Test Theory and Item Response Theory
Azevedo, Jose Manuel; Oliveira, Ema P.; Beites, Patrícia Damas
International Journal of Information and Learning Technology, v36 n4 p322-341 2019
Purpose: The purpose of this paper is to find appropriate forms of analysis of multiple-choice questions (MCQ) in order to obtain an assessment method that is as fair as possible for students. The authors intend to ascertain whether it is possible to control the quality of the MCQ contained in a bank of questions implemented in Moodle, presenting some evidence based on Item Response Theory (IRT) and Classical Test Theory (CTT). The techniques used can be considered a form of Descriptive Learning Analytics, since they allow the measurement, collection, analysis and reporting of data generated from students' assessment. Design/methodology/approach: A representative data set of students' grades from tests, randomly generated from a bank of questions implemented in Moodle, was used for the analysis. The data were extracted from the Moodle database using MySQL with an ODBC connector, collected in MS Excel[TM] worksheets, and processed with macros programmed in VBA. The CTT analysis was performed with appropriate MS Excel[TM] formulas, and the IRT analysis with an MS Excel[TM] add-in. Findings: The Difficulty and Discrimination Indexes were calculated for all questions with enough answers. The majority of the questions presented adequate values for these indexes, which suggests that they are of good quality. The analysis also showed that the bank of questions has some internal consistency and, consequently, some reliability. Groups of questions with similar features were obtained, which is very important for teachers developing tests that are as fair as possible. Originality/value: The main contribution and originality of this research is the definition of groups of questions with similar features regarding their difficulty and discrimination properties.
These groups allow the identification of difficulty levels among the questions in the bank, thus allowing teachers to build tests, randomly generated with Moodle, that include questions at several difficulty levels, as a well-constructed test should. To the best of the authors' knowledge, there are no similar results in the literature.
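To illustrate the two CTT indexes the abstract refers to, the sketch below computes them for a toy response matrix. This is not the authors' implementation (which used MS Excel[TM] formulas and VBA macros); it is a minimal Python illustration of the standard definitions, assuming the common upper-lower-groups form of the discrimination index. The data values are invented for demonstration.

```python
# Illustrative CTT item analysis: Difficulty and Discrimination Indexes.
# Rows = students, columns = items; 1 = correct, 0 = incorrect (toy data).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

n_students = len(responses)
n_items = len(responses[0])

# Difficulty Index P: proportion of students answering the item correctly.
difficulty = [sum(row[j] for row in responses) / n_students
              for j in range(n_items)]

# Discrimination Index D: difference in proportion correct between the
# top- and bottom-scoring groups (here, top and bottom thirds by total score).
ranked = sorted(responses, key=sum, reverse=True)
k = max(1, n_students // 3)
upper, lower = ranked[:k], ranked[-k:]
discrimination = [
    sum(row[j] for row in upper) / k - sum(row[j] for row in lower) / k
    for j in range(n_items)
]

for j in range(n_items):
    print(f"Item {j + 1}: P = {difficulty[j]:.2f}, D = {discrimination[j]:.2f}")
```

A high P means an easy item; a D near zero (like item 4 above, answered correctly by weak and strong students alike) signals an item that does not separate ability levels, which is exactly the kind of diagnostic used to group questions by quality.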
Descriptors: Learning Analytics, Multiple Choice Tests, Test Theory, Item Response Theory, Test Items, Difficulty Level, College Students
Emerald Group Publishing Limited. Howard House, Wagon Lane, Bingley, West Yorkshire, BD16 1WA, UK. Tel: +44-1274-777700; Fax: +44-1274-785201; e-mail: emerald@emeraldinsight.com; Web site: http://www.emeraldinsight.com
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A