Peer reviewed
ERIC Number: EJ1388570
Record Type: Journal
Publication Date: 2023-Sep
Pages: 22
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-1560-4292
EISSN: EISSN-1560-4306
How to Optimize Student Learning Using Student Models That Adapt Rapidly to Individual Differences
Eglington, Luke G.; Pavlik, Philip I., Jr.
International Journal of Artificial Intelligence in Education, v33 n3 p497-518 Sep 2023
An important component of many Adaptive Instructional Systems (AIS) is a 'Learner Model' intended to track student learning and predict future performance. Predictions from learner models are frequently used in combination with mastery criterion decision rules to make pedagogical decisions. Important aspects of learner models, such as learning rate and item difficulty, can be estimated from prior data. A critical function of AIS is to have students practice new content once the AIS predicts that they have 'mastered' current content or learned it to some criterion. For making this prediction, individual student parameters (e.g., for learning rate) are frequently unavailable because there is no prior data about a student, so population-level parameters or rules of thumb are typically applied instead. In this paper, we argue and demonstrate via simulation and data analysis that even in best-case scenarios, learner models assuming equal learning rates for students will inevitably lead to systematic errors that result in suboptimal pedagogical decisions for "most" learners. This finding leads us to conclude that systematic errors should be expected, and mechanisms to adjust predictions to account for them should be included in AIS. We introduce two solutions that can adjust for student differences "online" in a running system: one that tracks systematic errors of the learner model (not the student) and adjusts accordingly, and a student-level performance-adaptive feature. We demonstrate these solutions' efficacy and practicality on six large educational datasets and show that these features improved model accuracy in all tested datasets. [For the corresponding Grantee Submission, see ED629726.]
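The abstract describes learner-model predictions feeding a mastery-criterion decision rule, plus an online mechanism that tracks the model's own systematic errors and adjusts future predictions. The sketch below illustrates that general idea with a toy logistic model in Python; the feature set, weights, mastery threshold, and bias-update rule are all assumptions chosen for illustration, not the authors' models or the features evaluated in the article.

```python
# Illustrative sketch only: a toy logistic learner model with a mastery-threshold
# decision rule and a simple online bias correction that tracks the model's own
# prediction errors for one student. Model form, weights, threshold, and update
# rule are assumed for illustration; they are not the authors' implementation.

import math

def predict(prior_correct, prior_attempts, item_difficulty, bias=0.0):
    """Predict P(correct) from simple practice-history features (assumed form)."""
    # Assumed population-level weights; in practice these are fit to prior data.
    logit = (-1.0                      # intercept
             + 1.2 * prior_correct     # credit for prior successes
             + 0.3 * prior_attempts    # smaller credit for any practice
             - item_difficulty         # harder items lower the prediction
             + bias)                   # online per-student adjustment
    return 1.0 / (1.0 + math.exp(-logit))

MASTERY_THRESHOLD = 0.9  # assumed criterion for moving on to new content

def practice_session(responses, item_difficulty=0.5, step=0.4):
    """Run a toy session: predict, apply the mastery rule, then adjust the bias."""
    bias = 0.0            # per-student correction, starts at the population default
    correct = attempts = 0
    for actual in responses:          # observed outcomes: 1 = correct, 0 = incorrect
        p = predict(correct, attempts, item_difficulty, bias)
        if p >= MASTERY_THRESHOLD:
            return attempts           # decision rule: stop practicing this item
        # Track the model's error (not the student's) and nudge future predictions:
        # a persistently under- or over-predicted student accumulates a bias term.
        bias += step * (actual - p)
        correct += actual
        attempts += 1
    return attempts

# A consistently correct learner reaches the criterion in fewer trials than one
# with early errors, even though both start from the same population parameters.
print(practice_session([1, 1, 1, 1, 1, 1]))
print(practice_session([0, 1, 0, 1, 1, 1, 1, 1]))
```

The printed attempt counts differ between the two simulated students because the bias term accumulates each student's deviation from the population-level prediction, which is the kind of online, student-level adjustment the abstract argues should be built into AIS.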
Springer. Available from: Springer Nature. One New York Plaza, Suite 4600, New York, NY 10004. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-460-1700; e-mail: customerservice@springernature.com; Web site: https://link.springer.com/
Publication Type: Journal Articles; Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Institute of Education Sciences (ED)
Authoring Institution: N/A
IES Funded: Yes
Grant or Contract Numbers: R305A190448