ERIC Number: EJ1179661
Record Type: Journal
Publication Date: 2018-Mar
Abstractor: As Provided
Comparing the Factors That Predict Completion and Grades among For-Credit and Open/MOOC Students in Online Learning
Almeda, Ma. Victoria; Zuech, Joshua; Utz, Chris; Higgins, Greg; Reynolds, Rob; Baker, Ryan S.
Online Learning, v22 n1 p1-18 Mar 2018
Online education continues to become an increasingly prominent part of higher education, but many students struggle in distance courses. For this reason, there has been considerable interest in predicting which students will succeed in online courses and which will receive poor grades or drop out prior to completion. Effective intervention depends on understanding which students are at risk in terms of actionable factors, and behavior within an online course is one key potential factor for intervention. In recent years, many have suggested that Massive Open Online Courses (MOOCs) are a particularly useful place to conduct research into behavior and interventions, given both their size and the relatively low consequences and costs of experimentation. However, it is not yet clear whether the same factors are associated with student success in open courses, such as MOOCs, as in for-credit courses--an important consideration before transferring research results between these two contexts. While there has been considerable research in each context, differences in course design and population limit our ability to know how broadly findings generalize; differences between studies may have nothing to do with whether students are taking a course for credit or as a MOOC. Do learners behave the same way in MOOCs and for-credit courses? Are the implications for learning different, even for the same behaviors? In this paper, we study these issues by developing models that predict student course success from online interactions, in an online learning platform that caters to both distinct student groups (i.e., students who enroll on a for-credit or a noncredit basis). Our findings indicate that our models perform well enough to predict course grades for new students across both populations. Furthermore, models trained on one of the two populations were able to generalize to new students in the other population.
We find that features related to comments were good predictors of student grades for both groups. Models generated from this research can now be used by instructors and course designers to identify at-risk students among both for-credit and MOOC learners, with an eye toward providing both groups with better support.
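The kind of model the abstract describes, predicting success from interaction features such as comment activity, can be sketched as a simple logistic regression. This is an illustrative toy, not the authors' actual model: the feature names (comments posted, logins per week) and the training data below are hypothetical assumptions, and the fit uses plain stochastic gradient descent rather than whatever procedure the paper used.

```python
import math

def train_logistic(X, y, lr=0.1, epochs=1000):
    """Fit logistic-regression weights (plus a bias term) by
    stochastic gradient descent on toy completion data."""
    n_feat = len(X[0])
    w = [0.0] * (n_feat + 1)  # last entry is the bias
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w[:-1], xi)) + w[-1]
            p = 1.0 / (1.0 + math.exp(-z))   # predicted P(complete)
            err = p - yi                     # gradient of log-loss w.r.t. z
            for j in range(n_feat):
                w[j] -= lr * err * xi[j]
            w[-1] -= lr * err
    return w

def predict(w, xi):
    """Return the estimated probability that a student completes."""
    z = sum(wj * xj for wj, xj in zip(w[:-1], xi)) + w[-1]
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical data: [comments_posted, logins_per_week]; 1 = completed.
X = [[0, 1], [1, 2], [5, 4], [8, 6], [0, 0], [7, 5]]
y = [0, 0, 1, 1, 0, 1]
w = train_logistic(X, y)
```

A trained model like this could flag at-risk students by thresholding `predict(w, features)`; in practice one would use many more behavioral features and validate across the for-credit and MOOC populations, as the paper does.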
Descriptors: Performance Factors, Online Courses, Electronic Learning, Models, Noncredit Courses, College Credits, Comparative Analysis, Pass Fail Grading, Predictive Validity, Predictive Measurement, Regression (Statistics), Generalization, Educational Technology, College Students, Academic Persistence, Statistical Analysis
Online Learning Consortium, Inc. P.O. Box 1238, Newburyport, MA 01950. Tel: 888-898-6209; Fax: 888-898-6209; e-mail: email@example.com; Web site: http://onlinelearningconsortium.org/read/online-learning-journal/
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education
Authoring Institution: N/A
Grant or Contract Numbers: N/A