Ack! This page is woefully outdated; I haven't had enough time to keep up with presenting research results. I'll update it soon. In the meantime, here are a few projects I've been working on. Contact me if you have questions or would like more detailed findings than I've provided here.
1) Patterns of Persistence: What Engages Students in a Remedial English Writing MOOC?
John Whitmer, Ed.D. (California State University), Eva Schiorring (RP Group), Steve Miley, Pat James (Mt. San Jacinto)
Poster presented at Learning and Knowledge Analytics Conference 2014. Poster
This study sought to identify different types of participation, early predictors of high participation, and learning outcomes in a remedial English writing Massive Open Online Course (MOOC) created by Mt. San Jacinto College on the Coursera platform, titled “Crafting an Effective Writer: Tools of the Trade.” The study investigated the second offering of the course in September 2013, which had a total enrollment of 48,174 learners from countries around the globe. This research was supported through the MOOC Research Initiative by Athabasca University, with funding from the Gates Foundation.
2) Improving SDSU Student Achievement by Data Driven Faculty Interventions from Learning Technologies [in progress, results summer 2014]
John Whitmer and Bernie Dodge, Co-Directors
This project will apply Learning Analytics methods to identify students at risk of not succeeding in two high-enrollment courses with historically low pass rates at SDSU: PSY 101 and STAT 119. Specifically, weekly reports on student use of online materials and course grades will be run to identify students at risk of not succeeding. With input from instructors, a research assistant will provide these students with targeted suggestions that could improve their performance. An experimental design will be used, with half of the students randomly assigned to receive the extra emailed suggestions; all other aspects of the courses will be identical.

Analysis of the relationship between students identified for intervention and several dependent variables (grade, motivation, and subject knowledge) will shed light on a) the accuracy of these at-risk reports in predicting student grade (in the control group) and b) the impact of the instructor intervention (in the experimental group). A pre-course survey on student motivation (Keller, 2010) and prior subject-matter knowledge will also be conducted, and students will be asked to maintain weekly logs of their online and offline activity. Regression analyses, incorporating feature-selection methods to account for student demographic data, will be used to compare the impact of the interventions between the control and experimental groups.

There is low risk to students from this research: these interventions are already performed by faculty, albeit not on a weekly basis, and students will have the opportunity to opt out of the study through the informed-consent procedure. Student privacy will be maintained by anonymizing student identifiers before the data are provided to researchers. The ultimate goal of this study is to improve course pass rates at SDSU by identifying accurate predictors of low performance and creating interventions that improve student success.
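To give a flavor of what a weekly at-risk report might look like, here is a minimal sketch in Python. The field names, cutoffs, and flagging rule are illustrative assumptions for this page, not the study's actual criteria (which will be developed with instructor input and refined through the regression analyses described above):

```python
# Hypothetical sketch of a weekly at-risk report.
# Field names and thresholds are illustrative assumptions,
# not the study's actual criteria.

def flag_at_risk(students, login_cutoff=2, grade_cutoff=0.7):
    """Return IDs of students whose weekly LMS logins or running
    course grade fall below the (illustrative) cutoffs."""
    flagged = []
    for s in students:
        if s["weekly_logins"] < login_cutoff or s["grade"] < grade_cutoff:
            flagged.append(s["id"])
    return flagged

# A toy week-3 snapshot: one low-activity student, one low-grade student.
week3 = [
    {"id": "s01", "weekly_logins": 5, "grade": 0.82},
    {"id": "s02", "weekly_logins": 1, "grade": 0.75},  # low activity
    {"id": "s03", "weekly_logins": 4, "grade": 0.55},  # low grade
]
print(flag_at_risk(week3))  # -> ['s02', 's03']
```

In the experimental design, a list like this would be produced every week; students in the treatment half would then receive targeted emailed suggestions, while the control half would not.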
3) Logging on to Improve Achievement: Using Learning Analytics to Explore Relationships between Use of the Learning Management System, Student Characteristics, and Academic Achievement in a Hybrid Large Enrollment Undergraduate Course
Academic technologies, such as the Learning Management System (LMS), have been proposed as a means to implement instructional reforms to improve student persistence. This study conducts a quantitative analysis of a large enrollment (n = 377) hybrid undergraduate course in which LMS activities replaced one of the weekly meeting sessions. Separate student LMS use variables were found to explain over four times the variation in final grade compared to student characteristic variables. Combined LMS use resulted in a model that explained 25% of the variation in final grade, which was increased to 35% with the addition of student characteristic variables. For at-risk students, LMS use had a 25% smaller effect on final grade. These results suggest that the LMS is a potentially valuable resource to carry out instructional reforms. Further, data from the LMS can be used as a meaningful indicator of student educational effort.
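The comparison of variance explained in the abstract follows a standard nested-model pattern: fit a model on one block of predictors, then add a second block and compare R². The sketch below illustrates that pattern on synthetic data with NumPy; the variable names and simulated data are assumptions for illustration, not the study's actual data or model:

```python
# Illustrative nested-model R^2 comparison on synthetic data.
# Data and variable names are invented for this sketch; they are
# not the dissertation's dataset or model specification.
import numpy as np

rng = np.random.default_rng(0)
n = 377  # course enrollment from the abstract
lms_use = rng.normal(size=n)         # e.g., standardized LMS activity
characteristic = rng.normal(size=n)  # e.g., a standardized student-background variable
grade = 0.5 * lms_use + 0.3 * characteristic + rng.normal(scale=0.8, size=n)

def r_squared(X, y):
    """Fit OLS with an intercept and return R^2."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_lms = r_squared(lms_use.reshape(-1, 1), grade)
r2_full = r_squared(np.column_stack([lms_use, characteristic]), grade)
print(f"LMS-only R^2: {r2_lms:.2f}, full-model R^2: {r2_full:.2f}")
```

Because the models are nested (the full model contains every predictor in the reduced one), the full model's R² can never be lower; what matters substantively is the size of the increment, which in the study was about ten percentage points (25% to 35%).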
Summaries and Presentations of Findings:
- Research Summary (updated 6/20/2013): Logging on for Achievement
- PowerPoint with major findings and study design (updated 2/19/2013): Learning Analytics Research Study
- Archived presentation of findings to Learning and Knowledge Analytics MOOC (from 2/19/2013): Presentation of Research Findings
- Educause published an article with early findings and research design: Analytics in Progress: Technology Use, Student Characteristics, and Student Achievement.
Sound interesting? The complete dissertation is posted here: Logging on to Improve Achievement