Can Analyzing Clicks in Digital Systems Predict Which Students Are Struggling? It Depends.

By Jeffrey R. Young     Oct 14, 2022

Now that so much college work is done digitally in learning management systems, many colleges are trying to analyze student data from those platforms to predict which students need help. But the practice is so new that it’s not yet clear how well the approach actually works.

Can big data from the LMS predict success in a class?

That’s the question tackled by a research paper published this week. And the mixed results surprised the scholars working on it.

The researchers focused on data from Virginia's community colleges going back to 2000. They built two prediction systems to forecast whether students would complete a course: one based only on “administrative data” held by the colleges, such as students' high school GPAs and college transcripts, and the other based on so-called “clickstream” data generated by students as they worked through course activities on the LMS.
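
To make that comparison concrete, here is a minimal sketch, not the researchers' actual code, of how two such models might be pitted against each other in Python with scikit-learn. The file name and every column name are hypothetical placeholders.

```python
# Minimal sketch (not the study's code): compare a course-completion model
# trained on administrative features against one that also uses LMS
# clickstream features. All file and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical merged table: one row per student-course enrollment.
df = pd.read_csv("enrollments.csv")

admin_cols = ["hs_gpa", "credits_attempted", "prior_gpa"]         # administrative data
click_cols = ["logins_wk1", "minutes_on_task_wk1", "clicks_wk1"]  # clickstream data
target = "completed_course"  # 1 if the student completed the course

train, test = train_test_split(df, test_size=0.3, random_state=0)

for name, cols in [("admin only", admin_cols),
                   ("admin + LMS", admin_cols + click_cols)]:
    model = LogisticRegression(max_iter=1000).fit(train[cols], train[target])
    auc = roc_auc_score(test[target], model.predict_proba(test[cols])[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```

In a setup like this, the study's question becomes: how much does the second model's accuracy improve over the first, and is that gain worth the cost of assembling the clickstream features?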

They found that for students new to the college, adding the LMS data produced significantly better predictions of student success than the administrative data alone. But for students who had been enrolled previously, the LMS data added little value over the administrative data.

“Where we see the most value in predicting students' performance is within these students where it’s their first term of college,” says Kelli Bird, an assistant professor of education at the University of Virginia and the lead author on the study. That might be because community colleges are open access, and so colleges don’t know much about students when they come in.

Meanwhile, collecting and working with the LMS data takes significant time and effort. The experienced coders on the UVa project spent hundreds of hours on it, Bird says, since the systems generate so much data. “Anytime the student logs on to the learning platform there’s a data point for that,” she adds. “The raw data from the LMS is very very large, very difficult to store, very computationally intensive to work with, so we wanted to see how much benefit that LMS data adds to the predictions.”
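
As a rough illustration of that preprocessing burden, a hypothetical sketch like the following (not the UVa pipeline) shows how raw event-level logs, one row per click, might be collapsed into compact per-enrollment features before any modeling can begin.

```python
# Hypothetical sketch: collapse raw event-level LMS logs (one row per
# click) into per-student, per-course features. The raw log can run to
# millions of rows; the aggregate is one row per enrollment.
import pandas as pd

events = pd.read_csv("lms_events.csv", parse_dates=["timestamp"])
# Assumed (hypothetical) columns: student_id, course_id, timestamp, event_type

features = (
    events
    .groupby(["student_id", "course_id"])
    .agg(
        total_clicks=("event_type", "size"),
        active_days=("timestamp", lambda ts: ts.dt.normalize().nunique()),
        first_login=("timestamp", "min"),
        last_activity=("timestamp", "max"),
    )
    .reset_index()
)
features.to_csv("lms_features.csv", index=False)
```

Even this simple aggregation has to scan every logged event, which is where the storage and compute costs Bird describes come in.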

Once a college sets up a system to work with LMS data, crunching the numbers in future years would take less time, Bird acknowledges. And the researchers have shared their code online for free, linked from their paper, to help others build on what they've learned.

How to Nudge

Bird is the research director of the Nudge4 Solutions Lab at UVa, which investigates how student data can be used to predict success and to develop relevant interventions.

“Can we predict student performance? We know that we can pretty well, which is great,” says Bird. “But there’s a lot to do in between predicting student performance and generating better student outcomes.”

One popular way for colleges to use such predictive analytics is to send text or email messages to students the system flags, suggesting that they, say, seek tutoring or study harder. But some have raised questions about those efforts: done crudely, the messages can discourage students enough that they drop out, and they may disproportionately discourage first-generation students or those who feel less welcome in a college environment.

Bird says her lab is exploring different approaches, such as giving the predictions to advisors or professors and letting them decide whether and how to intervene. The lab is also considering studying whether offering students financial incentives to attend tutoring or use other study-help resources improves completion rates.

Patsy D. Moskal, director of Digital Learning Impact Evaluation at the University of Central Florida, says her team is also learning how to benefit from analyzing LMS data. They're not using a prediction system, but they are building dashboards that help professors see patterns and better notice students who might be struggling.

“It is a work in progress,” she told EdSurge in an email interview. “Not surprisingly, courses that are online or blended use the LMS more heavily than those that are fully face-to-face. In addition, courses that have few graded assignments have little data on which to base a prediction until weeks into the semester, making early intervention difficult.”

And it takes time and effort to build the dashboards and teach professors to use them, she added: “This is still an area needing significant research, and we hope that some of the lessons we learn can not only help our advising/coaching, but also provide some information that helps with quality course design.”
