These days more and more colleges are setting up systems that automatically email students when an algorithm determines they are academically at risk. The promise is that such small nudges can motivate recipients to get back on track and keep them from dropping out. But in some cases such efforts actually cause more harm than good.
It’s not that the underlying idea is flawed, but research is showing that how colleges implement these efforts makes a big difference. And the risk of harm is real: a poorly designed alert can unintentionally nudge students out of college entirely.
Mark Milliron, cofounder and chief learning officer of Civitas Learning, which makes student-success technologies, described one case of an early-alert system backfiring during a panel this week at the annual SXSW EDU conference, in a session called “NudgeU: Learning from Behavioral Economics.”
In the example Milliron shared, a student who is a working mother said she received an early-warning message and left school because of it.
“She said that during the month of October her two kids got the flu, and a big work project came up, and she was dealing with vomiting children and a giant work project, and her head was down for two and a half weeks. When she finally came back out of it, she wanted to get caught back up with school,” Milliron said. “But she got an email from the school, from the early-warning system, saying, ‘This is to inform you that you have missed three classes, and if you miss one more you’ll be dropped.’ At that point, and after dealing with kids and work, she said, ‘This is a sign from God I’m not supposed to be in school.’ And she left. And you can understand that.”
Milliron stressed that the early-warning systems have worked at some colleges his company studied. But at others, the nudges had an “almost double-digit negative impact.”
To prevent that kind of outcome, he encouraged colleges to watch the data closely as they implement such systems, and to test the wording of their nudges to find out how students would respond to the alerts.
Such research, he said, has shown the importance of sending messages when students have academic successes, not just when failure is on the horizon. One of the most effective subject lines for getting students to open a message, for instance, is “we’re proud of you.” Milliron joked that he now considers using that subject line on any message he wants to be sure someone will open.
Colleges must consider the ethics of any big-data project involving students, argued another member of the panel, Ernest Ezeugo, a program associate at New America. Otherwise, a friendly nudge could end up turning into a shove.
“It takes great forethought to use this technology in a way that doesn’t harm students,” he said, noting that the algorithms can end up furthering the implicit biases of whoever builds the system. To help guide colleges through the design process, he pointed to a 2016 New America report, “The Promise and Peril of Predictive Analytics in Higher Education.”
This article is part of an upcoming EdSurge Guide exploring innovations in student success, publishing March 26. The guide is sponsored by Salesforce.org, which had no influence on this story.
Dangers of ‘One-Size-Fits-All’ Approach
Even simply giving students access to their raw academic-performance data can have unintended consequences.
That’s what Stephanie Teasley found when she studied new features in learning-management systems that show students a “dashboard” of their grades and their activity (how much they’ve clicked on material, say) compared to others in the course (without giving any other student’s name). Companies like Blackboard and Canvas offer the feature, though it is up to the university or professor whether to turn such dashboards on or off.
“I think they have a lot of potential, but we can’t just assume that one size fits all,” Teasley said in an interview. “I’m really concerned about this issue of really understanding for whom these systems are really valuable and potentially when they’re not.”
For instance, she asked, what if a female student in an engineering course is shown that she’s below average in the course—could that discourage her from pushing forward? “I worry about the stereotype threat issue,” she said.
One possible solution, she said, is to show students how they are doing relative to others with the same level of preparation. That way a student performing slightly below the class average could instead see a message that reads, “For someone who wasn’t really prepared for this course, you’re doing well.”
As her paper concludes: “While more research on student-facing dashboards is needed, it seems likely that designing systems with ‘one size fits all’ displays—where all students’ performance is assessed by a single algorithm and they see the same format for the feedback—may be unwise.”