

This Mathematician Brought Big Data to Advising. Then Deeper Questions Emerged.

By Ellen Wexler | Apr 13, 2017

Tristan Denley, vice chancellor for academic affairs at the Tennessee Board of Regents

This article is part of the guide: Crossing the Finish Line: Stories on Student Success and What Colleges Are Doing to Get There.

As far back as he can remember, Tristan Denley wanted to be a mathematician. In college, planning his path was easy: His program was rigorously structured—and most of his choices were made for him.

But he knew that, for most students, deciding what to study—and what courses to take—is much harder. So when he moved from teaching math to working in academic affairs, he decided to use big data to help students make choices.

He realized that a human advisor’s guidance inevitably rests on that advisor’s personal experience, and no single advisor can know everything.

“Every advisor has experienced this, this lack of information,” says Denley, now the vice chancellor for academic affairs at the Tennessee Board of Regents. “As a human advisor, there’s only a limited number of students that you’ve ever advised.”

The solution? Add some not-quite-human elements to the mix.

In 2011, Denley introduced an online course-recommendation tool at Austin Peay State University, where he was then serving as provost and vice president for academic affairs. Students there still meet with traditional advisors—but those students now come armed with a list of course recommendations, created just for them by the system.

The predictive analytics program, called Degree Compass, is often compared to Netflix. Just as Netflix tries to guess whether you’ll like a movie you’ve never seen before, Degree Compass tries to guess which courses you’ll be most successful in. Courses are rated on a scale of one to five stars.

But what makes a course worth recommending? Some criteria are logistical, focusing on which courses students need to satisfy their degree requirements and the order in which they need to take them.

But the system goes further, predicting the grade a student would earn in each class so that it can recommend courses in which that student is likely to do well. The predictions are based on each student’s past grades in similar courses, sometimes reaching back to high school grades and standardized test scores. “It turns out to be pretty successful,” Denley says. “The estimates that we get are within about half a letter grade, on average.”
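The article stops at that high-level description, so the sketch below is not Degree Compass’s actual algorithm; it is a minimal, hypothetical illustration of the general idea Denley describes: estimating a student’s grade in a course they haven’t taken from their grades in similar courses, then putting a simple star rating on top. The sample grade records and the names course_similarity, predict_grade and star_rating are all invented for illustration.

    # A minimal, hypothetical sketch of grade prediction from past grades in
    # "similar" courses. This is NOT Degree Compass's actual algorithm, which
    # the article does not describe in detail. Grades are on a 4.0 scale.
    from statistics import mean

    # Invented historical records: {student: {course: grade}}
    history = {
        "s1": {"CALC1": 3.7, "PHYS1": 3.3, "ENGL1": 2.7},
        "s2": {"CALC1": 2.3, "PHYS1": 2.0, "ENGL1": 3.3},
        "s3": {"CALC1": 3.0, "PHYS1": 3.3, "ENGL1": 3.0},
    }

    def course_similarity(course_a, course_b):
        """How closely past students' grades in one course track their grades
        in the other (1.0 = grades move together, 0.0 = no relationship)."""
        gaps = [abs(g[course_a] - g[course_b])
                for g in history.values()
                if course_a in g and course_b in g]
        if not gaps:
            return 0.0
        # An average gap of 0 grade points maps to 1.0; a gap of 4.0 maps to 0.0.
        return max(0.0, 1.0 - mean(gaps) / 4.0)

    def predict_grade(student_grades, target_course):
        """Weight the student's past grades by each course's similarity to the
        target course; fall back to their plain average if nothing is similar."""
        weights = {c: course_similarity(c, target_course) for c in student_grades}
        total = sum(weights.values())
        if total == 0:
            return mean(student_grades.values())
        return sum(w * student_grades[c] for c, w in weights.items()) / total

    def star_rating(predicted_grade):
        """Map a predicted grade to the one-to-five-star scale the article
        mentions. This particular mapping is invented for illustration."""
        return max(1, min(5, round(1 + predicted_grade)))

    # Example: a new student who has taken CALC1 and ENGL1 asks about PHYS1.
    new_student = {"CALC1": 3.3, "ENGL1": 3.0}
    grade = predict_grade(new_student, "PHYS1")
    print(f"Predicted grade: {grade:.2f}, rating: {star_rating(grade)} stars")

A real system would draw on years of institutional grade records and more sophisticated statistical models; the sketch only shows the shape of the calculation the article gestures at: similarity between courses, a weighted average of past grades, and a display scale on top.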

Numbers and Values

While programs like Degree Compass rely on complicated algorithms, the decisions about what to prioritize are philosophical ones: What makes one course more valuable than another? And how should we define value?

Some analysts worry that, if not used correctly, these kinds of tools could reinforce existing biases in the education system, especially when demographic data—race, ethnicity, gender or socioeconomic status—is used to predict how successful a student will be. “We know that race, ethnicity and socioeconomic status tend to be correlated with at-risk status, and so students who come from similar groups might be disproportionately flagged or marked as one, at risk, or two, not having what it takes to pursue particular majors,” says Manuela Ekowo, policy analyst at New America.

In March, Ekowo co-authored a report on the ethical use of predictive analytics in higher education. “It is crucial,” reads the report, “that predictive models and algorithms are, at the very least, created to reduce rather than amplify bias.” The concern is that if students are deemed at risk because of characteristics they can’t change, predictive analytics tools will reinforce existing problems rather than help to solve them. For instance: A minority student, predicted to perform poorly in her major because of her race, could be encouraged to switch to an easier major.

But Denley, who acknowledges these limitations, does not use any demographic data in his algorithms. “The hope was that, by removing that demographic facet from the techniques, we would hopefully be able to remove those biases,” he says. “And that’s actually what we found.”

So far, the results look promising: Since Degree Compass was introduced at Austin Peay in 2011, the six-year graduation rate jumped from 33 to 37.4 percent. Traditionally disadvantaged groups saw the biggest gains: For low-income students, that number jumped from 25 to 31 percent; for black students, it jumped from 28.7 to 33.8 percent.

Austin Peay serves a significant number of first-generation students, and many of them “didn’t really know how universities and colleges worked,” Denley says. “The higher education system was a real maze.”

How, then, could the maze become easier to navigate? Denley often uses the phrase “choice architecture”—or, how a choice is presented to students. Degree Compass takes a difficult choice—picking a couple of classes out of a thick course catalog—and adds order to a chaotic process.

Denley, now at the system level, hopes to apply that thinking to other areas of student life. Selecting appropriate classes can be tough—but eventually, students face an even harder decision: picking a major. Faced with more than 100 possible majors, many students put off making a decision, Denley says.

But in college, putting off picking a major can have consequences: When the Tennessee Board of Regents followed 4,470 students over a three-year period, 57 percent of them completed all three years. Among students who selected a program of study in their first year, 95 percent completed all three years; among those who remained undecided, only 29 percent did.

Still, knowing that picking a major is important doesn’t make the choice any easier. So Denley, in his role at the Board of Regents, helped introduce something new on Tennessee’s campuses: something similar to a major—but not quite so daunting.

Called academic focus areas, these are broad categories—including humanities, business and STEM—and there are only a few to choose from. “Maybe you can’t choose exactly the one program out of these 150 programs,” Denley tells students. “But I bet if we sit down and talk about it, you know which one of these affinity groups or disciplines you want to study.”

And notably—even with advanced predictive technologies—Denley’s goal is to improve traditional advising, not replace it. “So often advising ends up being about scheduling,” he says. “That’s not the kind of advising which we know leads to success.”

The idea is that, when students use predictive tools, they end up with much more fodder for conversation with their advisors. “Having the data drives you to need more human advisors because it opens up conversations that previously were not had,” says Loretta Griffy, associate provost for student success at Austin Peay. “It’s the very opposite of what a lot of folks intuitively may think.”

Informed Decisions?

Denley’s philosophy is all about choice: Give each student the best information possible to make an informed, autonomous decision.

But this doesn’t mean that students are told everything. For instance, when Degree Compass ranks courses, it considers predicted grades in each class—but it doesn’t share those grades with the students.

Should students know everything? It’s one of the hardest questions to grapple with, says Iris Palmer, a senior policy analyst who co-authored the New America report. She understands both sides of the debate: On one hand, there’s the concern that students have a right to their own data. But at the same time, “students are vulnerable to suggestion, and so when you just present things in a certain way, they can get easily discouraged.”

Either way, Denley doesn’t want his tools to force a student to do anything. Predictive analytics, he says, should never be seen as a self-fulfilling prophecy. Even if a student is predicted to fail, she will never be pushed out of a particular program, or required to switch into another one.

“The technology cannot be certain,” Denley says. “I might do some calculation and discover that this particular student, they have a 1 in 100 chance of ever being a nurse. But it doesn’t mean that they have a zero chance of being a nurse—and they might be that 1 in 100.”
