Humanizing Education's Algorithms

By Junaid Mubeen | Jun 10, 2016

There is a human story behind every adaptive tutoring journey. During a recent school visit, I was challenged by the principal to find the story behind his star math student (whom we’ll call Joshua). As I scanned through Joshua’s learning analytics, I quickly gleaned from his usage patterns that this was a student with a disciplined work ethic. His improvement was strides ahead of his peers, his status as ‘star student’ well-earned. Joshua’s engagement metrics pointed to a sincere enthusiasm for math, which was swiftly confirmed as he relayed his favorite topics in the curriculum (Fractions and Place Value). His Star Wars t-shirt only cemented my belief that Joshua felt his calling was in STEM.

I was confident in my analysis. But as Joshua departed, the principal dropped a revelation that I had overlooked. Joshua is homeless.

Damn. I wasn’t even remotely close.

It transpired that Joshua lives in a hostel, where he shares a bedroom with his mother and four siblings. Joshua has no computer to call his own, so his mother takes him to the community library twice a week, where he devotes his time to the digital learning tools purchased by his school.

I was left to reflect on the role – and limitations – of data and algorithms in education.

The catch-22 of algorithmic education

As I mused over Joshua’s story, a colleague remarked that our users’ metadata – such as the IP address of their sessions – may pave the way for automatically identifying homeless students. Those students may then receive more focused tutoring paths to account for their sporadic usage patterns.

If you haven’t already guessed, the colleague in question is an engineer. He spends his days working on predictive algorithms to better target students’ individual learning needs. And I hope he won’t mind me saying that on this occasion, he was way off the mark.

Personalized learning is a lofty aim, however you define it. To truly meet each student where they are, we would have to know their most intimate details, or discover them through their interactions with our digital tools. We would need to track their moods and preferences, their fears and beliefs…perhaps even their memories.

There’s something unsettling about capturing users’ most intimate details. Any prediction model based on historical records risks typecasting the very people it is intended to serve. Even if models can overcome the threat of discrimination, there is still an ethical question to confront – just how much are we entitled to know about students?

So innovators appear to be in a bind.

We can accept that tutoring algorithms, for all their processing power, are inherently limited in what they can account for. That means steering clear of mythical claims about what such algorithms can achieve; it may even mean giving up on personalization altogether. The alternative is to pack our algorithms to suffocation with personal data, at the expense of users’ privacy. That approach does not end well.

There is only one way to resolve this trade-off: loop in the educators.

Algorithms and data must exist to serve educators

No tutoring algorithm should be based purely on interaction data. The nuts and bolts of students’ learning experiences – from the lesson they are given through to the choice of knowledge representation – should be based on proven pedagogical principles. It is vital that these principles are baked into tutoring algorithms from the start, and that engineers work closely alongside pedagogical experts throughout the creation process.
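
To make that division of labor concrete, here is a minimal sketch, assuming a hypothetical prerequisite map and mastery threshold (neither drawn from any real product), of what “pedagogy first, interaction data second” could look like in a lesson selector: curriculum experts author the prerequisite structure, and mastery estimates derived from usage only rank the lessons that structure already permits.

# Hypothetical sketch: pedagogical constraints are applied before
# interaction-derived mastery estimates are consulted. All names and
# thresholds are illustrative.

PREREQUISITES = {  # authored by curriculum experts, not learned from data
    "equivalent_fractions": ["fractions_intro"],
    "adding_fractions": ["equivalent_fractions"],
    "place_value_decimals": ["place_value_intro"],
}

MASTERY_THRESHOLD = 0.8  # illustrative mastery cut-off

def eligible_lessons(mastery):
    """Lessons not yet mastered whose prerequisites are all mastered."""
    return [
        lesson for lesson, prereqs in PREREQUISITES.items()
        if mastery.get(lesson, 0.0) < MASTERY_THRESHOLD
        and all(mastery.get(p, 0.0) >= MASTERY_THRESHOLD for p in prereqs)
    ]

def next_lesson(mastery):
    """Within the pedagogically eligible set, let the data decide:
    pick the lesson with the lowest current mastery estimate."""
    candidates = eligible_lessons(mastery)
    return min(candidates, key=lambda l: mastery.get(l, 0.0)) if candidates else None

# Example: mastery estimates inferred from a student's interaction history
print(next_lesson({"fractions_intro": 0.9, "place_value_intro": 0.95,
                   "equivalent_fractions": 0.4}))  # -> "place_value_decimals"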

Tutoring algorithms have a natural counterpart in real-time progress reports. These reports are fed to parents and educators, who are ideally positioned to uncover the story behind each student’s data. They must be empowered to do exactly that.

An algorithmic approach is not sufficient to serve our students. Joshua has met with success because his teachers are active agents in his learning journey. His progress data may act as a guide, but it is Joshua’s teachers who can interpret that data within his unique context and take the relevant actions. For instance, the reports may highlight sporadic usage patterns (and give precise meaning to terms like ‘sporadic’), but deciding what additional support Joshua needs is a judgement best left to his school.
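
As an illustration of how a report might give ‘sporadic’ a precise meaning, here is a minimal sketch, with a made-up session log and threshold, that flags a student when the gaps between sessions vary widely. Note that the code only surfaces the flag; any decision about support is left to the teacher.

# Hypothetical sketch: quantify "sporadic" usage as high variability in the
# gaps between sessions. Session dates and the threshold are made up; the
# result is a flag for the teacher, not an automated intervention.

from datetime import date
from statistics import mean, pstdev

def usage_is_sporadic(session_dates, cv_threshold=0.75):
    """Flag usage as sporadic if the coefficient of variation of
    inter-session gaps exceeds the threshold."""
    dates = sorted(session_dates)
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    if len(gaps) < 2 or mean(gaps) == 0:
        return False  # too little data to judge
    return pstdev(gaps) / mean(gaps) > cv_threshold

sessions = [date(2016, 5, 2), date(2016, 5, 4), date(2016, 5, 18),
            date(2016, 5, 19), date(2016, 6, 6)]
print(usage_is_sporadic(sessions))  # True -> the report flags it; the school decides what follows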

Algorithms and data need not be the mechanical vices of data scientists – with the right intentions, they can uplift educators and amplify their efforts to meet the needs of every student. It is the combined potential of algorithms and human insight that will win the day.

As Joshua’s story highlights, it takes a community of educators to nurture students’ learning. It’s time to dispense with false dichotomies and put tutoring algorithms in their rightful place: at the service of educators. Perhaps then our students can enjoy the fruits of personalized learning.

Junaid Mubeen, PhD, is the head of product at Whizz Education, a provider of online adaptive math tutoring for 5-13 year-olds.
