Can Technology Measure How You 'Show Your Work'?

Learning Research

By Eric Horowitz     Jul 7, 2014

If the bane of your schooling years was the phrase, “show your work,” you’re probably not alone. The requirement to demonstrate how you arrived at your solution often led to the laborious process of transforming your jumble of chicken scratches into a series of coherent steps.

But though they may seem arbitrary to a 10-year-old, rules about showing work serve an important purpose: They give a teacher a glimpse into the metacognitive processes guiding a student’s attempt to solve a problem. An added or omitted step can reveal where a student’s planning or analysis went off course.

One of the purported benefits of educational technology is that it can make complex metacognitive processes easier to document and understand. Specifically, computers can use logfiles--keystroke logs or mouse clicks, for example--to produce “on-line” measures of metacognition while the student is learning, rather than “off-line” measures produced after the learning has taken place (self-reports or interviews, for example). Not only does this provide an in-the-moment look inside a student’s head, but it also avoids situations in which a student generates incomplete or inaccurate explanations for their behavior.

But the question remains: Can technology track student metacognition as effectively as standard human techniques? Is the data generated by logfiles meaningful?

Research led by Marcel Veenman of Leiden University suggests that the answer is yes. In a new study recently published in Learning and Individual Differences, Veenman and his colleagues tracked a group of 52 middle school students as they worked on a computer program designed to teach how five different factors, such as pollution or food sources, affect a population of otters. The students could adjust the values of each of the five factors and then run a simulation, or “experiment,” that revealed how the otter population changed. Students conducted a minimum of 15 experiments, after which they completed a short test on how each factor affected the otters.

As students conducted the experiments, a logfile tracked various aspects of their work, each of which would serve as a measure of metacognitive activity. These included, among others, the total number of experiments performed, the elapsed time between seeing the results of one experiment and starting the next, the frequency of scrolling down to see earlier experiments or up to see later ones, and the number of factors changed between experiments.
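Neither the article nor the paper spells out the logging code, but the idea is easy to sketch. The snippet below is a minimal illustration in Python, not the study’s actual software; the event format, the field names, and the function logfile_measures are all invented for the example.

```python
# Minimal sketch (not the study's code) of deriving metacognition measures
# from a hypothetical event log, where each event is a small dict.
def logfile_measures(events):
    runs = [e for e in events if e["type"] == "run_experiment"]
    scrolls = [e for e in events if e["type"] == "scroll"]

    # Elapsed time between one experiment and the start of the next.
    pauses = [b["time"] - a["time"] for a, b in zip(runs, runs[1:])]
    mean_pause = sum(pauses) / len(pauses) if pauses else 0.0

    # Factors changed between consecutive experiments; changing one factor
    # at a time while holding the rest constant signals systematic variation.
    factors_changed = [
        sum(1 for k in a["settings"] if a["settings"][k] != b["settings"][k])
        for a, b in zip(runs, runs[1:])
    ]
    mean_changed = (
        sum(factors_changed) / len(factors_changed) if factors_changed else 0.0
    )

    return {
        "num_experiments": len(runs),   # total experiments performed
        "mean_pause": mean_pause,       # average think time between runs
        "scroll_count": len(scrolls),   # how often earlier results were revisited
        "mean_factors_changed": mean_changed,
    }

# Example with two invented log events:
events = [
    {"type": "run_experiment", "time": 0.0, "settings": {"pollution": 1, "food": 3}},
    {"type": "scroll", "time": 8.5},
    {"type": "run_experiment", "time": 12.0, "settings": {"pollution": 2, "food": 3}},
]
print(logfile_measures(events))
# {'num_experiments': 2, 'mean_pause': 12.0, 'scroll_count': 1, 'mean_factors_changed': 1.0}
```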

After the students completed their experiments, human judges were shown the logfiles of student activity and rated each student on two metacognitive indicators: systematic variation and completeness of experimentation. The first indicator addressed the extent to which students looked for systematic patterns, for example by repeatedly changing one factor while holding the others constant. The second addressed the extent to which students experimented with all five factors.

Veenman and his team were chiefly interested in two things: 1) the extent to which the computer and human measures of metacognitive activity were aligned with one another, and 2) the extent to which the two different measures of metacognitive activity predicted learning. If the human and computer measures were aligned and predicted learning, it would provide evidence that computers can match humans in identifying important metacognitive activity.

In fact, the researchers found little difference between the human and computer measures of metacognitive activity. Nearly all the individual computer measures (scrolling frequency, elapsed time, and so on) were correlated with both the human “systematic” measure and the human “completeness” measure. Furthermore, when the data was combined into just two measures--one computer and one human--the correlation between them was .92, which implies that the two measures had 84% of their variance in common. This finding meshes with a 2012 study by Veenman that found a similar positive relationship between metacognitive activity measured by logfiles and metacognitive activity measured by having students work through a “think-aloud” protocol, in which they described why they were doing what they were doing.
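For readers who want the arithmetic behind that last figure: “variance in common” is simply the square of the correlation coefficient, so

$$\text{shared variance} = r^{2} = (0.92)^{2} = 0.8464 \approx 84\%$$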

The computer measures of metacognitive activity also proved to be the better predictor of student performance on the post-activity test. Specifically, the computer logfile measures accounted for 41% of the variance in test performance, while the human ratings accounted for just 27%. Because metacognitive activity is associated with better learning, this suggests that the logfile measures outperformed the human judges when it came to identifying true metacognitive activity. The 41% figure also matches a 2008 review by Veenman, which found that human measures of metacognition, such as think-aloud protocols and third-party observations, accounted for about 40% of the variance in learning performance.

Veenman’s results are far from conclusive. But overall they suggest that computers can do an adequate job of measuring the kinds of planning, strategizing, and evaluation that make up an important part of learning. This means that the reams of data blended learning programs can collect--mouse clicks, page views, deleted responses, etc.--are useful not only for determining a student’s overall skill level; they can also help shed light on the murky web of strategies students use when they encounter a complex problem. We’re still a long way from a perfect understanding of how pieces of logfile data can identify student strengths and weaknesses, but as more students have their decisions and calculations stored electronically, a clearer picture will begin to emerge.
