
Learning Research

Believe and You Can Achieve? Researchers Find Limited Gains From Growth Mindset Interventions

By Jenny Abamu     May 29, 2018


Despite all the promise surrounding “growth mindsets”—the idea that encourages students to see intelligence as something that can be nurtured and developed, as opposed to something that is fixed and innate—researchers are sounding the alarm bell. They say the intervention, at least as currently applied in today’s classrooms, isn’t shifting the needle on academic achievement.

So why do so many studies tout the transformative power of developing a growth mindset? And what factors need to be in place for these interventions to work? Professors from Case Western Reserve University took on these questions in a meta-analysis published last week in the journal Psychological Science.

The two-part meta-analysis reviewed more than 229 studies on growth mindset and synthesized their data. The first study examined the correlation between growth mindset interventions and academic achievement on standardized tests. The second looked at the effectiveness of specific interventions, noting which teaching strategies showed the most impact on student outcomes.
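The article doesn't describe the authors' statistical machinery, but "synthesizing the data" in a meta-analysis typically means pooling each study's effect size, weighted by its precision. Below is a minimal sketch of one standard approach, a DerSimonian-Laird random-effects pool, using made-up numbers rather than anything from the paper:

```python
import numpy as np

def pool_effects(d, var):
    """Pool per-study standardized mean differences with a
    DerSimonian-Laird random-effects model (inverse-variance weights)."""
    d, var = np.asarray(d, float), np.asarray(var, float)
    w = 1.0 / var                                   # fixed-effect weights
    d_fixed = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - d_fixed) ** 2)              # Cochran's Q (heterogeneity)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)         # between-study variance
    w_re = 1.0 / (var + tau2)                       # random-effects weights
    return np.sum(w_re * d) / np.sum(w_re), np.sqrt(1.0 / np.sum(w_re))

# Toy numbers only: per-study effect sizes and their sampling variances
effects = [0.12, 0.02, 0.15, -0.03, 0.10]
variances = [0.02, 0.01, 0.03, 0.02, 0.015]
pooled, se = pool_effects(effects, variances)
print(f"pooled d = {pooled:.2f} (SE = {se:.2f})")
```

The random-effects weighting is one common choice when studies differ in populations and methods, as growth mindset studies do; the paper's actual model may differ.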

Growth mindset enthusiasts may be disappointed to learn that although the authors found a significant correlation between growth mindset interventions and student academic achievement, the effect size was very small. Other education interventions, such as reducing class sizes or increasing teacher training, had much larger effects.

“We looked at the growth mindset interventions to try to see what the overall effectiveness of them are. And here we get a very tiny effect,” explained Brooke Macnamara, an assistant professor at Case Western Reserve University and a co-author of the study, in an interview with EdSurge. “The effect was 0.08. To put that in perspective, a typical education intervention effect is 0.57. So again, this was significant but very, very small.”
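For readers unfamiliar with effect sizes: assuming the quoted figures are standardized mean differences (Cohen's d), one common way to read them is as the percentile an average treated student would reach within the untreated group's distribution. A small illustrative sketch:

```python
from scipy.stats import norm

def percentile_shift(d):
    """For a standardized mean difference d, the expected percentile of the
    average treated student relative to the control distribution."""
    return norm.cdf(d) * 100

# Assumes the reported effects are standardized mean differences (Cohen's d)
for label, d in [("growth mindset interventions", 0.08),
                 ("typical education intervention", 0.57)]:
    print(f"{label}: d = {d:.2f} -> ~{percentile_shift(d):.0f}th percentile")
# d = 0.08 moves an average student from the 50th to roughly the 53rd percentile;
# d = 0.57 moves them to roughly the 72nd percentile.
```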

Seeing that the correlation was limited, researchers took the next step—trying to identify where growth mindset interventions may be more effective, either for certain types of students or under certain conditions.

Their analysis found that age and the length and type of intervention (whether it was reading a short article on how intelligence can change or doing a class activity) were not significant factors. However, they did see benefits for students from low socioeconomic backgrounds. They also found a small effect for high-risk students, those who had failed a class or were at risk of dropping out.

But Macnamara notes that the gains shown for low-income and high-risk students are limited both in statistical significance and in the number of studies covering these specific populations.

“In both of those cases, there was not a lot of information that went into that group. In other words, there were not very many studies that had examined those types of students,” Macnamara explains. “So both of those facts actually need to be interpreted with caution.”

Where Does All the Hype for Growth Mindsets Come From?

In light of these findings of limited benefits from growth mindset interventions, the researchers also asked where all the hype for the idea comes from. As part of their study, Macnamara and her co-authors looked at which papers on the topic were published and which were not, to learn whether "publication bias" had skewed the picture.

“Publication bias is something that’s really important to investigate when you conduct a meta-analysis,” says Macnamara. “There is often a bias where studies that show a strong effect, or studies that show a positive effect in a direction that’s expected or desired, are more likely to be published than studies that find no difference, or what’s called a ‘null effect.’”
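The article doesn't specify which diagnostics the team ran, but one standard check for the bias Macnamara describes is whether small, imprecise studies report disproportionately large effects (funnel-plot asymmetry). Here is an illustrative sketch using Egger's regression test, again with made-up numbers:

```python
import numpy as np
import statsmodels.api as sm

def egger_test(d, se):
    """Egger's regression test for funnel-plot asymmetry: regress the
    standardized effect (d / se) on precision (1 / se). An intercept that
    differs from zero is one common signature of publication bias."""
    d, se = np.asarray(d, float), np.asarray(se, float)
    y, x = d / se, 1.0 / se
    fit = sm.OLS(y, sm.add_constant(x)).fit()
    return fit.params[0], fit.pvalues[0]   # intercept and its p-value

# Toy numbers only: the smallest, noisiest studies report the largest effects
d = np.array([0.45, 0.38, 0.22, 0.12, 0.07, 0.05])
se = np.array([0.30, 0.26, 0.18, 0.10, 0.06, 0.05])
intercept, p = egger_test(d, se)
print(f"Egger intercept = {intercept:.2f} (p = {p:.3f})")
```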

To test whether their meta-analysis could have been skewed by this bias, the team reached out to several outside researchers on forums and by email, asking why their work had not been published and requesting their data. The results showed that the first analysis (on the impact of growth mindset interventions on test results) might be affected by unpublished studies, but the second analysis (on the effectiveness of specific interventions) showed no evidence that missing studies changed the result.

“We were actually surprised that the publication bias analyses suggested that we weren’t missing anything [in the second analysis]. The reason I say we were surprised is because we knew of studies that we were missing,” explained Macnamara, noting that she spoke to researchers who declined to share their unpublished data with the team.

Macnamara says that in addition to publication bias, media bias also plays a role in how people perceive the effects of certain teaching strategies. She cites instances where reporters contacted her to write about her research, only to back out when they learned the findings were not what they expected. From her perspective, research showing small or null effects does not garner as much attention, in the form of citations or media mentions, as more dramatic results do. This, she suggests, may be what happened with growth mindset studies.

“Studies that are especially exciting or show especially large effects often are the ones that are cited over and over again, so even if you’re just reading the published literature you tend to get a sense that these effects perhaps are very large,” says Macnamara. “Aggregating synthesized data gets you a very different picture.”
