Silicon Valley’s Pursuit of Proof

By Christina Quattrocchi     Mar 4, 2015

This article is part of the collection: The EdSurge Guide to Choosing, Vetting and Purchasing K-12 Edtech Products.

When the Silicon Valley Education Foundation (SVEF) began hosting the iHub Pitch Games in 2014 in partnership with the NewSchools Venture Fund, the goal was simple: find a way to bring teachers and edtech developers together to share feedback.

The games featured a pitch competition in which local business leaders and educators helped select tools to be piloted in schools. Each company was matched with four to five teachers, who gave feedback on its products over three months.

Now after working with more than 50 teachers and 10 companies, SVEF is trying to move from helping companies get feedback to actually demonstrating “efficacy.”

Here’s what SVEF has learned from its three cohorts.

During the first round in Spring 2014, the iHub focused on picking “late, early stage” companies including Blendspace, LearnBop and Zaption. These companies had solid products but were looking to expand their user footprints. Yet they were mature enough that the feedback they received from teachers didn’t have as big an impact as intended. “We found they were great at the pitch games, but once they implemented pilots the teacher feedback couldn’t influence much of their product development,” says Muhammed Chaudhry, CEO of SVEF.

By the second round, which ran in Fall 2014, the iHub chose companies considered “early-early stage,” such as Mosa Mack--and consequently more receptive to feedback. “The teachers could influence these companies significantly,” says Chaudhry, “but they are also crazier bets” in terms of the stability of their business models.

In the Fall 2014 round, the iHub used a product evaluation rubric developed by WestEd, a San Francisco-based research and evaluation organization. Metrics examined included how satisfied students and teachers were with the product, how easily it could be incorporated into the classroom, and whether students achieved the intended objective of the lesson.

Now, for the third round of iHub participants announced on Feb. 26, judges selected the following six companies from a pool of 40 applicants:

Over the next three months, these companies and the 25 teachers piloting the products will go through SVEF’s codified feedback process. The foundation facilitates four to five in-person meetings with a cohort of around 25 teachers piloting tools in their classrooms. They check on how pilot implementations are going, what’s working and what’s not. Then, teachers meet with product developers individually, spending 10 to 30 hours giving feedback using the WestEd protocol.

As the iHub moves these companies through its codified feedback process, it will look to capture more than just feedback. “In the past we were seeing if our model for gathering teachers and developers together to give structured feedback worked. Now we want to try to get at this question of efficacy more,” says Anita Lin, iHub Manager at SVEF.

But understanding efficacy can be like aiming at a moving target. “You have to be really careful about getting the right measurement, with the right intent,” notes Neal Finkelstein, a WestEd associate director and senior research scientist. For instance, if a tool aims to drive student achievement by getting students “excited” about science, then it doesn’t make sense to run a pre- and post-test.

A more nuanced way to measure efficacy is to judge how well a product works with a select group of students (rather than with everyone). Games, for instance, may be more effective with one student subgroup than another.

“Moving forward, our goal is to focus on teacher and student satisfaction, how engaged they are with the product, how easily teachers can integrate it into classrooms, and how the product drives student achievement,” says the iHub's Lin.

Other regional innovation hubs across the country, notably Chicago’s LEAP Innovations and New York City’s iZone, are also exploring ways to refine a feedback cycle involving companies and local educators. Even Pearson has reorganized its approach to product development and hopes to be able to demonstrate the effectiveness of its tools by 2018.
