Deciding which tools to use in the classroom is no easy task. To make matters more complicated, there isn’t a perfect catalogue of every edtech product, complete with reviews, product details and exact specifications, where school leaders can quickly enter their needs and poof! Out pops an optimal program.
Instead, teachers and administrators must resort to a lesson many of us recall from Algebra I: trial and error. Notice a need, find a product, give it a shot and, if it doesn't work, replace it. It’s a simple idea that, while easy to sell, most school leaders with experience testing out products would scoff at. That’s because edtech pilots are far more costly and time-consuming than factoring trinomials, and they leave many busy educators feeling overwhelmed by the process of evaluating product quality and effectiveness on their own.
“We’re in an age now where tech is moving so rapidly and we don’t want to just do something to do it,” Todd Keruskin, assistant superintendent at Elizabeth Forward School District in Pennsylvania, says in a post from DC-based nonprofit Digital Promise. “We want to be able to analyze results.”
Noticing that struggle, Digital Promise decided to create a step-by-step guide, the Ed-Tech Pilot Framework, for those interested in—or perhaps intimidated by—evaluating an edtech product. “Districts rely heavily on edtech pilots to make purchasing decisions, but the pilots are informal and might not generate the evidence they need for a smart purchasing decision,” says Aubrey Francisco, a research director at Digital Promise. “There was a need to define pilots and come up with a process to help them gather the necessary info to make those decisions.”
The free online tool brings clarity to the otherwise ambiguous and resource-intensive process of edtech procurement through eight steps that school leaders, educators and administrators can follow whether they’re starting from scratch or have already narrowed their focus.
The extensive, step-specific resources are meant to serve as reference material, Francisco explains. “Each pilot is unique and has a unique need. What we think about is how much does it cost and how much time does it take.”
Producing the framework itself was something of a trial-and-error process. Over the past three years, Digital Promise reviewed findings from pilot studies of 15 products in 14 of the nonprofit’s League of Innovative Schools districts—a national network of education leaders working to improve student outcomes through technology and partnerships. “We wanted to learn with the League what the challenges with pilots were… After that we had a good idea of how we thought these pilots could be structured, so we conducted pilots ourselves,” Francisco says. “This is a collection of lessons learned.”
That advice is a good starting point. With few outlets available for districts and educators to share their experiences with products and pilots, simply reporting what works and what doesn’t could help cut down on the error portion of trial and error, and make product pilots more effective, efficient and, most importantly, informative.