THE BOTTOM LINE: Can you tell if your edtech is working? How long should you test it? These questions weigh heavily on teachers and on any school district with a tight budget. The Department of Education awarded a $3.67 million contract to Mathematica Policy Research to develop some answers. The task: create a suite of tools that help districts evaluate, within one to three months, whether an edtech tool purchased with funds from the Elementary and Secondary Education Act is helping students. The Department hopes the process, which it calls "rapid-cycle technology evaluation," will educate and guide both educators and entrepreneurs through the research design appropriate for their tools, though Mathematica cautioned it may not suit every tool. Mathematica has partnered with SRI International on the project.
"The key," Alexandra Resch, a senior researcher at Mathematica and the project's lead, told Education Week, "is having some sort of valid comparison group that can give you a sense of what would have happened without that tool."
The first and second phases of the project will take place over 16 months with a budget of $1.43 million. In the first phase, Mathematica and the Department aim to produce three to six pilot programs. The second phase will consist of field tests of eight to twelve edtech resources. An additional, optional phase, budgeted at $2.24 million but as yet unfunded, would conduct large-scale rapid-cycle tech evaluations of up to 60 apps. Whatever work Mathematica produces under the contract will carry an open license, in keeping with the Department of Education's recently proposed open-licensing rule.