Latest Department of Education Report Urges More Collaboration

The first of a three-part series analyzing the 'Expanding Evidence' Report

By Andrew Plemmons Pratt     Feb 25, 2013

Late in December, the U.S. Department of Education’s Office of Educational Technology dropped a 100-page draft policy report on “Expanding Evidence Approaches for Learning in a Digital World.” While a key focus of the report is the kinds of information we should marshal to evaluate learning technologies, the more important lesson of the document is about people. Through case studies and reviews of current research, the report makes many recommendations, but three stand out.

Part I of this review provides a backdrop for current “evidence-based” research and focuses on the first of those recommendations: the notion that technologists, educators, and education researchers must collaborate, sharing evidence-gathering and analysis techniques from their respective fields. Parts II and III of the review, to be published separately, take up two other major themes woven throughout the report: 1) the need for design thinking and continual improvement processes when building digital learning tools; and 2) the need to forge stronger lines of communication between education researchers, technologists, and educators, so that insights don’t remain siloed in their respective disciplines.

“What Works” Not Working?

The What Works Clearinghouse (WWC) is a project of the federal Department of Education’s Institute of Education Sciences that reviews existing education research on all manner of interventions and summarizes what has proven effective in schools. Its work is intended to inform educators making “evidence-based decisions” about programs, policies, and interventions. As education technologies have proliferated, the WWC has reviewed chunks of the accumulating research on the effectiveness of those “digital learning interventions.”

According to the “Expanding Evidence” report, the WWC has published 45 reports on digital learning interventions and found 26 of them to have positive or promising effects. By way of comparison, at the time of this writing, there were 446 products listed in EdSurge’s ever-expanding database of current edtech product reviews. That gap suggests that what constitutes useful evidence in one circle of education work (academic research) looks very different from what counts in another (technology development or classroom teaching).

The report rightly points out that while the WWC is a useful source, it doesn’t supply educators with information about what works at the pace at which they must make decisions. The cycle of research for academic studies of education technology is glacial compared with the rate at which new tools appear in a given week’s edition of EdSurge. The frustrating gap between the authority of the WWC reports and the pace of their production is a microcosm of the trouble with existing evidence approaches for education technology.

Teachers, administrators, grant makers, and policy makers lean heavily on the phrase “evidence-based” when looking for the next program for teaching students. Claiming that your curriculum or software is an “evidence-based” intervention has that authoritative ring that says, “Science is on my side!” But the clever (albeit long and dense) “Expanding Evidence” report takes aim at the narrow scope of evidence typical of most education research and enlarges the possibilities for what sort of information can support the assertion that an edtech tool is working, or demonstrate that it isn’t.

Importantly, the report’s authors argue that the gold-standard evidence usually demanded of “evidence-based” interventions in social science research is not necessary for every decision. While deliberate academic research like that summarized by the WWC has its place, the authors write that a critical mindshift for leveraging the opportunities afforded by the exploding edtech sector “is accepting that the strongest level of causal evidence is not necessary for every learning resource decision” (p. 4). By expanding the kinds of evidence used to evaluate education technologies, we open possibilities for experimenting with more interventions in more contexts, eliminating ineffective ones faster and honing good ones at a pace that fits educators’ needs.

Collaboration in Your Neighborhood and Beyond

Like the authors of most policy reports, the “Expanding Evidence” team distills its analysis into a list of recommendations. The first might seem like a bland call for holding hands and working together: “Developers of digital learning resources, education researchers, and educators should collaborate…” But the specifics point to a strong model of collaboration, one the report leaves underdeveloped: “An example of this type of collaboration that the U.S. Department of Education endorses is the move to identify and support regional innovation clusters’ purposeful partnerships to break down domain silos and create connections between researchers, the commercial sector, and educators” (p. 89).

“Regional innovation cluster” is a term familiar to economic development wonks and to some academic researchers and tech entrepreneurs. An innovation cluster is a geographic area of synergistic economic activity, where the expertise of the local workforce, the investment capital of the business sector, and the research interests of university academics closely align. Two of the most productive such clusters in the United States are the information technology hub of Silicon Valley and the biotechnology hub of metropolitan Boston. The recommendation here seems to suggest that, with federal support, regional innovation clusters could accelerate collaborations for better education innovation.

Federal programs, coordinated through the Small Business Administration, already exist to leverage regional innovation for advanced Department of Defense research and to accelerate the development of renewable energy systems. Importing this model into the education sector is a worthy experiment, but the report is not explicit about what that would look like.

At the moment, a variety of ad-hoc attempts in locales around the country aim to spark this kind of collaboration among developers and educators. Meetups, Startup Weekends, and hackathons all bring entrepreneurs, technologists, and educators together to spark ideas for education startups. A consistent complaint at such events, however, is that too few educators show up. Sometimes there are policy folks on hand, but rarer still are the education researchers the report takes pains to highlight. And those events may be too few and far between for the kind of sustained discussion that creates a shared language and technical familiarity among all the necessary collaborators. A recent post on the Blend My Learning blog describing an educator’s experience at the first Shared Learning Collaborative Camp in the Bay Area captures this stutter-step approach to incorporating practicing educators into serious technology development.

Another report recommendation (#7) calls for universities to create interdisciplinary graduate programs that train people to combine expertise in educational data mining, learning analytics, and visual design. But this gets us back to the WWC problem. Those specialists will be trained and ready for action in 5-7 years. Neither the education technology ecosystem, the Department of Ed, nor millions of students around the country can wait that long to institutionalize the necessary collaboration.

Fortunately, the report makes liberal use of another standard policy document convention: the narrative sidebar highlighting an effective case study. In dozens of stories throughout the report, the authors implicitly make the case for the positive results that come from education researchers working side-by-side with practicing educators and technology developers.

Researchers in Texas tested software to help students understand graphical representations of everyday math concepts, then worked to extrapolate how effective the tool might be in schools with different demographics all over the state (p. 14). The Carnegie Foundation for the Advancement of Teaching created a network of four-year and community colleges that each piloted Statway, a program that carefully tracks how students are doing in early college math courses and allows administrators and instructors to intervene before students falter and fail (p. 21). Instructional design researchers at SRI International (the nonprofit R&D institute that wrote the “Expanding Evidence” report) teamed up with teachers in Denver Public Schools to test an adaptive curriculum developed by the American Geological Institute (p. 31).

These stories are all about the kind of collaboration the report recommends, but they don’t reach the level of broad, institutionalized, and sustained collaboration that a regional innovation cluster supports. Such clusters have potential, but it will be up to another report to explain how they would operate. At the very least, they should expand the design-based research explained in “Expanding Evidence.”

Three potential precursor collaborations leading up to such “regional education innovation clusters” might include:

  1. Education researchers at local universities connecting with teacher leaders (and innovative administrators) to study radically pragmatic elements of how schools work. Michael Goldstein, founder of the MATCH Charter School in Boston, recently described one such project to study teacher moves in Education Next.
  2. Pathbreaking edtech startups refining the way they learn about real classrooms: not just by soliciting and listening to feedback from teachers, but by making field trips into schools and watching how new and traditional education products actually get used.
  3. School districts and funders sponsoring place-specific social or education innovation challenges and competitions that require collaboration between teachers, academics, and entrepreneurs to solve local challenges.

The next part of this review will dig into the case studies that offer strong examples of this kind of work. The powerful thread connecting them all is a shift to using “design thinking” as a necessary approach to implementing and improving genuine education innovations.
