All I Know About Data, I Learned From Buying a TV

By Nick Sheltrown     Apr 2, 2015

Each day educators make decisions that influence the quality of student learning experiences. Some of the decisions may work at the atomic level (Does this student know the ‘ch’ sound?); others operate at the system level (What intervention software should we purchase for our elementary schools?). While the context may vary widely across schools and classrooms, the goal is always the same: to make the best decisions possible.

How then should educators make effective decisions? The easy answer is to say, “use data,” but ask an educator what that means and you’ll often get an answer that is shallow on specifics. The National Council on Teacher Quality and the US Department of Education have both reported educators’ lack of comfort in using data as part of their decision-making process. This discomfort is particularly acute when educators are asked to use data to inform large-scale purchasing decisions. The Education Industry Association’s (EIA) white paper on improving edtech purchasing revealed that while school leaders acknowledge rigorous evidence is best for making procurement decisions, districts often lack “structured, data-driven approaches with clear and inclusive decision-making processes.”

I’m surprised we lack confidence in using data in education, and I don’t think it’s for lack of skill or knowledge. Rather, I believe the problem is one of transference: a failure to apply learning in one context to learning in another. We use evidence every day to make decisions, from which route to take home from work (Google Maps commute estimates) to how we make plans for the weekend (weather forecasts, event reviews, etc.). The use of evidence is particularly important when we make buying decisions.

Consider the common experience of buying a TV. TV quality is described by the industry through many attributes: size, contrast ratio, refresh rate, resolution, power usage, screen uniformity, color saturation, screen type (matte, glossy), and “smart” features. Combine these factors with brand reputation, and it’s a complex landscape.

Lacking background, experience, or expertise in televisions, how does one make an informed choice? First, you can browse technology sites that feature expert reviews, testing, and independent analysis. Second, you can work with local home theater experts to understand the specifics of your viewing experience (such as viewing angles, sound, and placement). Lastly, you can review large numbers of customer ratings on retail sites. By triangulating expert evaluation, your specific needs, and customer satisfaction, you can reasonably identify a TV that should meet your needs.

This story could be told about many facets of our lives outside of school where we seamlessly apply multiple forms of evidence to make an informed decision. Yet, in our professional lives, we fall victim to paralysis-by-analysis, concerned that we don’t have enough information to make important decisions. Certainly the stakes are higher when making decisions about student learning than about TV quality, but by applying lessons of data use from our personal lives, we can make better, more-informed decisions, particularly when making large-scale purchasing decisions.

1. No matter how much evidence you collect, some uncertainty is inevitable. Most decisions start in a place of uncertainty: there are things we don’t know that could affect our chances of success. We collect evidence to reduce uncertainty, but no matter what we do, there’s no such thing as a sure thing. Paraphrasing Douglas Hubbard, the purpose of evidence is to reduce uncertainty, not eliminate it. Even after you consider evidence from experts, efficacy studies, and program reviews, you are never 100% sure the initiative you are launching will have the impact you seek.

2. Multiple sources of information are important. As we saw in the TV selection process, decisions are improved when they are informed by many sources of information. Talk to experts, confer with other districts, and review efficacy data and white papers to understand what goes into the successful implementation of a program you are considering.

3. Our world is probabilistic. Our experiences are best understood as the probabilities of potential outcomes. Even when you have solid, gold-standard research, it is important to remember that solutions work on average, but underneath those averages is variance. No new program will work consistently well for all students all the time, but coupling a well-designed program with thoughtful implementation increases the probability of success. The goal of school leaders should be to implement learning solutions with the highest probability of success for the target population.

4. Local context matters. When you buy a TV, local environmental factors matter (the amount of sunlight in the room, placement, room size). The same is true with a new educational initiative. As an ecosystem of people, practices, and processes, schools introduce particularities that shape how we educate students. Finding the right solution is a combination of understanding the range of options and coupling that with local needs.

5. Monitor carefully. No matter what you do, monitor and adjust as you go. One of the biggest problems in many initiatives is lack of regular use. It’s difficult to find national statistics on technology use, but long-time technology critic Larry Cuban estimates that 60-70% of teachers use technology for instruction once a month or less. Don’t assume that educators will embrace a new learning tool just because it is available.
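Point 3, that solutions work on average while variance lurks underneath, can be made concrete with a small simulation. This is a purely hypothetical sketch: the program names, growth numbers, and distributions are invented for illustration, not drawn from any real efficacy study.

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is repeatable

# Hypothetical scenario: two programs produce the same average growth
# (10 points), but Program B varies far more from student to student.
N = 10_000
program_a = [random.gauss(mu=10, sigma=2) for _ in range(N)]   # consistent
program_b = [random.gauss(mu=10, sigma=12) for _ in range(N)]  # erratic

mean_a = statistics.mean(program_a)
mean_b = statistics.mean(program_b)

# The averages are nearly identical, yet the share of students who see
# little or no growth (below 2 points) differs dramatically.
low_a = sum(g < 2 for g in program_a) / N
low_b = sum(g < 2 for g in program_b) / N

print(f"Program A: mean {mean_a:.1f}, {low_a:.0%} of students below 2 points")
print(f"Program B: mean {mean_b:.1f}, {low_b:.0%} of students below 2 points")
```

An evaluation that reports only the average would rate these programs as equivalent; looking beneath the average reveals that one leaves roughly a quarter of students with little to show for it. That is why the goal is the highest probability of success for your target population, not just the best headline average.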

It was Jim Barksdale, the former CEO of Netscape, who said, “If we have data, let’s look at data. If all we have are opinions, let’s go with mine.” The truth is that most decision-making isn’t one or the other. It’s combining evidence with professional judgment to make the best decision possible with the available evidence.

In education, we don’t have time to wait for perfect information leading to 100% certainty of success. The time it takes to gather such information may be years, during which time many students will have passed through our doors. By collecting evidence, understanding local needs, implementing thoughtfully, and monitoring carefully, educators can make sound decisions that will support student learning in the long run.

Dr. Nick Sheltrown is Director of Learning Analytics at Compass Learning.
