Your Data Lack Value, and What You Can Do About It

Opinion | Big Data

By Nick Sheltrown     Nov 30, 2014

When it comes to data use in schools, our rhetoric outpaces reality. Even though many school districts lay claim to data-driven instruction, too often the expression serves only as a convenient slogan for school improvement plans, conference presentations, and accreditation documents.

Early results of EdSurge’s sentiment analysis directionally indicate that many people working in schools lack confidence in using data: 75% of respondents describe themselves as needing help or as works in progress. This sentiment is echoed in the research literature, which suggests that when data systems are used in schools, educators interpret the data accurately less than 50% of the time. Clearly, our data lack value.

There are two common responses to this problem: (1) train teachers and school leaders to use data more effectively, or (2) improve how we share data with educators. Professional development is a critical component, but teacher skill is not the reason educational data lack value. Any system that blames its end users for ineffective use is masking its own shortcomings.

Rather, the problem with using data lies in the system, or more accurately, the lack of a system.

The problem begins with our choice of words: “data driven” misstates how we should approach data work. To say “data driven” is to confuse the tool with the goal (borrowing Stanley Katz’s axiom). Data are a means to an end, not the end themselves. The phrase “data-driven decisions” reverses the logic of data-informed decision making because it suggests that we begin with the data.

But how do we know we have the data we need? Rather than starting with data, reports, or statistics, start with clear goals and a clear process. Consider these four steps to define the components of data work:

  1. Articulate the information need: What do we need to know?
  2. Identify the best measures to collect data aligned to our needs.
  3. Develop processes to collect, analyze, and present the data.
  4. Monitor how data are used; make adjustments as necessary.

Starting with data doesn’t make sense because it presumes you have the right data, which assumes the right measures, which in turn require the right questions. Data are the building blocks for constructing insights, but the blueprint comes from our needs, measures, and course of action. We’ll examine each step in more detail.

1. Articulate the information need: What do we need to know?

All data work begins with identifying information needs. Most schools have a wide-ranging set of information needs (from understanding attendance patterns to performing staff evaluations). While information gaps in these areas should be addressed, the essential information needs in schools are those that involve learning. Below are five core instructional activities that require robust data:

  1. Place: Schools require data to place students into special programs (special education, intervention, gifted and talented, and so on).
  2. Diagnosis: An essential information need is assessing student learning needs so as to provide a targeted prescription.
  3. Inform: Adjusting course while students receive instruction, based on their response to it, is perhaps the core use of evidence.
  4. Predict: Predicting future student performance so as to adjust present action is becoming increasingly common (a minimal sketch follows this list).
  5. Summarize: Summarizing what students know about content (proficiency) or have learned over a period of time (growth) is an important part of any learning feedback loop.
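
Prediction, in particular, lends itself to a concrete illustration. Below is a deliberately minimal Python sketch, not a real early-warning model: it fits a linear trend to a student’s hypothetical benchmark scores and extrapolates one testing window ahead. The scores, the window numbering, and the plain linear fit are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical benchmark scores for one student across three testing
# windows (fall = 0, winter = 1, spring = 2); all values are illustrative.
windows = np.array([0, 1, 2])
scores = np.array([204.0, 211.0, 215.0])

# Fit a simple linear trend and extrapolate one window ahead.
slope, intercept = np.polyfit(windows, scores, deg=1)
predicted = slope * 3 + intercept
print(f"Projected next-window score: {predicted:.0f}")  # ~221
```

A production forecast would draw on far more signal (attendance, course grades, prior years), but the logic is the same: use past evidence to adjust present action.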

2. Identify the best measures to collect data aligned to our needs

Schools that have isolated their critical information gaps understand well the data they need. The next step is to identify the best measures to support those needs. Needs explain why we measure, but the tactics of measurement (how and what) still need to be determined. The figure below reveals how our five learning data needs connect to the what and how of learning measurement.

Notice that some whys essentially measure the same thing (performance against grade-level standards), but not in the same way. A test built to forecast will likely measure standards, but it may sample across all standards to produce a prediction, whereas a diagnostic assessment may provide more targeted assessment of a few standards. No matter the why or what, we have two common choices for how we evaluate learning data: criterion or norm referencing. Criterion referencing identifies an evaluation rule independent of the data (a score of x is proficient) and labels performance accordingly. Norm referencing evaluates student performance in the context of peers (a score of x is at the 60th percentile of all students).
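
To make the contrast concrete, here is a minimal Python sketch of the two evaluation rules side by side; the scale scores, the 220-point cutoff, and the labels are all hypothetical.

```python
# Hypothetical scale scores for one class; cutoff and labels are illustrative.
scores = [212, 198, 225, 240, 205, 218, 231, 190, 227, 210]
PROFICIENCY_CUTOFF = 220  # criterion: a rule set independent of the data

def criterion_label(score: int) -> str:
    """Criterion referencing: compare a score to the fixed rule."""
    return "proficient" if score >= PROFICIENCY_CUTOFF else "not yet proficient"

def percentile_rank(score: int, peers: list[int]) -> float:
    """Norm referencing: locate a score within the peer distribution."""
    below = sum(1 for s in peers if s < score)
    return 100 * below / len(peers)

for s in sorted(scores):
    print(f"{s}: {criterion_label(s)}, ~{percentile_rank(s, scores):.0f} percentile")
```

The same score can look strong under one rule and weak under the other, which is why the choice of referencing should follow from the information need, not from habit.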

3. Develop processes to collect, analyze, and present the data

Once we articulate a need and implement the right measures, we’ll have the data we need. But even then, what do we do with them?

Analytical leaders need to shape the analysis and presentation of data in ways that clearly invite action. One key reason data lack value is that the way we analyze and share them is disconnected from ideas about how to use them.

Sharing six-month-old state assessment data broken out by gender, so we can learn that last year’s proficiency rate for 5th grade girls was 6% higher than that of 5th grade boys, doesn’t yield a meaningful course of action. To correct for this, we need to return to our purpose: why did we administer the assessment in the first place?

If it is to screen students for an intervention program, then we should frame the analysis and presentation of data in that context. Good analytics and reporting frame the data in a context for action. “To document and explain a process, to make verbs visible, is at the heart of information design,” writes Edward Tufte in his classic text, Visual Explanations. If we cannot clearly articulate what the recipient of a report should do with the data, how can we expect them to do so?
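
For example, if the purpose was screening students for an intervention program, the report itself can lead with the verb. Here is a minimal sketch, assuming a hypothetical roster and an illustrative screening cutoff:

```python
# Hypothetical roster of (student, screening score); the cutoff is illustrative.
roster = [("Avery", 182), ("Blake", 240), ("Carmen", 175), ("Dana", 221)]
SCREENING_CUTOFF = 200  # scores below this trigger an intervention referral

# Frame the report around the action, not the raw numbers.
for student, score in sorted(roster, key=lambda r: r[1]):
    action = "refer to intervention" if score < SCREENING_CUTOFF else "no action needed"
    print(f"{student}: {score} -> {action}")
```

The verb is in the output: a teacher reading it knows exactly what to do next, which is the test any report should pass.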

4. Monitor how data are used; make adjustments as necessary

Too often, we invest considerable resources in building a data system without a similar commitment to improving that system over time. We become enamored with what we have rather than with what people do with the data.

To remedy this, school and teacher leaders need explicit processes for assessing the extent to which data improve educational decision making. A starting point is to examine usage data, which can be pulled from most data systems. These data can reveal frequency of access; accessing data, however, is an indicator of effective use, not effective use itself. To assess the true impact of learning data, leaders need qualitative data (observations, interviews, focus groups) on how teachers make decisions. Any number of factors can limit the impact of data: technological reliability, assessment quality, presentation of data, lack of time or understanding, data latency, and more.
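
As a concrete starting point, usage data can be summarized per user, with the caveat above built in: the counts measure access, not effective use. A minimal sketch, assuming a hypothetical access log:

```python
from collections import Counter
from datetime import date

# Hypothetical access log of (teacher, date a report was viewed).
access_log = [
    ("t_lopez", date(2014, 11, 3)), ("t_chen", date(2014, 11, 4)),
    ("t_okafor", date(2014, 11, 4)), ("t_lopez", date(2014, 11, 10)),
    ("t_chen", date(2014, 11, 18)), ("t_lopez", date(2014, 11, 17)),
]

# Count report views per teacher for the month.
views = Counter(teacher for teacher, _ in access_log)
for teacher, count in views.most_common():
    print(f"{teacher}: {count} report views")

# Frequency of access is only a proxy; pair these counts with observations,
# interviews, and focus groups to see whether decisions actually changed.
```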

There is ample support in the research literature that using data to guide instructional decisions can greatly improve student learning outcomes, but data don’t magically transform teaching. School and teacher leaders need to work together to identify their core needs, measures, and means of leveraging the data. Only then can data really drive.
