By Janet Angelis
Phrases like “data-driven instruction” and “data-driven decision making” make me wonder exactly who is driving: the data or the people? If we don’t stop to ask, “Where are we going?” “For what purpose?” and “What evidence do these data provide?” we risk putting data in the driver’s seat and going along for a ride to we know not where. We zig and we zag in whichever direction the latest data set suggests we might have a weakness.
Another problem can be too much data. What I see happening in some systems — from state ed departments out to district, school, and classroom levels — is educators and policy makers becoming seduced, then overwhelmed, by data. To be more exact, they are collecting and/or distributing so much data that recipients can’t process it all or figure out which data will give them the evidence they need to determine whether they’re making progress toward their goals. Too much data can become static in the system — like a fuzzy radio signal, the data become a distraction that draws attention away from the program.
One example of this is the process some states use to evaluate their lowest-performing schools. The schools may receive a mound of data documenting areas of weakness but no guidance on actionable steps to decide where or how to start — nor, more important, suggestions and support for creating the conditions that will make it possible to take those steps. Another example is the new teacher evaluation systems required to get Race to the Top monies; these systems give teachers more detailed performance scores, but unless they include provisions for giving helpful feedback and the support to act on it, they are simply that — scores, more raw data.
What to do? If we start with goals and then work backwards to determine what evidence is needed to measure progress, we can then consider what data might be collected and analyzed to provide that evidence. As successful businesses know, people need to be in the driver’s seat, selecting and using data to guide and inform their actions. Without enough knowledge or resources — including time — to consider which data matter most and how to convert them into evidence that supports actionable decisions, educators are too often confronted by volumes of data that provide so little guidance about action that the data become meaningless.