The “Data Wise” Improvement Process
by Kathryn Parker Boudett, Elizabeth A. City, and Richard J. Murnane on January 1, 2006
The following article originally appeared in The Harvard Education Letter (volume 22, number 1). Copyright 2006 President and Fellows of Harvard College. All rights reserved.
The package containing data from last spring’s mandatory state exam landed with a thud on principal Roger Bolton’s desk. The local newspaper had already published an article listing Franklin High as a school “in need of improvement.” Now this package from the state offered the gory details. Roger had five years of packages like this one, sharing shelf space with binders and boxes filled with results from the other assessments required by the district and state. The sheer mass of paper was overwhelming. Roger wanted to believe that there was something his faculty could learn from all these numbers that would help them increase student learning. But he didn’t know where to start.
School leaders across the nation share Roger’s frustration. The barriers to constructive, regular use of student assessment data to improve instruction can seem insurmountable. There is just so much data. Where do you start? How do you make time for the work? How do you build your faculty’s skill in interpreting data sensibly? How do you build a culture that focuses on improvement, not blame? How do you maintain momentum in the face of all the other demands at your school?
Our group of faculty and doctoral students at the Harvard Graduate School of Education and school leaders from three Boston public schools worked together for over two years to figure out what school leaders need to know and do to ensure that the piles of student assessment results landing on their desks are used to improve student learning in their schools. We have found that organizing the work of instructional improvement around a process that has specific, manageable steps helps educators build confidence and skill in using data. After much discussion, we settled on a process that includes eight distinct steps school leaders can take to use their student assessment data effectively, and organized these steps into three phases: Prepare, Inquire, and Act.
The “Data Wise” Improvement Process graphic shown here illustrates the cyclical nature of this work. Initially, schools prepare for the work by establishing a foundation for learning from student assessment results. Schools then inquire—look for patterns in the data that indicate shortcomings in teaching and learning—and subsequently act on what they learn by designing and implementing instructional improvements. Schools can then cycle back through inquiry and further action in a process of ongoing improvement. In the brief overview below, we outline the steps in what can be a messy but ultimately satisfying undertaking.
The "Data Wise" District
What can district administrators do to support schools in becoming “data wise”?
1. Set Up a Data System
Whether the district creates its own system or purchases a software program, administrators must consider:
• What data to include
• How to organize it and update it regularly
• Computational power vs. ease of use
• How to balance access and confidentiality
2. Create Incentives
One incentive is to require that school improvement plans be based on student assessment results. If schools with strong improvement plans and proven results are granted more autonomy, this can motivate school teams to do the analysis work well.
3. Support New Skills
School staffs will need professional development to support a variety of skills:
• How to interpret and use assessment data
• How to access data and create graphic displays
• How to participate productively in group discussions
• How to develop, implement, and assess action plans
4. Find the Time
Teachers need time to work together in order to learn and implement these new skills. Options can include:
• Scheduling a weekly early release day
• Paying substitutes to cover classes
• Compensating teachers for extra time
5. Model the Work
District leaders can also model the “Data Wise” Improvement Process. This may be new and challenging work for most members of the central office team, but it sends a strong message to the district’s schools.
Step 1: Organizing for Collaborative Work
Ongoing conversations around data are an important way to increase staff capacity to both understand and carry out school improvement work. School leaders who regularly engage their faculties in meaningful discussions of assessment results and other student data often describe themselves as being committed to building a “data culture” or “culture of inquiry.” To build this kind of culture, your school will need to establish a data team to handle the technical and organizational aspects of the work, including compiling an inventory of data from various sources and managing this information. You will also want to establish team structures and schedules that enable collaborative work among faculty members, and engage in careful planning and facilitation to ensure that collaborative work is productive. Because looking deeply at student performance and teaching practice can be uncomfortable at first, you may find that using formal protocols to structure group discussions can be quite helpful.
Step 2: Building Assessment Literacy
When you look through the assessment reports for your school, it can sometimes feel as if they are written in a different language. So many terms, so many caveats, so many footnotes! As a school leader, how can you help your faculty begin to make sense of it all? An essential step in the “Prepare” phase is to help your faculty develop assessment literacy. To interpret score reports, it helps to understand the different types of assessments and the various scales that are used. To appreciate what inferences may be drawn from these reports and which differences in outcomes are meaningful, familiarity with key concepts such as reliability, validity, measurement error, and sampling error can really help. It is also important to have a candid discussion with your faculty about why “gaming the system” by teaching to the test may not serve students well.
Step 3: Creating a Data Overview
As you move into the “Inquire” phase of the process, a good starting place is to have your data team create graphic displays of your standardized test results. Schools often receive assessment reports in a format that can be quite overwhelming. With a modest investment in learning technical skills, your data team can repackage these results to make it easier for your faculty to see patterns in the data. As a school leader, you can then engage your teachers and administrators in constructive conversations about what they see in the data overview. Again, using protocols to structure conversations can help ensure that these discussions are productive.
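For data teams comfortable working from a spreadsheet export, even a few lines of code can turn a dense score report into a chart a faculty can discuss. The sketch below is purely illustrative and is not drawn from the article or the book: the file name (scores.csv) and the column names (grade, pct_proficient) are hypothetical placeholders, and a real district export will look different.

```python
# Hypothetical sketch of a simple data overview: average percent proficient
# by grade level, plotted as a bar chart from an assumed spreadsheet export.
import pandas as pd
import matplotlib.pyplot as plt

# Assumed file and column names -- adjust to match your own system's export.
scores = pd.read_csv("scores.csv")  # columns: grade, pct_proficient

summary = scores.groupby("grade")["pct_proficient"].mean()

summary.plot(kind="bar", xlabel="Grade", ylabel="Percent proficient",
             title="Spring State Exam Results by Grade")
plt.tight_layout()
plt.savefig("data_overview.png")
```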
Step 4: Digging into Student Data
Once your faculty has discussed the data overview, it is time to dig into student data to identify a “learner-centered problem”—a problem of understanding or skill that is common to many students and underlies their performance on assessments. In this step of the process, you may look deeply into the data sources you investigated for your data overview. You will also go on to investigate other data sources to look for patterns or inconsistencies.
“Triangulating Data”: Digging Deeper into Multiple Sources
A central premise of the “Data Wise” Improvement Process is that it is important to examine a wide range of data, not just results from standardized tests. Many schools use analysis of individual test items as a starting point in the effort to understand student thinking. In item analysis, you first look at test items in groups by content (such as geometry) or type (such as multiple choice) to see if there are gaps in specific skills. Then you look for patterns across groups of similar items. Finally, you look more closely at individual test items to hypothesize why students responded to certain questions in particular ways.
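To make the idea of item analysis concrete, here is a minimal, hypothetical sketch (not part of the original article) that groups item-level results by content strand, computes the share of students answering each group correctly, and then lists the hardest individual items within the weakest strand. The file name (items.csv) and columns (student_id, item_id, strand, correct) are assumptions for illustration only.

```python
# Hypothetical item-analysis sketch: percent correct by content strand,
# then the individual items students struggled with most in the weakest strand.
import pandas as pd

# Assumed columns: student_id, item_id, strand, correct (1 = right, 0 = wrong)
items = pd.read_csv("items.csv")

# Percent correct by content strand (e.g., geometry, number sense)
by_strand = items.groupby("strand")["correct"].mean().sort_values()
print(by_strand)

# Within the weakest strand, which items gave students the most trouble?
weakest = by_strand.index[0]
trouble_items = (items[items["strand"] == weakest]
                 .groupby("item_id")["correct"].mean()
                 .sort_values()
                 .head(5))
print(trouble_items)
```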
Schools can then “triangulate” their findings by using multiple data sources to illuminate, confirm, or dispute their initial hypotheses. Sources may include classroom projects, lab reports, reading journals, unit tests, homework, or teacher observations. Another rich source of data is the students themselves. Conducting focus groups with students to talk about their thinking can be very helpful.
When triangulating data, prepare to be surprised. It is important to approach the process with the idea that you will find something new. When the goal is merely to confirm a hypothesis, only particular kinds of data tend to be looked at and the work often stops when the hypothesis is confirmed. Instead, look for and embrace unexpected trends and leads. The process of digging into data can deepen your faculty’s understanding of student performance, help you move past “stuck points” (“We’re teaching it, but they’re not getting it!”), and allow you to come to a shared understanding of the skills or knowledge around which your students need the most support.
Step 5: Examining Instruction
In order to solve your learner-centered problem, it is important at this stage to reframe it as a “problem of practice” that your faculty will tackle. Now the challenge is to develop a shared understanding of what effective instruction around this issue would look like. School leaders can help teachers become skilled at examining practice, articulating what is actually happening in classrooms, and comparing it to the kind of instruction that is needed.
Step 6: Developing an Action Plan
Solutions at last! It may seem as though you have to work through a large number of steps before deciding what to do about the issues suggested by your data. But because of the careful work you have done so far, the remaining steps will go more smoothly. In this first step of the “Act” phase of the work, you begin by deciding on an instructional strategy that will solve the problem of practice you identified. You then work collaboratively to describe what this strategy will look like when implemented in classrooms. Then it is time to put the plan down on paper. By documenting team members’ roles and responsibilities, you build internal accountability. By identifying the professional development and instruction your team will need and including it in your action plan, you let teachers know they will be supported every step of the way.
Step 7: Planning to Assess Progress
Before implementing your plan, you need to figure out how you will measure its success. Too often, educators skip this step and find themselves deep into implementation without a clear sense of how they will assess progress. As a school leader, you can help your school decide in advance what short-, medium-, and long-term data you will gather and how you will gather it. You can then work together to set clear short-, medium-, and long-term goals for student improvement.
Step 8: Acting and Assessing
Your school team worked hard to put their action plan ideas down on paper. Now that it is time to bring the ideas up off the paper, four questions can guide your work as a school leader: Are we all on the same page? Are we doing what we said we’d do? Are our students learning more? Where do we go from here? Implementation of the action plan can be like conducting an experiment in which you test your theories of how instructional strategies lead to student learning.
We made a very conscious decision to draw the “Data Wise” Improvement Process as an arrow curving back on itself. Once you get to the “end” of the “Act” phase, you continue to repeat the cycle with further inquiry. As the practice of using a structured approach to improving instruction becomes ingrained, you may find it easier to know what questions to ask, how to examine the data, and how to support teachers and students. You will also be able to go deeper into the work, asking tougher questions, setting higher goals, and involving more people in using data wisely.
About the Authors:
Kathryn Parker Boudett teaches at the Harvard Graduate School of Education.
Elizabeth A. City teaches aspiring principals in Boston’s School Leadership Institute and is a doctoral student at the Harvard Graduate School of Education.
Richard J. Murnane, an economist, is the Thompson Professor of Education and Society at the Harvard Graduate School of Education. This article is adapted from Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning, edited by Kathryn Parker Boudett, Elizabeth A. City, and Richard J. Murnane (Harvard Education Press, 2005).