Let's begin with the assumption that the fundamental purpose of schooling is to ensure high levels of learning for all students. It follows, then, that the most important criterion for determining whether educators are "doing data right" is whether their use of data leads to improved student learning.
Unfortunately, this was not my assumption when I started as a classroom teacher in the 1970s. Like most teachers then, my use of data was limited to monitoring attendance and assigning grades to my students.
I was actually quite proud of the assessments I created to determine student grades. Unlike many of my colleagues, who relied exclusively on multiple-choice, matching, and true-or-false questions because they were easier to grade, I included at least one comprehensive essay question and a few short-answer items on every unit test. I felt these more open-ended queries gave me greater insight into student understanding.
I also knew that prompt feedback was important to effective teaching, so I gave my assessments on Friday, spent the weekend grading essays, and returned the tests to students on Monday. I had a sense of smug self-satisfaction because I believed that my challenging assessments, my willingness to devote hours to grading papers, and my commitment to returning tests promptly were proof positive that I was a great teacher.
I followed a specific ritual on the Mondays after the test. I would distribute the tests so students could see their grades. I would go over the test and clarify any areas of general confusion. I would ask whether there were any questions. Finally, I would ask students to return their tests to me. As they did so, it signaled the same thing to every student I taught: This unit is over. I got my grade, and DuFour is moving on.
My assessments and the data they generated served a single purpose: to assign grades, which typically fell along a bell-shaped curve. It never even occurred to me to review the results with colleagues, to use this evidence of student learning to inform and improve my teaching, or to provide students with additional time and support to master the content. When students performed well on my assessments, I took it as proof of my effectiveness as a teacher. Poor performance, I felt, was not a reflection on me but rather a reflection on individual students who either lacked ability or hadn't put forth enough effort.
The Rise of Professional Learning Communities
Fortunately, our profession has come a long way since the 1970s. The highest-performing school systems recognize that a school can only be as good as the educators within it. So they organize their schools into professional learning communities (PLCs) to provide the timely, ongoing, job-embedded, data-driven adult learning essential to continual improvement.
The rise of the professional learning community process has altered teaching in significant ways. We're transitioning from an era in which what was taught, how learning was assessed, what instructional materials were used, and how grades were assigned were all determined by the individual teacher to whom a student was randomly assigned. Now we're asking teachers to work in collaborative teams to achieve common goals for which they are mutually accountable. Teams are charged to create a guaranteed and viable curriculum, unit by unit, to ensure that all students have access to the same essential knowledge and skills. Teams are asked to use ongoing formative assessment in the classroom and one or more team-developed common formative assessments during each unit. Team members must agree on what proficient student work looks like so they can give students consistent feedback.
Data-Using PLCs Ask Four Questions
The biggest difference between traditional schools of the past and high-performing professional learning communities today is in their approach to data. When members of a collaborative team in a PLC analyze the results from their common assessments, they put the evidence of student learning to work by asking the following four questions.
1. Which students were unable to demonstrate proficiency on this assessment?
The team identifies students by name and by need to ensure that any who are struggling will access the school's system of intervention until they can demonstrate proficiency. This intervention is timely: it occurs immediately after the assessment. It is directive (not invitational) and provided in such a way that the student is not removed from new direct instruction. It is diagnostic: the team is able to say "these students are unable to subtract two-digit integers" rather than "these students need help with math." Finally, the intervention is systematic. Resolving the problems of students who are struggling does not fall solely on the classroom teacher; instead, the school has a plan to provide students with additional time and support for learning beyond what occurs during the class period.
Mason Crest Elementary School in Fairfax County, Virginia, is a Title I school of 600 students who speak 37 different languages; about half of the students are in the English as a second language program. Teachers carefully analyze the results of their team-developed common assessments to determine the specific learning needs of each student. As reading teacher Jacquie Heller explains,
Just giving the assessment and doing nothing with it is a waste of everybody's time. And it's criminal to let good information that could help kids go to waste. Our gathering evidence of student learning tells us what we need to know about each child and about our own instruction. For example, we don't just give the Developmental Reading Assessment and find out we have a dozen kids reading on Level 16. We ask why they couldn't go up to an 18, and we look at their responses to see if they need to work on decoding, fluency, or comprehension to get there. Only then can we split the kids reading at a Level 16 into purposeful groups for targeted instruction.
In mathematics, Mason Crest teachers use Google Docs to monitor student work, reviewing results from exit tickets, classroom tasks, and common assessments. A team doesn't merely check whether a student's answer was correct; it examines the student's work to identify the nature of the error so it can provide the appropriate intervention.
The results at Mason Crest have been dramatic: In 2015, the school achieved 92 percent proficiency in English, 97 percent in history, 95 percent in mathematics, and 87 percent in science, far surpassing student proficiency targets in every subject area.
2. Which students are highly proficient and would benefit from extended or accelerated learning?
The team also looks to see whether there are students who are consistently demonstrating advanced proficiency and would benefit from working with their intellectual peers on learning tasks matched to their performance. A meta-analysis of the research on accelerated learning (as opposed to tracking) found that acceleration greatly improved student learning.
Schaumburg School District 54 in suburban Chicago is a majority-minority district that has established the PLC process in each of its 27 schools. Every school sets aside time for intervention and enrichment based on individual student needs identified through team-developed common formative assessments.
For example, Muir Elementary School administers common assessments every other week in language arts and mathematics to determine student needs. Collaborative teams use data from these assessments to identify both students who are not yet proficient and those who have mastered the material. In planning a unit, teams at Muir address not only how they will provide additional time and support for struggling students, but also how they will challenge the students who need extension activities.
The school schedule provides a 40-minute intervention/extension block for language arts and a 30-minute block for mathematics each day. Enrichment typically takes the form of inquiry-based, collaborative problem-solving and independent work. Three to five additional certified staff members, including English language learning and special education teachers, reading specialists, mathematics specialists, and an enrichment coach, flood into the grade level during the designated intervention/extension time to provide additional support for students and to keep groups small.
These efforts helped the school dramatically increase the percentage of students who are proficient in reading from 65 percent in fall 2014 to 83 percent in spring 2015. The increase in student proficiency in mathematics in that same time period was even greater, from 62 percent to 85 percent.
3. Did one or more colleagues have excellent results in an area where my students struggled? What can I learn from my colleagues to improve my individual practice?
The transparency of the evidence of student learning in high-performing collaborative teams allows members of the team to be open about their individual strengths and weaknesses. A teacher who is experiencing success teaching a particular skill can share lesson plans, have her colleagues observe her teaching a lesson on the skill, or videotape a lesson and analyze it with a colleague.
For example, the 2nd grade team at Mason Crest found that one teacher's students had dramatically outperformed her colleagues' students in solving word problems. The team discovered that the teacher had her students act out the problems in large and small groups. When her team members replicated her strategy and experienced similar gains in student understanding, other teams in the school incorporated the strategy as well.
4. Is there an area in which none of us achieved the results we expected? What do we need to learn as a team to teach this skill or concept more effectively?
There will be times when, despite their best efforts, everyone on the team struggles to help students achieve proficiency. When that occurs, the team must consider why students are having difficulty and where the team can turn to acquire new strategies for teaching the skill or concept in question. The team then implements these new strategies in the classroom, sets a short-term goal to improve student mastery of the skill, and analyzes results from a follow-up common assessment to determine whether the new strategies led to higher levels of student learning.
Sanger Unified School District in California provides an excellent example of a districtwide attempt to improve student achievement by helping teachers identify more effective instructional strategies through the PLC process. In 2004, Sanger, which serves an area with one of the lowest median incomes in the United States, was one of the first districts in California to go into state program improvement because of consistently low student achievement.
Sanger's educators responded by launching a comprehensive improvement effort. Staff members worked in collaborative teams to clarify the essential outcomes for each instructional unit, develop common formative assessments, and use the results from those assessments to intervene on behalf of struggling students and to improve their own individual and collective instructional practice.
According to Rich Smith, the assistant superintendent of Sanger when the improvement initiative was launched, teachers began to view common assessments as powerful tools for improving their instruction rather than as an intrusion on their teaching time. Staff members who had once frequently complained about administering too many assessments began to integrate common assessments into every unit and use the evidence of student learning to identify strengths and weaknesses in their instruction. If no one on the team was succeeding in getting students to the intended proficiency level, the team sought help from other teams or professional development support from the central office to identify more effective instructional strategies.
By 2012, Sanger schools had raised their scores on the California Academic Performance Index from 599 to 822, significantly above the state average of 788. The district's English language learners (ELLs) surpassed the state average by an even wider margin (772 to 716). The majority of the schools in the district, including those that had been assigned to program improvement, scored in the highest decile among similar schools throughout California. By 2013, the district's graduation rate of 96 percent was the highest in the area and 16 percentage points higher than the state average.
For another example of using data to learn from one another, consider Schaumburg School District. The district provides a data-management system that gives every school access to achievement data from other teachers, teams, and schools. A 5th grade team struggling to help ELLs demonstrate proficiency in mathematics can easily discover which 5th grade team in the district is achieving the best results for ELLs and make arrangements to learn from that high-performing team. The district also uses data to identify teachers and teams that are achieving at exceptionally high levels and invites these educators to create courses on their strategies, which are made available to educators throughout the district.
An Important Caveat
Unlike schools of the past, effective professional learning communities view data as a powerful tool for meeting the needs of individual students and for informing and improving the professional practice of the entire team. When schools use data in this way, they are certain to improve student learning.
However, one important caveat is in order about this use of data. Too often, I have seen collaborative teams engage in the right work … up to a point. They create a guaranteed and viable curriculum, develop common formative assessments, and ensure that students receive additional time and support for learning through the school's system of intervention. What they fail to do, however, is use the evidence of student learning to improve instruction. Instead, they are prone to attribute students' difficulties to the students themselves. If their analysis leads them to conclude that "students need to study harder," "students need to do a better job with their homework assignments," or "students need to learn how to seek help when they struggle to grasp a concept," they have the wrong focus.
Rather than listing what students need to do to correct the problem, educators need to address what they can do better collectively. To engage fully in the PLC process, a team must use evidence of student learning to inform and improve the professional practice of its members.
The Ultimate Question
Any school or district that wants to get the most benefit from data must ask questions such as these: Why are we gathering data in the first place? With whom are we sharing the data? What actions are we taking as a result of our analysis of the data? The ultimate question, however, remains this: Does our collection of data lead to higher levels of student learning? Unless you can answer that question with an emphatic yes, you're not doing data right.
Copyright © 2015 Richard DuFour