During the 1990s, school improvement plans exploded at state and district levels. Reacting to the mantra "data-driven decision making," school districts felt compelled to interpret results from an increasing number of assessments and to use that information effectively to improve student performance.
Our Midwestern district began its struggle with data in 1993, when more than 50 schools began their first cycle of Quality Performance Accreditation, our state's school improvement process. At the same time, we began a major curriculum-revision process to align our written, taught, and tested curriculum (English, 1988). As the alignment process progressed on the district level, we realized that our skills in data analysis needed to improve. At the building level, some leaders had used data effectively, but others had not. Although all our schools received accreditation, we knew that we could fine-tune our school improvement process to be more productive.
Beginning in fall 1997, we initiated changes that enabled all stakeholders, especially classroom teachers, to be knowledgeable users of results from both standardized, high-stakes tests and day-to-day classroom assessments. We implemented two major activities to assist teachers and principals: data notebooks for each school and a data-mentor program to develop data-analysis skills in school personnel.
Data Notebooks
The availability of data was not an issue for building leaders. However, keeping track of data sets and organizing them in a meaningful manner were challenges. A small group of principals and personnel from the Curriculum and Instruction Department came up with a format for controlling and using the collected information: data notebooks.
During the first year of use, department members organized, assembled, and distributed the notebooks. Each school received a notebook of results for its students that included performance comparisons with local, state, and national results. Principals were no longer responsible for collecting and organizing pertinent data; instead, building leaders were able to analyze, interpret, and act on existing data.
At the start of subsequent school years, schools received notebooks with dividers, and the analyzed results for each assessment were sent to the schools and placed in the binders. Graphs and charts that could be reproduced for faculty and site-council meetings were included. Because the notebooks and the data analysis follow the same organizational format from year to year, principals can easily find the data they need and now spend their time analyzing the data rather than assembling them.
Data Mentors
Data notebooks proved valuable from the first year of their distribution. However, because classroom teachers—those individuals central to the change process—had only limited opportunity to work with the data, the information in the notebooks went largely underused. To address this situation, the district implemented a data-mentor program in fall 1997. The purpose of the program was to thoroughly familiarize a few building-level personnel with the assessment process and analysis so that they could help other faculty members interpret and use the data.
District administrators initially heard about the benefits of the data-mentor program at an informational meeting. The potential for timely help with data interpretation, greater involvement of building faculty, and opportunities for networking across the district appealed to the principals. The overarching goal was to develop building-level capacity related to effective use of data. Specific objectives addressed collecting, organizing, analyzing, and interpreting data; using technology to represent data; telling a building's story with data; developing school improvement plans on the basis of data; and creating and using alternative assessments.
A building administrator and classroom teachers—two at the elementary level and three at the middle and high school levels—attended monthly two-hour sessions. Participants learned about analyzing, interpreting, and acting on data and about strategies and supportive research.
A typical meeting began with a discussion of vocabulary or technical issues, such as percentile versus percent or raw score versus standard score. The group might then discuss and analyze a generic data set, such as scores from the Iowa Test of Basic Skills, state assessments, performance assessments, or district surveys. Participants brought to each meeting their own building data relevant to the topic of the day. After discussing generic data, the administrator and teachers from each building worked together to apply the same analysis techniques to their own information.
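These distinctions matter because the numbers behave differently. A minimal sketch in Python makes them concrete; the class scores here are invented for illustration and are not drawn from any district data:

```python
from statistics import mean, pstdev

# Invented raw scores for one class, for illustration only.
raw_scores = [12, 15, 18, 21, 21, 24, 27, 30, 33, 36]
items_on_test = 40

def percent_correct(raw, total_items):
    """Percent: the share of test items answered correctly."""
    return 100 * raw / total_items

def percentile_rank(raw, all_scores):
    """Percentile: one common definition, the share of scores
    in the group that fall below this one."""
    below = sum(1 for s in all_scores if s < raw)
    return 100 * below / len(all_scores)

def standard_score(raw, all_scores):
    """Standard (z) score: distance from the group mean,
    in standard deviations."""
    return (raw - mean(all_scores)) / pstdev(all_scores)

raw = 27
print(f"raw score:      {raw}")                                      # 27
print(f"percent:        {percent_correct(raw, items_on_test):.1f}")  # 67.5
print(f"percentile:     {percentile_rank(raw, raw_scores):.0f}")     # 60
print(f"standard score: {standard_score(raw, raw_scores):+.2f}")     # +0.45
```

The same raw score of 27 reads as 67.5 percent correct but only the 60th percentile within this group; conflating the two was exactly the kind of confusion the vocabulary discussions were meant to clear up.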
Examples of Activities
Each session included two or three exercises to ensure the active involvement of teachers and principals.
Cohort plus/minus chart. One activity charted the progress of cohort groups on the Iowa Test of Basic Skills. Each elementary data notebook contains graphs comparing cohort performance from 3rd through 6th grade. On the basis of those graphs, staff members from each building used a plus, a minus, or an equal sign to compare performance at the two grade levels. A plus sign meant that performance improved over time; an equal sign meant that the performance stayed the same; and a minus sign meant that performance went down.
Figure 1 shows the results of a typical analysis. Teachers highlighted all the minus signs to make negative trends readily apparent. This approach enabled data mentors to easily identify undesirable performance patterns. For the school whose data are shown, capitalization, punctuation, and maps/diagrams needed attention. Participants also noticed a negative trend in the 1994 to 1997 column and discussed possible causes.
Figure 1. Comparing Cohort Progress, 3rd-6th Grade
| Subtest | 3rd–6th 1991–1994 | 3rd–6th 1992–1995 | 3rd–6th 1993–1996 | 3rd–6th 1994–1997 | 3rd–6th 1995–1998 |
| --- | --- | --- | --- | --- | --- |
| Vocabulary | + | + | + | - | + |
| Reading Comprehension | + | + | + | - | + |
| Spelling | + | + | + | + | - |
| Capitalization | = | - | - | - | - |
| Punctuation | - | - | - | - | - |
| Usage | + | - | + | - | + |
| Math Concepts | + | + | + | - | + |
| Problem Solving | + | + | + | - | + |
| Maps/Diagrams | + | - | - | - | - |
| Reference Materials | + | + | + | - | + |
The plus/minus chart allows teachers to easily track students' progress on the subtests of the Iowa Test of Basic Skills. Plus signs indicate improvement; minus signs indicate a lack of progress.
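The comparison behind the chart is mechanical, which is part of its appeal. A minimal sketch in Python, assuming each cohort's subtest performance is summarized as a single number (such as a mean standard score) at 3rd grade and again at 6th grade; the scores and the 0.5-point tolerance for "no change" are invented for illustration:

```python
# Invented cohort summaries: (3rd grade score, 6th grade score) per subtest.
cohort_scores = {
    "Vocabulary":     (52.0, 56.5),
    "Capitalization": (54.0, 49.5),
    "Punctuation":    (51.0, 51.0),
}

def progress_sign(third, sixth, tolerance=0.5):
    """Return '+', '=', or '-' for a cohort's change from 3rd to 6th grade.

    The tolerance for calling a change "no change" is an assumption here;
    each building would set its own threshold.
    """
    change = sixth - third
    if change > tolerance:
        return "+"
    if change < -tolerance:
        return "-"
    return "="

for subtest, (third, sixth) in cohort_scores.items():
    print(f"{subtest:15s} {progress_sign(third, sixth)}")
```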
The mentor meetings demonstrated that it was much more powerful for teachers to identify weaknesses in their building's instructional program than for outsiders from the district level to point out the same problems. Teachers came to understand that a minus sign in a particular cell did not mean that students had not learned the content, but that they had not progressed as desired.
Tracking classroom results. Analyzing data from once-a-year, high-stakes assessments was an important component of the data-mentor program, but it wasn't sufficient by itself. Teachers and principals needed to take a "short view" (Schmoker, 1996) and to track ongoing results in their classrooms and building. They needed the encouragement of seeing immediate progress, even in small increments. Data mentors learned how to use box plots to report data collected over time, such as pre- and post-tests or parallel practice activities (fig. 2).
Figure 2. Results from a 5th Grade Reading Test
By simply averaging scores from pre- and post-test results, the teacher would learn only that performance for her class as a whole had improved. Because the box plot clearly delineates the low and high scores, the median, and the first and third quartiles, the teacher could see that performance had improved at the low-, medium-, and high-performance levels. The low score had moved up almost 20 percentage points; the first quartile on the post-test was roughly equivalent to the third quartile on the pre-test. The teacher had visual verification that she was reaching all her students.
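A box plot is simply a picture of the five-number summary, which is easy to compute directly. A minimal sketch in Python, with invented pre- and post-test scores that loosely mirror the pattern described above:

```python
from statistics import quantiles

# Invented percent scores for a class of 20 students, illustration only.
pre  = [35, 40, 42, 45, 48, 50, 52, 55, 55, 58,
        60, 62, 63, 65, 68, 70, 72, 75, 78, 80]
post = [55, 60, 64, 66, 68, 70, 71, 73, 74, 76,
        77, 79, 80, 82, 84, 85, 87, 89, 91, 94]

def five_number_summary(scores):
    """Return the values a box plot displays: min, Q1, median, Q3, max."""
    q1, med, q3 = quantiles(scores, n=4)
    return min(scores), q1, med, q3, max(scores)

for label, scores in (("pre", pre), ("post", post)):
    lo, q1, med, q3, hi = five_number_summary(scores)
    print(f"{label:4s} min={lo} Q1={q1:.1f} median={med:.1f} "
          f"Q3={q3:.1f} max={hi}")
```

In this invented data set, the post-test minimum sits 20 points above the pre-test minimum, and the post-test first quartile roughly matches the pre-test third quartile: gains at every level, not just on average.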
School profiles. A major, unifying goal for the year was preparing for visits by accreditation teams. Each building prepared a school profile that included performance results and a school improvement plan that addressed the deficiencies identified in the profile. Most principals and teachers were proficient with word processing software, but found incorporating the necessary charts and graphs into the school profile more difficult. Groups of 20 to 24 data mentors met in a computer training lab to learn how to graph data effectively.
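The graphing step itself is the kind of skill those lab sessions built. A minimal sketch using Python's matplotlib, one of many charting tools; the scores and labels are invented for illustration:

```python
import matplotlib.pyplot as plt

# Invented profile data: a building's reading scores against district
# and state averages, for illustration only.
years    = ["1995", "1996", "1997", "1998"]
building = [58, 61, 63, 67]
district = [60, 61, 62, 64]
state    = [57, 58, 59, 60]

plt.plot(years, building, marker="o", label="Building")
plt.plot(years, district, marker="s", label="District")
plt.plot(years, state,    marker="^", label="State")
plt.ylabel("Mean reading score (percent correct)")
plt.title("Reading Performance, 1995 to 1998")
plt.legend()
plt.savefig("reading_profile.png")  # the saved image can go into the profile
```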
Program Impact
We evaluated the sessions with a standard form used for all district staff development activities, but the most significant feedback came from the participants. Each session ended at 5:30 or 6:00 in the evening, after teachers and principals had put in a long day. After the scheduled activities, most building groups stayed on to debrief. The topic of discussion was always the same: How do we go back and present this information to the entire faculty? Many teams planned their faculty presentation before they left the meeting room.
At the end of the first year, a 5th grade teacher remarked, "This is the most valuable staff development I've had since I finished my master's." Along with her team members, she identified a long list of benefits. As individuals, data mentors learned how to target students at both ends of the achievement spectrum, as well as students in the middle. By using data better, they became more adept at devising strategies to reach at-risk students. At the building level, they sensed that faculty members felt a greater ownership of the school improvement process because they better understood how data could pinpoint specific needs. Staff members found the process of developing a school improvement plan much easier than their first experience four years earlier because they had a solid foundation in analyzing the facts and figures available to them.
Reflections from secondary teachers were similar. An unexpected bonus was increased communication across departments. A language arts teacher and a mathematics teacher provided insight about improved communication and the overall value of their efforts: "We saw a total picture, versus just our own department. . . . We increased communication across the curriculum. We got the staff involved with the whole picture so they could see the difference we all can make in our own classrooms. We've communicated with parents and the community about expectations. We're all beginning to believe."
The Program Continues
Data-mentor sessions continued in 1998–99, but for four meetings instead of eight. These meetings emphasized results-based staff development. How could building personnel develop baseline data and then demonstrate growth? How could the impact of specific staff development activities on the performance of students be documented? Much discussion centered on developing and effectively using rubrics for both teachers and students.
In the 1999–2000 school year, the focus shifted from data from large-scale testing to data from assessment activities created by individual teachers. Research indicates that schools with the biggest gains in student performance include high-quality classroom assessments in their school improvement plan (Black & Wiliam, 1998). More than 100 teachers and administrators are meeting in study groups and using the same resources as they focus on developing student-centered assessment practices that serve both instructional and accountability purposes (Stiggins, 1996). Teachers are improving old techniques, such as writing better multiple-choice tests, and trying new strategies in their classrooms. Teachers at all levels are developing performance assessments and scoring rubrics and are finding the process both difficult and rewarding. One teacher commented, "This is the hardest thing I've ever done." Another said, "I'm going to do a lot of this. The kids love it and it's good for them!"
As a result of the study groups, students are more involved in the assessment process. They create their own test questions (sometimes for review, sometimes for the actual tests), collaborate with their teachers in developing scoring guidelines, and evaluate their own work by using those guidelines. Some students have led their own parent-teacher-student conferences, proudly reporting both the areas that show progress and the areas that need improvement.
Our district places a high value on achievement. The challenge is to identify how staff members can help students continue to improve their performance. The data-mentor program met that challenge. During 1998–99, 6th grade scores on the norm-referenced test used by the district improved on every subtest. Third grade scores improved in all areas but two. Scores at the elementary level on state assessments also increased on all tests, rising dramatically in mathematics for all groups—minorities, students with disabilities, and students of lower socioeconomic status. Middle school scores remain at a consistently high level. Scores on college entrance exams for 1999 graduates dipped slightly from the all-time high scores registered in 1998 but still reflected sizeable growth over five years.
We believe that more effective use of data has contributed to increased performance by our students. We look forward to ongoing improvement as classroom teachers hone their individual assessment skills and students become more involved in evaluating their own performance.