
December 1, 2025 · Educational Leadership, Vol. 83, No. 4

Cutting Through Initiative Overload

When districts let the evidence speak, they’ll find that fewer initiatives deliver greater impact.

School & District Leadership
Illustration: a red hand and a blue hand with fingers looped through the handles of a pair of large, partially open scissors. Credit: Harry Campbell / theispot
In districts nationwide, school leaders face a tangle of duplicative programs, legacy initiatives, shiny new grant opportunities, and one-off projects. If a district administrator asked school staff what strategies the district is focused on, the response might be an exhausted, “What aren’t we focused on?” The problem isn’t too little effort, but rather too much effort on too many initiatives without justification through evidence. Initiative overload results in fragmentation, frustration, and diluted impact.
Data is the key that helps unlock coherence. When leaders embrace a mindset of continuous improvement, a disciplined approach to inquiry, and structured processes for reviewing evidence of impact, districts can focus on efforts that move goals farther, faster (Bryk et al., 2015).

Taking Inventory

Consider the example of Guilford County Schools (GCS), the third largest school district in North Carolina, serving nearly 67,000 preK–12 students at 120 schools. Like many districts, GCS used pandemic relief funds to support learning recovery. In the fall of 2024, with ESSER funding set to expire, the district faced urgent decisions about which initiatives to sustain.
First, the district needed to establish a common understanding of what initiatives were in place. According to the National Implementation Research Network (NIRN), based at the University of North Carolina at Chapel Hill, “Initiatives are priority efforts, strategies, or projects in which an agency is directly engaged to produce change that results in desired outcomes (e.g., improved outcomes for learners)” (NIRN & SISEP, n.d.-a). Guided by the NIRN’s Initiative Inventory process, cabinet leadership worked with department leads to gather details for each initiative, including a description, definition of success, target population, projected budget, and data sources (NIRN & SISEP, n.d.-b). This inventory surfaced more than 120 unique initiatives across the district’s strategic direction focus areas: accelerate learning; recruit, retain, and reward top talent; strengthen health, safety, and wellness in schools; and prepare students for the world.
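For readers who maintain this kind of inventory in a spreadsheet or database, the fields described above map naturally onto a simple record. The sketch below is purely illustrative: the field names and the sample entry are assumptions, not GCS’s actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical record mirroring the NIRN-style inventory fields named in the
# article; field names and the sample entry are illustrative, not GCS's schema.
@dataclass
class Initiative:
    name: str
    description: str
    definition_of_success: str
    target_population: str
    projected_budget: float          # annual cost in dollars
    data_sources: list[str] = field(default_factory=list)
    focus_area: str = ""             # e.g., "accelerate learning"

inventory = [
    Initiative(
        name="High-dosage tutoring",
        description="Small-group tutoring during the school day",
        definition_of_success="Growth on reading/math benchmarks",
        target_population="Elementary reading; middle school math",
        projected_budget=1_500_000,
        data_sources=["benchmark assessments", "attendance logs"],
        focus_area="accelerate learning",
    ),
]

# Group the inventory by strategic focus area to spot gaps and duplication.
by_focus: dict[str, list[str]] = {}
for item in inventory:
    by_focus.setdefault(item.focus_area, []).append(item.name)
```

Grouping entries by focus area is one quick way to see where effort is piling up before any scoring begins.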


Prioritization Rubric for Strategic Budgeting

To support a structured, equitable review of these initiatives before the ESSER funding cliff, the strategic planning team developed an Initiative Prioritization Rubric informed by the District Management Group’s article on strategic budgeting (Kim, Newell, & Choi, 2023).
The rubric (see fig. 1) includes five components: cost level, scale, type of evidence available, effectiveness of initiative, and alignment to a district focus area priority and/or commitment. Here’s how each component works in practice:
  • Cost level: The cost level should include direct costs, such as the actual materials to run the initiative, as well as indirect costs, such as operational support. Cost levels in the rubric can be determined based on local requirements for board approval.
  • Scale: An initiative may be designed for only certain staff members or student groups, or piloted at only a few campuses or grade levels; where these dimensions intersect, the effective scale narrows further. Scale is especially important to consider in the face of funding shortfalls. For example, guided by national research on academic recovery interventions (Carbonari et al., 2024), GCS scaled back high-dosage tutoring from serving all students to supporting those with the greatest need in reading at the elementary level and math at the middle school level, ensuring a more efficient use of limited resources. Considering which populations an initiative serves ensures that vulnerable groups are prioritized.
  • Evidence: Initiatives may have varying levels of evidence available for understanding impact. Some initiatives may be monitored only at the surface level, through high-stakes, end-of-year outcomes, while others have deeper process and causal analyses that illuminate impact. Consider how well the effect of the initiative is understood based on either internal or external evidence.
  • Effectiveness: Based on the evidence available, initiatives may show different levels of promise. Think of the “evidence” component of the rubric as what kind of data we have, while “effectiveness” reflects what the data is telling us. For example, in GCS, school attendance teams tracked evidence of outreach efforts through an outreach tracker app. As students were successfully contacted and returned to school, reductions in no-shows demonstrated effectiveness. District leaders cannot understand the effectiveness of an initiative without understanding the story its implementation data tells.
  • Alignment to priority or commitment: Initiatives that align to multiple priorities or commitments are more likely to gain traction than those that do not align with the strategic direction. In GCS, new initiatives are cross-walked with the strategic direction focus areas to ensure alignment.
It is not necessarily better for an initiative to score high in every category—some initiatives work best when specifically targeted rather than district-wide (e.g., summer learning and high-dosage tutoring). The goal of systematically reviewing initiatives using the prioritization rubric is to categorize initiatives to inform further strategic action. A clear picture of the evidence available (or not available) related to return on investment makes decision making in the face of initiative overload much more streamlined.
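The categorization step above can be imagined as a simple scoring pass over the five rubric components. The sketch below is a toy model, not the actual GCS rubric: the 1–3 scales, the early flag for weak effectiveness, and the decision thresholds are all hypothetical assumptions for illustration.

```python
# Toy scoring pass over the five rubric components described in the article.
# Scores run 1 (least favorable) to 3 (most favorable), so e.g. a LOWER cost
# earns a HIGHER cost_level score. Scales, weights, and cut points are
# hypothetical; the actual GCS rubric is not published in this article.
RUBRIC_COMPONENTS = ["cost_level", "scale", "evidence", "effectiveness", "alignment"]

def categorize(scores: dict[str, int]) -> str:
    """Map component scores to a strategic action bucket."""
    missing = [c for c in RUBRIC_COMPONENTS if c not in scores]
    if missing:
        raise ValueError(f"Missing rubric components: {missing}")
    # If solid evidence exists and it shows weak effectiveness, flag the
    # initiative for review regardless of its total score.
    if scores["effectiveness"] == 1 and scores["evidence"] >= 2:
        return "sunset or redesign"
    total = sum(scores.values())
    if total >= 12:
        return "sustain"
    if total >= 8:
        return "consolidate or scale back"
    return "sunset or redesign"

print(categorize({"cost_level": 2, "scale": 3, "evidence": 3,
                  "effectiveness": 3, "alignment": 3}))  # prints "sustain"
```

Even in this toy form, the design choice matters: effectiveness backed by evidence acts as a gate, so a well-funded, well-aligned initiative that demonstrably is not working still surfaces for review.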
In Guilford County Schools, what began as over 120 unique initiatives was pared down to 51 priority efforts after applying the rubric to each initiative. The initial inventory process took about six weeks and faced its share of resistance. Some staff feared what sunsetting initiatives meant for their roles. Having collaborative discussions about initiatives using the rubric gave internal staff, community constituents, and senior leadership a common language and objective criteria to inform action, rather than basing decisions on subjective emotional attachments. While there were some natural fears about taking a more critical, data-based lens toward initiatives, this marked the beginning of a culture shift—surfacing the need for better evidence that our programs are having the intended impact.
The Initiative Prioritization Rubric helped GCS save nearly $750,000 in the first year alone, as a result of the elimination or consolidation of low-impact initiatives. Some of the largest cost savings were realized by segmenting high-dosage tutoring, streamlining summer learning, and bringing coaching for the implementation of high-quality math instructional materials in-house. Other initiatives were consolidated to reduce duplication of effort.


Embracing Continuous Improvement

To get a better understanding of initiative impact, the district began building the capacity of staff to self-monitor progress of initiatives. All 51 initiatives are now supported by cross-functional initiative teams of school and district leaders that were trained to develop a theory of change, logic model, and progress monitoring plan. Progress monitoring has become like a GPS, where real-time data updates guide improvements during implementation, rather than after the fact. To support systemic sustainability, initiatives are now reviewed during strategic direction refresh cycles. Cabinet leadership also reviews new initiatives to prevent initiative creep.
Our initiative teams use After Action Reviews biannually to inform continuous improvement, grounded in four simple questions:
  • What happened?
  • What was supposed to happen?
  • What worked well and why?
  • What should be improved and how?
Protecting time for structured, data-based reflection is helping teams capture lessons learned and inform future activities. Additionally, initiative leads participate in monthly professional development to build implementation capacity, ensuring efforts are backed by ongoing support.
Embracing structured processes for regularly reviewing the impact of initiatives can prevent overload before it begins. By intentionally pausing to ask what’s working and why, school districts can trade overload for impact and restore clarity to their strategic direction. Less, when guided by evidence, truly becomes more.

Reflect & Discuss

  • If you asked five different staff members to name your district’s top three priorities, would they give the same answer? If not, what does that tell you about initiative clarity and coherence?

  • For each current initiative in your district, can you articulate what success looks like and point to specific evidence that it’s working? Which initiatives would you struggle to defend with data?

  • What would happen if you paused all new initiatives in your school or district for six months and focused solely on monitoring the impact of existing efforts? What might you learn about your capacity and the effectiveness of your current work?

 

References

Bryk, A. S., Gomez, L. M., Grunow, A., & LeMahieu, P. G. (2015). Learning to improve: How America’s schools can get better at getting better. Harvard Education Press.

Carbonari, M. V., DeArmond, M., Dewey, D., Dizon-Ross, E., Goldhaber, D., Kane, T. J., et al. (2024, July). Impacts of academic recovery interventions on student achievement in 2022–23. CALDER & AIR.

Kim, J. J-H., Newell, A., & Choi, K. (2023). Strategic budgeting: DMGroup’s approach to effective resource allocation. District Management Journal, 32, 14–29.

NIRN & SISEP. (n.d.-a). Initiative inventory process tool. University of North Carolina at Chapel Hill. https://implementation.fpg.unc.edu/wp-content/uploads/NIRN-Initiative-Inventory-Process-Tool_9-16-20-4.docx

NIRN & SISEP. (n.d.-b). The active implementation hub. University of North Carolina at Chapel Hill. https://implementation.fpg.unc.edu/

Erin Philip is a data strategist and alumna of the Harvard Strategic Data Project Fellowship.

From the issue: Educational Leadership, “The Power of Less in Schools.”