Making Your Mediocre Strategy Matter: Strategic Implementation

I’d much rather you implement a mediocre strategy well than a “perfect” strategy half-heartedly. 

This may seem like heresy in the modern era of research-based interventions and meta-analyses that report effect sizes, but I’ve been saying this to district leaders for years, and I still believe it. Recently, while facilitating a district’s back-to-school administrative retreat this August, I was asked to say more about the idea. Since it is early September, and district and school leaders are busy launching improvement efforts, I thought I’d share my thinking with a broader audience.

At the Center, we work shoulder-to-shoulder with educators across the state, and occasionally across the nation, on issues of large-scale improvement. As we listen to professional educators, one of the most common sentiments we hear is “we are better at adopting new strategies than implementing them well.” For some, “well” means implementing with fidelity: with consistency and alignment to the elements of the strategy as originally designed. For others, “well” means seeing a strategy through long enough, and well enough, before changing course to a new idea.

This often-heard sentiment about implementing well seems consistent with school reform research, which, in broad summary, suggests:

  • we regularly (and unknowingly) implement new improvement strategies in ways inconsistent with the original design
  • we implement strategies in ways that produce changes, but often not in the classroom, where they would matter most
  • we sometimes interpret a lack of measurable results as a failure of the idea and then walk away from sound improvement approaches
  • we often lack the ability to accurately monitor progress in the early stages of implementation, especially before the ultimate results begin to show up in lagging indicators
  • we pay too little attention to how research-based interventions would need to be adjusted to local context
  • we unknowingly make numerous decisions and choices that reduce the overall power of the strategy

I’m sure there are numerous others I could list here, but you get the idea.

Now, back to the idea of a mediocre strategy.  Of course, all things being equal, I’d rather districts select smart strategies with a very strong evidence base, large effect sizes, and ample research on different educational contexts.  However, all things are often not equal, and as such, I’d rather bet on implementation.

A school district that can implement well can also learn from its implementation, identifying and overcoming obstacles, tailoring to its unique context, and regularly identifying its next level of work.  Such a district can grow its capacity for future efforts, even if a particular strategy didn’t generate optimal results.

The Center has designed a Strategic Implementation self-assessment, a tool that you and your colleagues can use to discuss your district’s capacity to move from strategy to results. This tool was first crafted for use with our Systemic Instructional Improvement Program Network (SIIP). Since then, we’ve updated the tool based upon new insights. We recommend that each team member complete the inventory individually. Then quickly capture all the scores before anyone explains their ratings, reducing the risk that team members will change their scores based upon the first few answers they hear. We’ve included a score-capture template if it is useful; it can easily be printed poster-sized at a copy shop, providing a visual reference during the debrief.

If there are elements you believe we’ve missed in our inventory, please let me know.

I hope you are all off to a great year.  Please follow me on Twitter at @richard_lemons.


Richard W. Lemons, EdD
Executive Director, CT Center for School Change
rlemons@ctschoolchange.org
Twitter: @Richard_Lemons

September 10, 2019