Jim Honan is a highly regarded Harvard expert in nonprofit finance and governance, and we at ISAS were fortunate to have him present last week on the topic of the value proposition. He gave a fast-moving talk, squeezing two or three days of material into an hour, to bring us up to speed on this important topic. His slides are below, after the jump, as are my notes on Jim's talk, with some links added.

What is the value proposition? How do you know you are providing it?

Jim points out that it is great to be discussing this topic with CEOs/Heads, because some say that CEOs/Heads are the living logo of the school's value proposition, and we should understand these issues in our bones and know how to articulate them.

Why now? There is growing interest in accountability among various internal and external constituents, and a heightened focus on cost, quality, efficiency, productivity, and outcomes assessment. Our "new normal" economy demands that we consider more carefully issues of financial sustainability. Independent schools, and schools generally, should not feel picked on or singled out on this issue: the trends here mirror developments in other sectors (higher ed, public K-12, libraries, international NGOs, arts and culture), all of which are grappling with fast-rising demands for value-proposition accountability.

How might you assess and measure your school's mission and impact? The traditional ways all come quickly to mind:

  • Accreditation processes and self-study
  • Post-graduation experiences of your students: can your students do what you proclaim to teach, and how would you know?
  • Internal surveys and studies, program review, alumni surveys
  • Reports to trustees
  • Publications/website

The above are the "usual suspects."

But these may not be enough in this day and age, and each of our institutions may need to take the next step in evaluating and demonstrating its value proposition. We don't, however, have to reinvent the wheel or start from scratch. Borrow, adopt, or modify one or more of the following popular methodologies:

  • Accountability mapping: who cares about your mission/impact/performance and what do they care about?
  • Ratings and ranking schemes
  • Accountability ranking
  • United Way performance measurement
  • Baldrige Award
  • ISO 9000
  • Balanced scorecard

These templates are available for schools to use as tools for measuring impact and assessing outcomes: the toolbox is full.

Before we look closely at three or four of these models, we have to ask the question: do leaders and their trustees really want to have this conversation? You'd be surprised (or maybe you wouldn't be) how many leaders, given the opportunity to speak frankly, say, "I don't really want to know." Indeed, you might find out things you really don't want to know. Don't start this process unless you are ready, willing, and able to grapple genuinely with the results.

Put your school at the center of a map: who cares whether your organization has a mission and an impact, and what do they care about?

Who cares? Students, alumni, teachers, community members, parents. What do they care about? Perhaps college placement, or athletic opportunities, or overall happiness, or loving school, or high test scores. For each group, you ask: what do they care about, and how do we know? What data do we have about how we are doing?

When you do this, you recognize, sometimes quite compellingly and clearly, that different constituents often want different data from an organization, and sometimes those demands even conflict.

From Mindtools.com, a more generic stakeholder list:

Your boss
Shareholders
Government
Senior executives
Alliance partners
Trades associations
Your coworkers
Suppliers
The press
Your team
Lenders
Interest groups
Customers
Analysts
The public
Prospective customers
Future recruits
The community
Your family

One fascinating exercise, Honan suggested, is to have your admin or leadership team meet, have everyone thoroughly prepare an individual accountability map, and then compare the results. You will sometimes find important conflicts requiring resolution within your organization; the sketch below shows the idea.
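
To make the exercise concrete, here is a minimal sketch of an accountability map written out as a simple data structure. Everything in it (the constituents, concerns, and data sources) is an invented placeholder, not a finding from any actual school:

```python
# A hypothetical accountability map: each constituent, what they care
# about, and what data (if any) we have. All entries are invented
# placeholders for illustration only.
accountability_map = {
    "parents":  {"cares_about": "college placement",        "data": "matriculation lists"},
    "students": {"cares_about": "loving school",            "data": "climate survey"},
    "teachers": {"cares_about": "time to teach deeply",     "data": None},  # a gap
    "trustees": {"cares_about": "financial sustainability", "data": "audited financials"},
}

# Printing the map surfaces the gaps; comparing two leaders' maps
# surfaces the conflicts Honan describes (e.g., one map says parents
# care most about placement, another says overall happiness).
for who, entry in accountability_map.items():
    flag = "" if entry["data"] else "  <-- no data!"
    print(f"{who}: cares about {entry['cares_about']} ({entry['data']}){flag}")
```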

Logic model

The Kellogg Foundation's core evaluation question: do we achieve our intended outcomes with our resources?

"A program logic model is a picture of how your program works – the theory and assumptions underlying the program. … This model provides a road map of your program, highlighting how it is expected to work, what activities need to come before others, and how desired outcomes are achieved." (W.K. Kellogg Foundation Evaluation Handbook, 1998, p. 35)

An advantage of the logic model is that it is highly pictorial and compels you to articulate your value succinctly and comprehensibly: on one page, a picture that gets you to spell out the theories and assumptions underlying your work. It also addresses the "why" questions.

——-

Resources/inputs → activities → outputs → outcomes → impact.
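
As a rough sketch of what that one-page picture might look like in outline form (the program and every entry below are invented placeholders, not from Jim's talk or the Kellogg handbook):

```python
# A hypothetical logic model written out as a simple data structure.
# The program (a 9th-grade writing initiative) and all entries are
# invented for illustration only.
logic_model = {
    "resources/inputs": ["2 writing teachers", "writing lab", "$40k budget"],
    "activities":       ["weekly essay workshops", "one-on-one conferences"],
    "outputs":          ["30 workshops per year", "4 drafts per student"],
    "outcomes":         ["measurable gains on a writing rubric"],
    "impact":           ["graduates who write confidently for life"],
}

# Read forward, the chain states the theory: IF we deploy these
# resources and run these activities, THEN these outputs, outcomes,
# and ultimately this impact should follow.
for stage, items in logic_model.items():
    print(f"{stage:>16}: {'; '.join(items)}")

# Honan's reverse exercise: start with the impact and walk backwards.
for stage in reversed(list(logic_model)):
    print(stage)
```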

One of the biggest issues in any of these models, and particularly the logic model, is that evaluating and assessing impact can be very hard: the timing becomes so important, and the timeline can be so extended (sometimes years, sometimes decades).

Data availability and quality go sharply down as the time horizon expands outward, resulting in a very fuzzy picture of impact. Organizations in this circumstance become attached to their cherished theory. You think the magic is happening, but you don't have the data to prove it. You hear all the time: "You can't measure this! No, no, no, you would just know it if you saw it."

The lag effect is so significant and so problematic: by the time you get the results, your ship has sailed.

One example: if we were to evaluate the success of our programs by the success of our graduates, we might find that the program they enjoyed in the 1980s was very successful. But does that mean we should not change learning now, when the world has changed so greatly? We are changing things now to adjust to changing times even though our historical data are positive.

Colleges, too, are struggling to define and measure discernible learning outcomes. So important! (See the growing controversies about the role of the Collegiate Learning Assessment (CLA) that have been playing out over the last nine months in the pages of the Chronicle of Higher Education.)

The logic model has great advantages in the questions it asks. Besides, even if you don't like it, many foundations these days insist upon it (it is all the rage in foundation work), so you might as well step up and begin trying to think this way. This is what funders use, for better or worse, to judge the impact of their actions.

  • Does your planned work link to your intended results?
  • Can you measure the magic?
  • Does the magic really happen?

You can make the arrows go backwards, too: start with the impact and see what you can think through moving backwards.

This relates to the larger global conversation about the social return on investment in social change: if you back out the random effects with a sophisticated regression model, what do you find about many a social intervention? That it does nothing. In job training research, for example, after you back out the random effects, the program turns out to have had little to do with participants getting a job.
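
A toy simulation makes the point concrete. This is a hypothetical sketch (simulated data, not the actual job-training studies Jim alluded to): a hidden factor, labeled "motivation" here, drives both enrollment in training and employment, so a naive comparison credits the program with an effect that collapses once the confounder is controlled for in the regression:

```python
# Hypothetical sketch: simulated data in which training does nothing,
# but motivated people both enroll and get jobs. Not real research data.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
motivation = rng.normal(size=n)                  # hidden confounder
trained = motivation + rng.normal(size=n) > 0    # motivated people enroll
got_job = motivation + rng.normal(size=n)        # motivation, not training,
                                                 # drives the outcome

# Naive comparison: the program appears to work.
naive_effect = got_job[trained].mean() - got_job[~trained].mean()

# Regression that controls for motivation: the "effect" collapses.
X = np.column_stack([np.ones(n), trained, motivation])
coef, *_ = np.linalg.lstsq(X, got_job, rcond=None)

print(f"naive effect:    {naive_effect:+.2f}")   # clearly positive
print(f"adjusted effect: {coef[1]:+.2f}")        # near zero
```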

Theory of action, theory of change. The logic model says: if we allocate these resources and engage in these activities, then we hope, and pray, that steps three, four, and five of the chain (outputs, outcomes, and impact) happen.

Theory of impact: walk me through the components of your logic model. You will find things in the logic model that don't work, that have no effect at all, or for which we have no data to show.

Dashboards  

Dashboards are a comprehensive way to measure the effectiveness of a strategic plan. Plug one in at the back end of a strategic plan and ask: are we doing what we are supposed to be doing?

Dashboards provide targets and tell the CEO whether we are missing them: are your strategies aligned with your results? The targets make this clear.

Dashboards often employ a traffic-light system: red, yellow, and green lights. The good news is that they are easy to use and understand; the bad news is that if you have yellows, you have to explain why you are having yellow lights. They don't get you to the "why." It is like the check-engine light in a car: I know it went on, but I don't know why or what I need to do about it.
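
As a minimal sketch of the mechanics, here is one way a traffic-light score might be computed; the 95% and 85% thresholds are invented for illustration and are not from the talk or from any particular NAIS tool:

```python
# A hypothetical traffic-light scorer for a dashboard metric.
# The thresholds are illustrative assumptions only.
def traffic_light(actual: float, target: float,
                  green_at: float = 0.95, yellow_at: float = 0.85) -> str:
    """Return 'green', 'yellow', or 'red' for an actual value vs. its target."""
    ratio = actual / target
    if ratio >= green_at:
        return "green"
    if ratio >= yellow_at:
        return "yellow"
    return "red"

# Enrollment at 92% of target shows yellow -- and, as the talk notes,
# the light alone never tells you WHY enrollment is short.
print(traffic_light(actual=460, target=500))   # -> "yellow"
```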

NAIS has a new dashboard indicator posted on its website; it is definitely worth checking out. From the NAIS website:

Schools spend a lot of time and energy ensuring that staff, board members, and others are kept up-to-date on the health of the school. Most of the key indicators of a school’s well-being are easy to identify, but can be hard to either track or summarize briefly. Add to this issue the many different types of individuals trying to ingest the same information and a school can spend multiple hours delivering the same data in a myriad of ways. The Trustee Dashboard is designed to pull all of the key areas together in one document, largely for trustee consumption. However, the resultant tool provides schools with the flexibility to use what they wish when they need it, with any number of stakeholders.

——

Jim’s concluding thoughts:

Challenges for leaders:

  • The hard part: when you start this conversation, there is often no shared view of what "impact" means.
  • Ask board members: if we were wildly successful, what would that look like? You will likely get a widely divergent set of answers.
  • Data is always imperfect!
  • Models and mapping inevitably reduce complexity, and this can be very problematic, particularly when much of our leadership and tough decision-making doesn't live in the glaring bright sunlight of clarity but in the nuances, shadows, and penumbras of complexity.
  • Mismatch of expectations and perspectives.
  • Start small: pick a manageable unit of analysis.
  • Resistance to change and fear of the unknown abound in these exercises: don't do this exercise just to demonstrate how well you are doing.

Opportunities:

  • People want this! Capitalize on the current urgency.
  • Lots of tools are available.
  • Can help you connect the dots: link planning and budgeting.
  • Clarify mission and strategies, including the difficult choices.
  • Respond to concerns of funders, boards, and media.
  • Position the organization for the future.
  • Horizon issues: more refinement is happening and coming all the time, and we can remedy some of the flaws with the oncoming tools.
  • Funding is becoming more outcomes-oriented.
  • Leaders are becoming more accountable for outcomes and impact.
  • What degree of change and transformation will be necessary?