Nonprofits learning how to measure costs and benefits

As nonprofits face increased pressure from funders to show results, the latest buzzword is “return on investment.”

It’s more the language of a stockbroker than a human services manager. Return on investment refers to a hard-nosed economic analysis that shows politicians and grant reviewers that nonprofits can track their costs and benefits — and show in concrete ways that investments pay off for clients and the public.

It’s not a new concept, but one gaining currency among nonprofits. Consider that 200 public and nonprofit leaders showed up Tuesday for Wilder Foundation’s one-day conference titled “Return on Investment: The Dollars and Cents of a Nonprofit Program’s Worth.”

St. Paul-based Wilder is developing a niche to provide return-on-investment research to nonprofits.

Keynote speaker Michael Stegman, director of policy and housing for the John D. and Catherine T. MacArthur Foundation, said that MacArthur has begun “intense conversations” with Wilder to put Minnesota’s supportive housing programs to a rigorous cost-benefit test.

Paul Anton, Wilder Research’s chief economist, told MinnPost that return on investment is a powerful technique and he is encouraging Wilder to create a center dedicated to that work.

“After seeing what is going on at the national level, it has increased our confidence that such a center could be viable,” said Anton, who was ambiguous on when the center might take shape. “We are somewhere between a gaseous cloud and a planet.”

Gaining momentum
Art Rolnick, Minneapolis Federal Reserve Bank senior vice president and research director, got policy makers’ attention in 2003 when he wrote about early childhood education’s “incredible return on investment.” (PDF)

Others are replicating that framework and language.

Joellen Gonder-Specek, executive director of the Minnesota Mentoring Partnership (MMP), said research has shown for a long time that mentoring improves kids’ lives. What the field lacked was a way to put that benefit in economic terms, she said.

“I was aware of the work that Art Rolnick had done,” she said, speaking at the conference. “I was envious of the fact that the folks in the early childhood area could take that information and make a compelling case.”

In March 2007, MMP and the Minnesota Youth Intervention Programs Association released a report on the social return on investment for youth programs. The study, completed by Anton and the University of Minnesota’s Judy Temple, concluded that quality mentoring returned $2.72 for every $1 invested, and quality youth intervention programs returned $4.89 for every $1.

Some conference attendees were just getting introduced to the concept, a few with a nudge from funders. Kurt Wiger, Courage Center’s coordinator of volunteers and interns, said one of the center’s corporate backers paid the conference fees for him and others to learn more about return on investment.

Cathy Hartle, the Initiative Foundation’s senior program manager for organizational effectiveness, said that with Wilder’s help, the central Minnesota foundation planned to add a return-on-investment review to one of its funding areas — either economic development or healthy community partnerships.

Lessons from Washington state
One big challenge to cost-benefit analysis is putting a dollar value on benefits. Prevention programs could reduce future criminal justice or welfare costs. But it takes expertise to evaluate the research on effective programs, to quantify that benefit and then give it a cash value.

Many nonprofits don’t have the staff, money or expertise to do that kind of work. Yet as larger institutions pursue it, nonprofits and policymakers can crib.

One leader is the Washington State Institute for Public Policy (WSIPP), a nonpartisan, independent agency that works for Washington state government. (For the true number geeks, here is WSIPP’s 120-page technical analysis [PDF] for costs and benefits of youth prevention programs.)

WSIPP’s associate director Steve Aos, who spoke at Tuesday’s conference, said his state has made the most progress in criminal justice. The state scrapped plans to build a new prison because research showed it could get a prison’s worth of crime reduction from less costly prevention programs.

Aos said state governments need something like a Consumer Reports agency to sort through the research because there are so many poorly done studies. Otherwise, organizational claims about having evidence-based programs will “just become a code word for ‘Give me more money.’”

Conference attendee Angela Eilers is studying WSIPP and hopes to promote something similar here. Eilers, the research and policy director for the nonprofit Growth & Justice, recently was accepted to the Bush Leadership Fellows program to pursue the research.

Wilder Research plans to post conference material online soon.

