Why Evaluate

September 30, 2014

We recently created a short SlideShare presentation that describes why nonprofit organizations should evaluate their programs. Beyond simply reporting program outcomes to funders, it makes a case for using evaluation for program improvement. It’s short and to the point, and we hope you’ll take a look:


Let us know what you think!


Two Upcoming Workshops

August 28, 2014

The Support Center for Nonprofit Management will be hosting us on September 4th for two evaluation-related workshops. The morning session is all about survey design. In it, we’ll cover strategies for increasing response rate (always a challenge) as well as techniques for writing valid survey questions. The framework we’ll use is Don Dillman’s Total Design Method, which applies to both paper-and-pencil and web-based surveys. If you need to survey your participants, visitors, staff, donors, or your community, you won’t want to miss this three-hour introduction.

OK, you followed our suggestions in the survey workshop, and now you’ve got a pile of data to analyze. How do you turn that data into usable knowledge? Or perhaps you and your team just conducted twenty-five client interviews and have pages and pages of notes from each one. How do you summarize what you’ve learned? How do you identify common themes in what your respondents told you? Answering these questions is what our afternoon workshop is all about. We’ll show you how to analyze data in a way that enables you to report what you’ve learned to stakeholders and advance your organization’s goals.

To see the outlines for the two sessions, click here.
To download the demo data for the data analysis workshop, click here.
To register, click here.


The Value of Material Evidence in Program Evaluation

March 23, 2014

Qualitative evaluation aims at a deeper understanding of the behavior of program actors (both staff and participants) and the thinking behind their practices. Its focus is not only on WHETHER a program resulted in change but also on HOW and WHY that change came about, if it did. Sometimes, however, the conclusions drawn from qualitative approaches […]

Read the full article →

Five Strategies for Integrating Data Collection into Your Work

March 21, 2014

Capturing data for a program evaluation is often done outside of a program’s normal processes. Evaluators parachute in with the various data collection tools they need to gather information about your program, and in the process often interrupt normal workflows. If staff have to manage distributing and collecting various surveys and other forms, it can add significantly […]

Read the full article →

Open-Ended Questions in Surveys Part 2

March 6, 2014

In our last post, we talked about how to word open-ended questions. Why is this important? Because if done correctly, you can capture rich qualitative data from people using less expensive survey methods. We also talked about answer piping: taking the responses from one question and porting them into the text of another question. The […]

Read the full article →

Open-ended Questions on Surveys Part 1

February 13, 2014

The Problem: A key challenge to any study that relies on surveys is that they typically offer mostly closed-ended, forced-choice options to participants. Unlike qualitative approaches, surveys often frame questions in terms of the response categories conceived by the survey’s designers. Rather than asking “what did you experience, how did you feel?”, questions take […]

Read the full article →

A Sample Post-Training Survey

January 23, 2014

Several months ago we wrote about post-training surveys. You know, those surveys you get following a training session that ask about your experience. We’ve had several requests for more information on the topic and have decided to provide a sample survey you can download to use as a guide. The survey was created for a […]

Read the full article →

How to Prove that Your Program Works

September 5, 2013

You may not be able to. A rather extreme statement, particularly from an organization that does program evaluation. Nonetheless, we stand by it. It’s hard to get too far into a discussion of the notion of proof without talking at least a bit about Karl Popper’s philosophy of science and his key idea of falsifiability. […]

Read the full article →

Questioning the External Focus of Outcome Evaluation

July 27, 2013

We came across an opinion piece in the Chronicle of Philanthropy by Kelly Campbell and Matt Forti of the Bridgespan Group, in which they make a number of arguments for conducting rigorous outcome evaluations of nonprofit programs. While the piece concludes with a statement about the value of ongoing assessment in the service of continuous program […]

Read the full article →

An Alternative to the Evaluation RFP Process

July 6, 2013

This post is the last in our series about hiring an evaluation firm. Just to recap, we’ve suggested in our previous posts that while technical expertise is absolutely critical to a successful evaluation, soft skills, including flexibility and a strong client-services orientation, can make or break a project. (See our White Paper on Hiring an Evaluation Consultant […]

Read the full article →