Qualitative evaluation aims at a deeper understanding of the behavior of program actors (both staff and participants) and the thinking behind their practices. Its focus is not only WHETHER a program resulted in change but also HOW and WHY that change came about, if it did. Sometimes, however, the conclusions drawn from qualitative approaches are criticized as non-representative, overly subjective, and not replicable. The collection of material evidence of program activities (meeting agendas, attendance lists, published announcements and posters, existing status reports, working papers, staff assignments, job descriptions, staff resumes, planning documents, and other program records) mitigates this criticism by providing an objective record of what happened inside the organization on the way to meeting program goals. It’s as simple as asking respondents to SHOW the evaluation team implementation-related activities rather than asking them to TELL the team about them. Although practically very different, such methods are in many ways akin to social media metrics that trace consumers’ online clicks as behavioral evidence of interest in a website’s offerings.
In our last post, we talked about how to word open-ended questions. Why is this important? Because, if done correctly, you can capture rich qualitative data from people using less expensive survey methods. We also talked about answer piping: taking the responses from one question and porting them into the text of another question. The value of answer piping is that it allows each survey participant to answer questions that are personally relevant to them, which, crucially, engages them and inspires rich, reflective responses. It looks something like this:
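To make the mechanism concrete, here is a minimal sketch of answer piping in Python. The question wording and placeholder name are hypothetical, not taken from the actual survey tool described in the post; the point is simply that a respondent's earlier answer is substituted into the text of a later question.

```python
# Hypothetical illustration of answer piping: an earlier response is
# inserted into the wording of a follow-up question. Question text and
# the "q1" key are invented for this example.

def pipe_answer(template: str, answers: dict) -> str:
    """Fill {placeholder} slots in a question template with earlier answers."""
    return template.format(**answers)

answers = {"q1": "time management"}  # e.g., "What is your biggest challenge at work?"

q2 = "Earlier you said your biggest challenge is {q1}. What have you tried so far to address it?"
print(pipe_answer(q2, answers))
# -> Earlier you said your biggest challenge is time management. What have you tried so far to address it?
```

Because the follow-up question echoes the respondent's own words, it reads as personally relevant rather than generic, which is what drives the richer responses described above.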
Now imagine something even cooler.
We were recently asked to design a study that involved following a group of 90 professionals over time as they developed a new approach to working with their clients. As with any new program, we anticipated that they would, at least at first, experience challenges implementing the new approach. We were interested in learning whether and how, over the course of the program, they would deal with these challenges. With 90 participants, it wasn’t possible to conduct multiple interviews with everyone, yet we wanted to understand how each person addressed the issues he or she faced. We could have asked each person, at the end of the project, what challenges they anticipated at the start, but we were concerned that, over the 18 months the program was active, they would not accurately recall their initial concerns. How can you make this work in a survey?
A key challenge to any study that relies on surveys is that surveys typically offer mostly closed-ended, forced-choice options to participants. Unlike qualitative approaches, surveys often frame questions in terms of the response categories conceived by the survey’s designers. Rather than asking “What did you experience? How did you feel?”, questions take the form “Did you experience this or this or this? Did you feel this or this or this?” While a good survey can cover much of what respondents are likely to report, the richness of their personal voices is lost. Qualitative approaches, however, are very costly to implement, which typically limits the number of program participants who can be interviewed. This in turn jeopardizes the representativeness of the study.
To address these issues, we advocate an approach that integrates carefully worded open-ended questions into the surveys we conduct. In our experience, well-designed open-ended questions can garner the same kind of rich response typically found in interview-based studies. The key is to move beyond open-ended questions like this one:
“Is there anything else you would like to tell us about your experiences?”
Survey participants need more detailed instructions if they are to provide responses that begin to approach what you can get from an interview. They need to see exactly what you’re looking for. In one recent study, in place of the generic example above, we used a fairly simple prompt: “What about the leadership training session (the program staff, the space, the agenda, or something else) most supported your learning?” In another project, we provided even more instruction:
“How has your experience in the program so far made a difference in your work? What actions have you taken to implement what you have learned? What new insights have you gained? What practices have you developed that have been particularly relevant or meaningful to you?”
This prompt yielded responses averaging 138 words, more than the first paragraph of this post. More importantly, the responses were thoughtful and directly germane to the evaluation questions the client was seeking to answer.
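A figure like the 138-word average above is easy to compute once open-ended responses are exported from a survey tool. The sketch below is illustrative, not the authors' actual analysis code, and the sample responses are invented; it simply averages whitespace-separated word counts across responses.

```python
# Illustrative sketch: average word count of open-ended survey responses.
# The sample responses below are invented for this example.

def average_word_count(responses: list[str]) -> float:
    """Average number of whitespace-separated words per response."""
    if not responses:
        return 0.0
    return sum(len(r.split()) for r in responses) / len(responses)

sample = [
    "The program changed how I plan client meetings.",
    "I now debrief with colleagues after every session.",
]
print(average_word_count(sample))  # -> 8.0
```

Word count is of course only a rough proxy for richness; as the post notes, what matters most is whether responses are thoughtful and germane to the evaluation questions.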
Several months ago we wrote about post-training surveys. You know, those surveys you get following a training session that ask about your experience. We’ve had several requests for more information on the topic and have decided to provide a sample survey you can download to use as a guide. The survey was created for a client that received a grant to provide basic legal services to poor New Yorkers living with mental illness. Its underlying goal was to advocate for individuals facing administrative proceedings aimed at reducing or eliminating their public benefits. Without such benefits, the organization’s clients would have been at risk of losing their homes. One facet of the program was the provision of training for mental health advocates, designed to help them help those they worked with successfully navigate and prevail in the administrative hearings process.
There are two key issues in a post-training survey: how participants responded to the trainer, and what they took away from their experience. The former is pretty straightforward. If you download the sample survey, you’ll find items that are usable in several different types of training programs. Assessing the takeaways from training is a bit more challenging. In the sample survey we kept it simple, asking whether participants felt they would be able to help clients navigate the benefits hearing process. Of course, it’s possible they knew how to do this beforehand, but earlier in the survey we asked about the overall appropriateness of the training.