A key challenge for any study that relies on surveys is that they typically offer participants mostly closed-ended, forced-choice options. Unlike qualitative approaches, surveys often frame questions in terms of the response categories conceived by the survey’s designers. Rather than asking “What did you experience? How did you feel?”, questions take the form “Did you experience this, this, or this? Did you feel this, this, or this?” While a good survey can cover much of what respondents are likely to report, the richness of their personal voices is lost. Qualitative approaches, however, are very costly to implement, which typically limits the number of program participants who can be interviewed. This in turn jeopardizes the representativeness of the study.
To address these issues, we advocate an approach that integrates carefully worded open-ended questions into the surveys we conduct. In our experience, well-designed open-ended questions can garner the same kind of rich response typically found in interview-based studies. The key is to move beyond open-ended questions like this one:
“Is there anything else you would like to tell us about your experiences?”
Survey participants need more detailed instructions if they are to provide responses that begin to approach what you can get from an interview. They need to see exactly what you’re looking for. In one recent study, we used a fairly simple prompt in place of the generic example above: “What about the leadership training session (the program staff, the space, the agenda, or something else) most supported your learning?” In another project, we provided even more instruction:
“How has your experience in the program so far made a difference in your work? What actions have you taken to implement what you have learned? What new insights have you gained? What practices have you developed that have been particularly relevant or meaningful to you?”
This prompt yielded responses averaging 138 words, which is more than the first paragraph of this post. More importantly, the responses were thoughtful and directly germane to the evaluation questions the client was seeking to answer.
One technique that takes advantage of web survey technology pipes responses to closed-ended questions into follow-up open-ended ones. Imagine a leadership program in which participants are asked to develop a change project of some kind to demonstrate that they have learned a new leadership skill. A follow-up evaluation question might read: “Which of the following was your greatest challenge in implementing your change project?” For the sake of brevity, let’s assume there are four possible answer choices: A, B, C, and D. Of course, such a question ought itself to be set up as open-ended, but we’ll deal with that in a moment. The respondent chooses C as her answer and continues with the survey, except that the next question to appear is an open-ended one that reads:
“Earlier you indicated that C represented a big challenge for you. Could you tell us a little bit about what made C a challenge and how you dealt with it?”
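In most survey tools this substitution is configured in the question editor rather than written by hand, but the underlying logic amounts to simple string templating. A minimal sketch in Python, using the follow-up wording from the example above (the function and variable names here are illustrative, not part of any survey tool’s API):

```python
# Illustrative sketch of answer piping (text substitution):
# the respondent's earlier closed-ended answer is inserted into
# the text of the follow-up open-ended question.

FOLLOWUP_TEMPLATE = (
    "Earlier you indicated that {answer} represented a big challenge for you. "
    "Could you tell us a little bit about what made {answer} a challenge "
    "and how you dealt with it?"
)

def pipe_answer(template: str, answer: str) -> str:
    """Substitute the earlier closed-ended answer into the follow-up prompt."""
    return template.format(answer=answer)

# The respondent chose option C on the closed-ended question:
print(pipe_answer(FOLLOWUP_TEMPLATE, "C"))
```

In a real survey platform the “template” is the question text you type into the editor, with a piping placeholder where `{answer}` appears here.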
Depending on the survey tool you use, this feature may be called answer piping, text substitution, or something else. We have found that such an approach creates an interview-like experience for participants, who respond in the kind of detail one usually finds only in a qualitative study. We can do something even cooler in a survey that uses a pre-post design. Stay tuned for a post on how that works.