Coro New York is a leading provider of leadership development training through its fellows program and through its work with emerging leaders in community development and education. Following a strategic planning process in 2011, the organization set community transformation as one of its desired long-term impacts. In 2013, Coro asked Usable Knowledge to conduct an evaluation of its Neighborhood Leadership Program (NLP), designed to improve the leadership capacity of local business improvement districts (BIDs) in the City of New York and to help BID staff manage their relationships with the city. NLP consisted of a series of trainings held over a five- to six-month period. Half of the trainings directly addressed leadership skills; the other half provided an opportunity for participants to interact with and learn from city officials.
The evaluation had a number of parts. First, following each session, we conducted a survey that asked both about the usefulness of each component of the training and, through a series of open-ended questions, about key takeaways. At the conclusion of the program we conducted a series of focus groups with participants in order to understand how they were beginning to implement what they had learned in their work. A final series of interviews was conducted six months after the program ended. These took the form of site visits, where we had the chance to meet with participants in their offices. Here we heard about the action steps they had taken toward improving their community's infrastructure. Given the complexity of many of these projects, we did not expect to see new streetscapes in place or totally revamped retail corridors. But we did see blueprints for such changes, hear about significant reductions in retail vacancy rates, and come to understand how participants were using the training they received to collaborate and partner with other community groups. Participants also told us that they continued to draw upon the expertise of their cohort, the wider community of Coro alumni, and the other experts and officials they had met during the training. We were also able to see that they kept the training materials close at hand; in one instance, a participant even showed us a deck of cards with leadership tips that he had been given and still kept handy. (For more on material evidence in evaluation, see this post.)
The key in all of this is that, beyond simply asking about impacts, this kind of interview approach allows the evaluation team to see impacts in action. Done correctly, it makes it possible to tie those impacts to an intervention with much greater certainty than a quantitative study can offer, subject as it is to all the limitations mentioned in our earlier posts. Even when resources are available for a quantitative study, the approach we advocate can help explain, and most likely cement, what the quantitative data show. For this reason, we believe it to be the best-practice approach.