In Leapfrog we develop tools that enable new ways for people to engage with a range of issues, from the design of services to local development plans. Our hope is that our tools not only improve the quality of outcomes for the people who use them, but that in doing so they also encourage people to foster innovative and creative participation processes that can make a significant impact beyond the outcomes themselves. Therefore, as well as co-designing useful tools, we are exploring and researching the added value that co-design brings. In this sense, it is important to us to understand and evidence the wider impact of co-design.

Capturing that impact is a challenge because it is difficult to assess immediately. Although impact begins to accrue right from the outset of the process, some of the effects of co-design take time to become perceptible, and therefore to evidence. Another challenge is that impact can sometimes emerge on the periphery of a project rather than at its core, making it harder to identify and articulate. With this in mind, Leapfrog believes that identifying and evaluating impact is a powerful tool that, when embedded in the design process, can prompt, inform and boost collective change, as well as create accountability and feedback. To support our research in this area we have developed an evaluation framework built around three levels of impact: difference in outcomes, difference in the process, and learning for the Leapfrog project. Each level pays attention to the different factors at play at different stages of a project, in order to ensure a systematic gathering of the relevant data.

To help us apply this framework in our work, we have designed an evaluation postcard that gathers the thoughts and reflections of the people engaged in our workshops. The postcard is a folding card with three sections, each asking a question that corresponds to a level of the evaluation framework. A further section is detached and given to participants in recognition of their commitment, and as a symbolic exchange: in return for the information they provide us, participants are invited to follow our research or get involved at any time.

The first question focuses on the difference in the process, capturing the changes this approach produces and how engagement in a collective activity changes the way participants think. The results from an initial use of the tool in one of our main projects, Peer-To-Peer Engagement, show that a co-design workshop provides participants with a space that invites them to reflect collectively on their shared challenges and issues. Identifying these issues helps them (and also us) bring clarity to their everyday work and creates awareness. It can also change participants' perceptions of community engagement, often seen as a tedious task, and turn it into an enjoyable activity. This indirectly captures differences in behaviour and attitudes, an aspect of the second level of the evaluation framework. The second question focuses on the difference in outcomes and aims to identify surprising effects. Here, a positive, surprising outcome was that participants started valuing their local resources and seeing potential in them. The last question focuses on identifying emergent effects, such as enhanced networks; however, this question has not yet gathered substantial feedback. At the same time, all the questions inform the third level of the evaluation framework: an internal assessment through which Leapfrog learns as the research progresses.

The tool worked quite well and allowed us to collect the participants' impressions once the workshop ended. Participants were keen to answer the questions, which enabled us to gather key qualitative data and start tracking where and how impact happens. In addition, participants liked the idea of receiving a postcard as a reminder of their experience. However, there is still scope to improve the tool, and we are already working on a new version that captures responses to the third question with the same quality as the other two. Another observation is that when introducing the tool it is important to give participants time to reflect and respond adequately.