Gdynia Design Days

As part of a collaboration between ImaginationLancaster and the World Design Foundation, Leon Cruickshank and David Perez facilitated a creative evaluation workshop with 15 participants at Gdynia Design Days (GDD) in Gdynia, Poland. During the workshop they tested a framework developed at the last Milan Design Week (MDW), which supports the creation of creative methods for collecting data at large events such as design festivals.

The framework has three categories:

  • Evaluation aims: Starting with the aim behind the evaluation (e.g. evaluating the number of participants, experiences, change or impact)
  • Evaluation strategy: Defining a basic strategy for collecting data (e.g. collecting data before and after the event, tracking journeys, assigning roles, planning an itinerary)
  • Types of action: Selecting a mechanism for collecting data (e.g. sending messages, collecting things, making marks or asking participants to identify themselves)

By the end of the workshop, participants had created four creative tools for evaluating design initiatives such as festivals and workshops. These included using star stickers to capture young people's experience of festivals, and nominating ambassadors from among festival visitors and giving them a diary in which to record their experience of events.

Participants commented that the workshop helped them think about evaluation in a different way. The creative but structured approach of the session helped them reflect on the feedback gathered through their evaluation practices and act on it.

The outcomes of these workshops will be shared during the next Dutch Design Week in Eindhoven, the Netherlands.

Rethinking R&D with the V&A Learning and National Programmes

Leapfrog’s collaboration with V&A Learning and National Programmes continued with a third workshop in 2019, shifting focus from evaluation approaches to complete cycles of R&D within the team. We found that evaluating events can lead to one-off questions that do little more than give you a thumbs up or down on the venue and timings. Connecting the data and insights from evaluative work is the key to an iterative R&D approach: finding ways to build up understanding of an audience and of the impact the team’s work has on them.

Together we tested a new co-designed tool that enables teams to map, compare and iterate the cycles of R&D they conduct. A cycle begins by defining aims, then moves to planning data collection and analysis, before concluding by considering how the insights drawn from the cycle will be implemented and shared within and beyond an organisation. The cycle structure is complemented by the ‘skeleton’ tools, which are helpful starting points for planning discussions and reflections at each stage of the cycle.

The result of Leapfrog’s work with V&A Learning and National Programmes is not only the tools we’ve co-designed together, but an idea for how any team can transform their approach to understanding their audience. In a year of working with Leapfrog, the Learning and National Programmes team was able to better integrate R&D cycles into their work, ensuring that they are audience-led and continually iterating their offer, designing and developing programmes that are exciting, relevant, rooted in the V&A collections and backed up by robust data.

Exploring Evaluation Pathways with V&A Learning and National Programmes

In the second workshop of our collaboration with the V&A Learning and National Programmes team, we looked in detail at the real-world evaluation processes and pathways used by the team. These were visualised collaboratively and augmented with the ‘skeleton’ tools we had previously co-designed together, creating a rich picture of how the team currently captures the impact of its work, and how this could be enhanced further in the future through R&D.

Building on our previous collaborative workshop, we identified four basic strategies for capturing impact through evaluation:

  • Baseline & follow-​up – Multiple measurements showing change.
  • Self-​reflection – Asking someone to think back on what has changed.
  • Signals of change – A unique piece of data proving a change.
  • Expert observation – An expert observing and documenting change.

The ‘skeleton’ tools we’ve co-designed together reflect our joint finding that good R&D requires more than effective methods and processes: it needs teams to creatively engage with the opportunities that research brings and, together, develop new ways of working. By creating skeleton tools that are not complete solutions, we can help teams to imagine, plan and share new practice, collaborating to find the best way of working for particular projects and building up shared knowledge of how best to approach new challenges in a creative, data-driven and iterative way.

Evaluation & Impact Co-​Design with V&A Learning and National Programmes

Beginning in February 2019, Leapfrog researchers worked with the V&A Learning and National Programmes team to explore a new way of evaluating and enhancing their work – a radical shift from capturing who attended events to capturing the impact of the team’s work on their audience.

In our first workshop together, the Leapfrog team collaborated with V&A staff to interrogate their new framework for capturing the impact of their work. Together we mapped their work to 12 categories of learning, ranging from Critical Thinking through to Ludic and Embodied knowledge. Building on these insights, we co-designed over 30 concepts for new tools to creatively collect evaluation data that could reveal the learning of audience members.

The result is a set of ‘skeleton’ tool ideas to be published in a forthcoming toolbox. These are not concrete solutions for data collection but starting points for new ideas, specialised to the particular context and circumstances of the many programmes the Learning and National Programmes team pursues.

Further workshops are planned for late Spring and Autumn 2019.

Scaling Up Leapfrog: Improving a million creative conversations

Leapfrog are delighted to announce that we have received follow-on funding from the AHRC for a one-year project called ‘Scaling Up Leapfrog: Improving a million creative conversations’. The original £1.2m Leapfrog project used co-design to work with communities, NGOs and the public sector to develop and share new, more creative engagement tools and resources. These flexible resources help prompt and structure dialogue between the public sector and the people they represent; this could be contributing to a major decision in the community (e.g. building a new playground), participating in a policy consultation, or helping young people talk about difficult subjects such as sexual health.

Leapfrog co-designed with the public sector to produce over 50 new engagement tools. These tools have been downloaded over 2,000 times and used by public sector practitioners to engage many thousands of people; for example, one of the 50 tools has 2,400 documented uses with participants of Healthwatch activities in the North West. Building on this tool use, we want to take Leapfrog into new territory, focussing on how the Leapfrog tools can be adapted for application at large scales by working with new partners to co-design and share new versions of key Leapfrog tools.

The challenge is how to elicit (and make sense of) hundreds or even thousands of contributions when the majority of engagement practice to date has focused on small group interactions.

Bringing to bear our extensive experience of engagement tool co-design, we will be working with a new set of partners to adapt Leapfrog tools to meet this challenge. This will result in adapted tools that facilitate positive engagement with large, diverse groups of people in an inclusive, enjoyable and productive manner. The new tools and toolkits for engaging at scale will be shared freely on www.leapfrog.tools.

Collaboration and Key Partners

Scaling Up Leapfrog uses the resources developed in the original Leapfrog project as the basis for new adaptations of these tools with new partners, reaching a different set of beneficiaries at a far larger scale. We are working with four key partners in the follow-on project. Our partners are:

  • Morecambe Bay Clinical Commissioning Group (MBCCG) with supporting partner Lancaster Community and Voluntary Solutions (Lancaster CVS)
  • Food Power – Tackling food poverty through people-powered change
  • The Victoria and Albert Museum (V&A)
  • The World Design Weeks Network

More details of projects with our new partners will be posted in future blogs. 

Leapfrog Evaluation Report

Evaluation has been a key element of the Leapfrog programme of research for the past three years, seeking to capture the value of the tools and of the co-design approach undertaken with the project partners. This is important not just to ensure we consolidate the value of co-design tools and processes in such contexts, but to communicate and co-determine that value for our partners and for those interested in such processes. Leapfrog’s evaluation not only focused on measuring the final outcomes (did we, and our partners, achieve the outcomes we envisaged?); it also looked at which tools and approaches were most effective, which perhaps were not, and in what ways they were effective. The evaluation was also interested in the softer, more qualitative elements of change and learning, including the benefits of greater trust, collaboration in engagement, and co-creation. To assist with this approach, an evaluation framework was developed to help capture evidence addressing the research objectives. This final evaluation report presents the process of gathering evidence of impact during the Leapfrog project, together with its findings.

Leapfrog has revealed and provoked a huge demand for creative engagement through the delivery of its projects and various community events. The tools and toolboxes co-designed with partners have helped, and continue to help, a growing number of people, communities and contexts. There has been evidence of improved engagement with hard-to-reach communities, while many more people are now receiving and using our tools, with many more opportunities to support shared learning. We have gone some way towards evidencing the impact of the Leapfrog project using our evaluation framework, though we stopped short of quantifying that impact. Dissemination events are still underway and download numbers continue to increase, so Leapfrog is still very much a live project. As such, the impact of Leapfrog will continue to go further and deeper, with tools and toolboxes making a difference to engagement in communities, and we will continue to evaluate and evidence the longer-term impact over the coming months.