Earlier this week, a group of researchers within the Institute of Design Innovation got together to discuss the significance of using tools effectively for engagement. In a creative space like InDI our approaches are diverse, but as researchers and designers we aspire to engage with people and understand more about their experiences on an empathetic and intellectual level. Through this discussion, our aim was to come together to explore tool design, the challenges we face in adapting tools and how we embed evaluation in tools.
The researchers introduced their respective tools to the group and explained their design approaches and project contexts. We talked about the role of tools in design research and how they have opened up new ways for us to engage with people. During the discussion it emerged that researchers face a common challenge: capturing the essence and spirit of the conversation when engaging with large groups.
The tools shared by the researchers were quite diverse. To name a few: an Actor Network mapping tool, used to map experiences between individuals and organisations; a Workshop Probe kit, a personalised pre-packed toolkit for gaining a better understanding of remote communities; and Experience Guides, asset-based tools that allow participants to easily share practical strategies and advice in a range of contexts.
Whilst we exchanged the advantages of using tools, the conversation also raised the difficulties researchers face in adapting certain tools to capture the essence of an engagement. When working with large groups over a period of time, tools should be designed in such a way that they distil key insights from the wealth of data gathered. This is something Leapfrog is trying to take on board as we co-design tools with our partners. Past experience suggested, however, that recording the sessions and transcribing the data can give facilitators a more in-depth understanding.
We discussed the importance of facilitation in delivering a tool. As an opportunity for building relationships, mutual learning and gathering participants’ insights, design engagement can be enriched by the facilitator’s softer skills, such as empathic listening and the choreography of a comfortable environment. Another challenge was building a “feedback loop” with participants so that we can self-evaluate our tools. A feedback loop is vital to designing an effective tool: it helps us understand what participants really think about it.
We briefly spoke about adapting and adopting tools in other research contexts. Even though the tools we shared may not be directly transferable, we considered how specific elements of their design might be adjusted. A good tool will be practical and engaging, while also helping people to adopt and adapt solutions to their local context.
The final topic for discussion was embedding evaluation in tool design. We all agreed that evaluating tools is a persistent challenge, and we shared our experiences from previous projects. A non-invasive approach of asking a simple set of questions (What worked? What did not? What could be changed?) has worked well in the past.
This session opened up many interesting conversations around tools. Having these discussions was informative to Leapfrog as we develop our co-design approaches with our partners. This session also helped us to take a step back and reflect on the questions we ask, the materials we use, and the decisions we make when co-designing tools.