This week, Leapfrog attended a workshop organised by Iriss (Institute for Research and Innovation in Social Services) on using tools in the co-design process. The event mainly looked at various tools developed and tested by Iriss with their partners as part of their co-design projects ‘Hospital to Home’ and ‘Pilotlight’.
Iriss use the Double Diamond framework for their design process, which is divided into four phases: Discover, Define, Develop and Deliver. For those not familiar with it, the Double Diamond was coined by the UK Design Council in 2005 as a graphical way to describe the design process. The framework maps the divergent and convergent stages of the design process, showing the different modes of thinking that designers use. The Iriss tools follow the Double Diamond approach, and each tool falls within one or more of the four phases.
The Iriss tools address some key aspects of co-design, such as securing engagement, creating an inclusive environment, shifting the balance of power and visualising ideas. As part of the icebreaker, we were asked to identify the key challenges we face in using a co-design process. From discussions with the attendees, four main challenges emerged:
- Gathering evidence – What difference does this make? How can I gather evidence for this work?
- How do we make people feel appreciated in co-design?
- How do the Iriss tools tie together, and what is the context?
- What capacity does the organisation have or should have to do co-design?
I would like to elaborate on the ‘Gathering evidence’ discussion, as I was involved in that particular group and the topic resonated with the attendees. The group noted its significance because they frequently face this question in their ongoing community engagement projects. There was also discussion that the Scottish Government is keen to see organisations in Scotland gather this kind of evidence. The group agreed that evaluating outcomes throughout a project, rather than only at the end, is important. Hypothesising possible outcomes from the start can also help in evidencing impact, and involving people rigorously in the evaluation process could prove significant. The group made an interesting point that impact should be measured over a longer period of time. We also talked about ‘contribution analysis’ as an approach that can help in assessing outcomes.
This particular topic was important to me as we at GSA are about to start our short project with Evaluation Support Scotland, in which we will be developing tools to help gather evidence from data. So, watch this space.