This final blog in the series explores how to design a robust research methodology that enables practitioners to run problem diagnostics and gather evidence of impact for scalable change.
The ‘Methodology design’ section of our framework highlights our recommended phases for designing, implementing and scaling digital innovation projects.
Crafting a robust methodology
Investing in the design of a robust project methodology should be a top priority. For Traverse, a robust methodology in digital innovation relies on a discovery phase, aimed at assessing the viability of the project through diagnostics and public engagement, and a strong monitoring and evaluation practice, aimed at demonstrating impact for scalable change.
Building in time for a discovery phase is essential; the timeline and structure of the programme should build on preliminary findings about the usability of the selected approach, to ensure that it responds to a specific need and is likely to be adopted.
Without a discovery phase, projects risk investing in solutions that are not fit for purpose:
- The intervention might not be appropriate for the application originally identified
- The end users or clients might not be ready to engage with the approach
- Practitioners may be unwilling to shift their practices without motivation or incentives
- The team might encounter unanticipated technical issues.
As described above, problem diagnostics and a thorough process of public engagement can reduce the risk of designing a flawed solution.
But what we believe is the most important ingredient of a robust project methodology is a solid formative evaluation practice. By developing an outcomes framework at the outset of the project, the project team can agree on a set of outcomes and success indicators to monitor progress and, ultimately, measure the impact the system as a whole is having on service users.
We often find that providers are very ambitious in what they want to achieve, yet lack a clear focus and ways to channel their resources effectively. Setting realistic, measurable goals can be a way to maximise resources and generate robust evidence.
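To make the idea of an outcomes framework concrete, here is a minimal sketch in Python. All names and figures are purely illustrative (they are not drawn from Traverse's framework): each outcome carries one or more measurable indicators with a baseline, a target and a current value, so progress can be monitored throughout the project.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """A measurable success indicator with a baseline and a target."""
    name: str
    baseline: float
    target: float
    current: float = 0.0

    def progress(self) -> float:
        """Fraction of the distance from baseline to target, clamped to [0, 1]."""
        span = self.target - self.baseline
        if span == 0:
            return 1.0
        return max(0.0, min(1.0, (self.current - self.baseline) / span))

@dataclass
class Outcome:
    """An agreed outcome, monitored through its indicators."""
    description: str
    indicators: list[Indicator] = field(default_factory=list)

    def progress(self) -> float:
        """Average progress across the outcome's indicators."""
        if not self.indicators:
            return 0.0
        return sum(i.progress() for i in self.indicators) / len(self.indicators)

# Hypothetical outcome for a digital service pilot
uptake = Outcome(
    description="Service users adopt the new digital channel",
    indicators=[
        Indicator("monthly active users", baseline=0, target=500, current=250),
        Indicator("satisfaction score (1-5)", baseline=2.5, target=4.0, current=3.25),
    ],
)
print(f"{uptake.description}: {uptake.progress():.0%} towards target")
```

The point of the structure is simply that each claimed outcome is tied to indicators that can actually be measured, which is what makes progress reporting, and ultimately an impact narrative, possible.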
Because digital projects are subject to constantly changing regulations, they need to be responsive to system change, and agile evaluation methodologies can adapt to shifting priorities and needs. The defining characteristic of agile evaluation frameworks is their flexibility: they can shift to make the most of unforeseen opportunities and tackle unexpected challenges without compromising rigour or objectivity.
Ultimately, a robust evaluation methodology can help you:
- Maximise and demonstrate impact for scalable change
- Develop a strong narrative to advocate for more funding
- Distil learning and identify key ingredients for success as well as the necessary conditions for change
- Learn from challenges your team may have encountered along the way and share best practice with colleagues and peer institutions.