Co-creator of Australia’s ‘game-changing’ Place-Based Evaluation Framework, Dr Jess Dart shares what these approaches are and why they’re the north star for community impact.
If this year has proven anything, it’s that we are capable of change and of rising to new challenges. In that spirit, we’re challenging ourselves to better walk the talk and build a liveable company. But what does that mean?
VicHealth partners with communities, government agencies and organisations across health, sport, the arts, food, education, the social sector and the media to share expertise and insights and bring global best-practice approaches to Victoria – to drive lasting, positive health and wellbeing outcomes for all Victorians.
We’ve all heard it takes a village to raise a child. But what does that look like in practice? Hands Up Mallee offers us a glimpse – bringing together the community, organisations and services to support children and families to thrive. Here’s how we’re helping them achieve greater community health and wellbeing outcomes.
What happens when Designers and Evaluators start cooking up social innovations together? Is it a case of the Cook and the Chef, where there’s collaboration throughout, or is it more My Kitchen Rules, with paring knives out? Here are four kitchen scenarios we’ve observed – but we want to know: which kitchen are you cooking in?
Evaluation Tools session, with Monique Perusco (Jesuit Social Services and Working Together in Willmot), Skye Trudgett and Ellise Barkley (Clear Horizon)
Our session started with a contextual summary of the work and characteristics of ‘Together in Willmot’, a collaborative social change effort in Mt Druitt involving The Hive, Jesuit Social Services, service providers, schools and many other partners. Clear Horizon is working with Together in Willmot as an evaluation partner. Our shared approach to learning and evaluation responds to the challenges of evaluating systems change and place-based approaches, and is tailored to the phase, pace and strengths of the collaboration. We introduced the evaluation process we are undertaking, which has involved training in the Most Significant Change technique and local data collection that will feed into building a theory of change and then an evaluation plan. Next year, we plan to undertake a co-evaluation focused on the efforts and outcomes to date.
During the session we looked at examples of Most Significant Change stories collected so far as part of this work.
The Most Significant Change (MSC) technique was developed by Jess Dart and Rick Davies. Together Jess (Clear Horizon’s founder and CEO) and Rick authored the User Guide in 2005, and MSC is now applied in innumerable contexts worldwide. MSC is a story-based method that can be used for participatory monitoring and evaluation. The process follows a simple interview structure that can generate a one-page change story. It is participatory because many stakeholders are involved both in deciding the sorts of change to be recorded and in analysing the data. MSC uses stories as a form of data collection. Stories are collected from those most directly involved, such as project participants and field staff, usually by asking a simple question such as: ‘During the past year, what, in your opinion, has been the most significant change for participants as a result of this program?’ Once the stories are collected, stakeholders sit together to analyse them, and participants are asked to select the story that represents the most significant change for them. The process of selecting the most significant story opens up dialogue among project stakeholders about what is most important. This dialogue is then used as evaluation data to create knowledge about the project and what it is achieving.
We also covered concepts and tools for evaluating systems change and place-based approaches from the Place-based Evaluation Framework and Place-based Evaluation Toolkit, which were commissioned by the Commonwealth and Queensland governments last year and are a leading guide for evaluation in this context. We introduced the generic theory of change for place-based approaches and the ‘concept cube’, which shows the multiple dimensions of evaluation in this context. Clear Horizon worked with TACSI and CSIA to lead the co-design of the framework, and has been working with government, community, philanthropy and non-government partners to test, apply and progress these learning, measurement and evaluation approaches.
At Clear Horizon, we have been grappling with how to effectively – and efficiently – improve the monitoring, evaluation and learning of programmes. Over many years of experience, and across the range of programmes and partners we work with, one thing remains abundantly clear: the quality of the monitoring is the cornerstone of effective evaluation, learning and programme effectiveness. In the international development sector, where some very large investments operate in extremely complex environments, monitoring is more important still.
At the end of 2017, Byron’s new year’s resolution for 2018 was to “dial M for monitoring” and put even more emphasis on improved monitoring systems. Having conducted stocktakes of MEL systems across a range of aid portfolios, and having been involved in implementing or quality assuring over 60 Department of Foreign Affairs and Trade aid investments, we have seen really clear messages emerge about what works and what doesn’t. This culminated in presentations at the 2018 Australian Aid Conference and the 2018 Australian Evaluation Conference, where Byron and Damien presented on how to improve learning and adaptation in complex programmes by using rigorous evidence generated from monitoring and evaluation systems.
So we at Clear Horizon welcome the findings and recommendations in DFAT’s Office of Development Effectiveness Evaluation of DFAT Investment Monitoring Systems 2018. Firstly, we welcome the emphasis on improved monitoring systems for investments – this is essential to improving aid effectiveness. Secondly, we strongly agree that higher quality MEL systems are outcome focused, have strong quality assurance of data and evidence, and put the data to multiple purposes (i.e. accountability, improvement and knowledge generation). Thirdly, we agree that partners and stakeholders with a culture of performance oversight and improvement are essential – this culture needs to continue to be fostered both internally and externally.
To achieve this, as recommended, it is essential that technical advice and support is provided to programme teams, investment managers and decision makers. This need not be resource intensive, and it must be able to demonstrate its own value for money. What is extremely important in this recommendation, however, is that the advice is coherent, consistent and context specific. Too often we see programme teams depend on a single generalist M&E person to provide the whole gamut of advice – covering a range of monitoring approaches, evaluation approaches, different sectors, and sometimes even different countries. Good independent advice often requires a range of people providing input on different aspects of monitoring, evaluation and learning – one reason why, at Clear Horizon, we maintain a panel of MEL specialists, with some focusing on evaluation capacity building, others on conducting independent evaluations, and others on building MEL systems.
Standardising expectations and advice across aid portfolios about what constitutes good, fit-for-purpose monitoring, evaluation and learning is essential for all of us. We have been fortunate enough to be involved in developing different models for providing third-party embedded design, monitoring and evaluation advice. The ‘Quality and Improvement System Support’ approach provides consistent technical advice across an entire aid portfolio, as developed for Indonesia; the ‘Monitoring and Evaluation House’ in Timor-Leste, in partnership with GHD, is based on a neutral broker approach to improving the use of evidence in programme performance; and the ‘Monitoring and Evaluation Technical Advisory Role’ in Myanmar places a stronger emphasis on supporting programme teams through technical and management support.
This report echoes our belief that more monitoring and evaluation is not necessarily the answer; rather, collaborating to do it better and building a culture of performance is ultimately what we are striving for.