Seven tips for place-based evaluation
Written by: Mathilde Wennevold
Evaluating place-based approaches is never easy. These initiatives aim to address the needs and goals of communities and environments, often dealing with complex, long-term issues that lack straightforward solutions. They involve transforming deep systemic conditions while meeting local needs, and there is no one-size-fits-all methodology for evaluating them.
In this session of the Ripple Effect webinar series, Ellise Barkley, host and Head of the Clear Horizon Academy, invited three experienced practitioners to reflect on what works (and what doesn’t) when evaluating place-based approaches. She was joined by Sharmay Brierley, a proud Yuin woman and Project Lead at Kowa Collaboration; Rodney Greene, an experienced leader in community-led systems change with Collaboration for Impact; and Melinda Chiment, principal consultant here at Clear Horizon, specialising in participatory methods and strategic planning. Together, they shared practical tips based on years of experience. Here are the highlights.
You can watch the full webinar recording here.
1. Be in place
Sharmay reminded us that nothing substitutes for showing up. Being physically present allows evaluators to notice the subtle data missed by surveys or online methods. It’s the feel of a room, the rhythms of a community, the unspoken context.
“[Being in place] is really beneficial because it allows us to observe and collect data about what we’re seeing and feeling while being present and grounded in community. It also allows evaluators to deepen their understanding of the local context. It also provides insight into the social, cultural, and environmental factors that influence outcomes.”
In-person engagement is especially important early on, when relationships are being built. It sets the tone for a relational rather than extractive evaluation.
2. Centre community voice
Too often, evaluation data about Aboriginal and Torres Strait Islander peoples is deficit-focused and misrepresentative. Sharmay emphasised that evaluation must actively centre community voice – not just in findings, but in design and delivery.
That means involving the community in co-creating logic models, shaping evaluation questions, testing data collection tools, defining what good looks like, and sense-checking findings. It also means recognising community members as holders of expertise, not just participants.
“Community really holds the expertise and understands what’s needed. They also foster the relationships that support a successful evaluation.”
3. Ground your work in shared principles
For Melinda, principles serve as a North Star when navigating the complexities of place-based work. At Our Town, a decade-long initiative in South Australia, four straightforward principles help keep the work on track.
- Community-led
- Modelling mentally healthy practice
- Learning our way through change
- Seeing and acting in systems
It’s better to have four simple principles that everyone remembers than ten that no one uses. Principles don’t tell us what to do, but they guide us on how to go about doing it. These principles underpin practice, offer clarity amid uncertainty, and, importantly, form part of the evaluation itself. You can read more about the Our Town principles here.
4. Use co-design as a trust-building tool
Evaluation carries baggage. Communities often link it with extractive practices and negative experiences. Melinda mentioned using co-design and collaboration to enhance impact and foster trust.
In Our Town, co-designed Theories of Change developed into shared outcomes and rubrics across six towns. Along the way, tensions emerged, such as community concerns that population-level indicators didn’t reflect a strengths-based approach. Instead of pushing on, the team paused and remained curious.
“Not every project has the ability to hit pause like we did, but I’d invite you to think about and take stock of these things, and ask: where can you compromise, where might there be wiggle room to codesign, where can you follow the community’s lead, and where can you come around the table and build something together?” Melinda said.
5. Evaluate the strength of networks and relationships
Rodney warned that evaluators often concentrate on programs and activities but tend to overlook what makes place-based work unique: community-led networks, trust, and relationships.
“If I were going into a community, I’d want to explore the strength of networks. Are people just turning up to monthly meetings, or is there a culture of collaboration happening daily?”
He shared how employment agencies in Burnie Works shifted from competition to real collaboration, discussing before and after meetings how to solve client issues together. That culture of working together became the foundation for everything else.
6. Look for links between local and broader systems
Place-based initiatives don’t exist in isolation. Evaluators need to notice when community efforts ripple into broader systems.
Rodney shared a simple but telling example from Tasmania’s COVID-19 response: parents in a Connected Beginnings community couldn’t access rapid antigen tests (RATs). A government representative in the weekly working group heard this and swiftly escalated it. Soon after, guidance was sent to all Tasmanian schools: no rationing needed; families should receive what they require.
That moment of connection between lived experience and government systems was made possible by strong relationships already established.
7. Foster a culture of shared learning
Ellise highlighted a key theme across all the examples shared: the significance of a learning culture. Evaluation isn’t just about measuring and tracking progress; it’s also about fostering shared learning among all partners, both in short cycles (week to week) and long cycles (year to year). This continuous learning enables adaptation and keeps the work relevant in dynamic systems.
Moving forward
As Sharmay noted, genuine place-based evaluation is challenging. Time is always limited, and relational work rarely fits tidy project schedules. But as all three speakers emphasised, it’s the relational, systemic, and community-led parts that matter most. For evaluators and changemakers, the message is clear: look beyond activities and outputs. Focus on presence, voice, principles, trust, relationships, and system connections. Foster conditions where communities are not just consulted but empowered to lead.
Ellise closed the session, reflecting:
“In 2025, there is no greater time for us all to keep leaning into improving our practice in this work and enjoying the work by learning from others.”
So, next time you work with a community to evaluate or facilitate collective learning for a place-based approach, remember these seven tips.
Interested in learning how to put these tips into practice?
Clear Horizon Academy has a suite of courses designed for individuals and groups involved in emerging and dynamic initiatives, helping them navigate complexity with confidence and clarity, including:
- Evaluating place-based approaches – a course covering principles, design, and systems-aware tools for place-based measurement, evaluation and learning.
- Systems Transformation: Strategy and Evaluation – a course to skill you up in developing strategy and evaluating impact in complex systems transformation work, using leading-edge tools, frameworks, and inner practices.
- Two Worlds UMEL – a face-to-face course led by Kowa in partnership with Clear Horizon, exploring First Nations and non-First Nations approaches to understanding, measurement, evaluation, and learning.
The Academy also works with partners on bespoke course delivery and e-Learning design for capacity building. Get in touch today: info@clearhorizonacademy.com