Conference workshop program: Sunday 28 August 2022
8–9am REGISTRATION
9am–12:30pm WORKSHOP PROGRAM: First half of full-day workshops; morning half-day workshop
12:30–1:30pm LUNCH
1:30–5pm WORKSHOP PROGRAM: Second half of full-day workshops; afternoon half-day workshop
WORKSHOP DETAILS
Categories:
A. Foundational evaluation skills and capabilities
B. New tools, approaches and ways of thinking
C. Advanced evaluation topics
Learning to apply the AES First Nations Cultural Safety Framework
presented by Sharon Gollan and Kathleen Stacey HALF DAY (MORNING) | CATEGORY: A, B
In September 2021, the AES launched the AES First Nations Cultural Safety Framework, developed over a 15-month period via a co-design process with the AES Indigenous Culture and Diversity Committee and representatives from the other main AES Committees.
The purpose of this workshop is to provide participants with guidance and support in beginning to apply the AES First Nations Cultural Safety Framework to their evaluation context. Participants are strongly encouraged to have a specific evaluation project in mind that involves Aboriginal and Torres Strait Islander people as project managers, evaluation team members or participants.
NOTE: This is not a cultural safety training workshop. Cultural safety training is usually a two-day learning experience. People who have undertaken cultural safety training will find it a beneficial foundation for this workshop. People who have not undertaken cultural safety training may identify it as a valuable and logical next step in strengthening their capabilities in culturally safe evaluation.
By the end of the workshop, participants will:
- Strengthen their understanding of how principles for culturally safe evaluation can be applied in their evaluation context.
- Develop skills in critical self-reflection in working towards culturally safe evaluation.
- Strengthen their understanding of what it means to be an ally for culturally safe evaluation.
The AES First Nations Cultural Safety Framework articulates principles for culturally safe evaluation and describes how critical self-reflection, in relation to evaluators, evaluation roles and responsibilities, and evaluation practices, can contribute to culturally safe evaluation. Participants will have the opportunity to reflect on and discuss how they consider and apply the principles within their evaluation work. They will consider what cultural accountability means in evaluation, identify opportunities they can take up, and gain practice at being an ally.
This workshop aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework:
- Domain 1 - Evaluative attitude and professional practice
- Domain 3 - Attention to stakeholders, culture and context
- Domain 6 - Interpersonal skills
- Domain 7 - Evaluation activities
This workshop is aimed at people involved with evaluation in a variety of roles, including commissioners, evaluators, evaluation educators, and managers/staff whose policy, program or project is being evaluated.
About the facilitators
Sharon Gollan, Sharon Gollan & Associates, has a strong affinity with, and is an active member of, the Ngarrindjeri community in South Australia. She has over 35 years of experience in the public sector in a range of community services and management positions, primarily focused on creating better services for Aboriginal peoples. This was followed by eight years in academic teaching and research before she turned her full attention to running her training and consultancy business, while also contributing to better life outcomes for Aboriginal peoples through Boards, State Committees and Working Groups.
Sharon has invited beyond... to work with her since 1999 in developing and delivering training programs focused on cultural respect and safety as part of organisational change projects and reconciliation action plans (RAPs). They also work together on a regular basis to develop and/or evaluate health, education, family and community service programs at state and national levels.
Kathleen Stacey, beyond...(Kathleen Stacey & Associates) Pty Ltd, is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and training work in human services fields, with a specialist focus in health, education, youth, early childhood, mental health, and family and community support services. It has developed a strong reputation for culturally respectful work in Aboriginal programs and organisations and has worked consistently and collegially with Aboriginal consultants across a range of projects since beyond…’s inception in 2000.
Applying the AES First Nations Cultural Safety Framework in practice
presented by Sharon Gollan, Kathleen Stacey, Doyen Radcliffe, Ginibi Robinson HALF DAY (AFTERNOON) | CATEGORY: A, B
In September 2021, the AES launched the AES First Nations Cultural Safety Framework, developed over a 15-month period via a co-design process with the AES Indigenous Culture and Diversity Committee and representatives from the other main AES Committees. This has been supported by providing a series of ‘Learning to apply the AES First Nations Cultural Safety Framework’ foundation workshops - see the AES online events calendar for dates: https://www.aes.asn.au/evaluation-learning/professional-learning-events.
The purpose of this workshop is to build on the foundational workshop by providing participants with an opportunity to learn from and discuss examples of applying the concepts and approach described in the AES First Nations Cultural Safety Framework.
NOTE: This is not a cultural safety training workshop. Cultural safety training is usually a two-day learning experience. People who have undertaken cultural safety training will find it a beneficial foundation for this workshop. People who have not undertaken cultural safety training may identify it as a valuable and logical next step in strengthening their capabilities in culturally safe evaluation.
Pre-requisite: attendance at the foundational ‘Learning to apply the AES First Nations Cultural Safety Framework’ workshop, either in the morning session or previously online.
By the end of the workshop, participants will have reflected on and discussed actual examples of culturally safe evaluation that will:
- Enhance their understanding of how the principles of culturally safe evaluation can be applied in their evaluation context.
- Strengthen their skills in applying critical self-reflection to their own evaluation contexts.
- Enhance their understanding of what it means to be an ally for culturally safe evaluation.
This workshop will be run as a panel session in which two different evaluation teams will each share an evaluation example that strove to be consistent with the Framework, and will facilitate a discussion with participants to explore its application. Participants will walk some of the journey of each evaluation, addressing scenarios or challenges that emerged along the way, and will practise applying the concepts and approach in the Framework to identify how they would respond if faced with a similar scenario or challenge in a future evaluation.
This workshop aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework:
- Domain 1 - Evaluative attitude and professional practice
- Domain 3 - Attention to stakeholders, culture and context
- Domain 6 - Interpersonal skills
- Domain 7 - Evaluation activities
This workshop is aimed at people involved with evaluation in a variety of roles, including commissioners, evaluators, evaluation educators, and managers/staff whose policy, program or project is being evaluated.
About the facilitators
See biographies above
Introduction to evaluating systems change and place-based approaches
presented by Ellise Barkley and Froukje Jongsma FULL DAY | CATEGORY: B
As any changemaker knows, sometimes you need to think differently and break the rules to really address the underlying causes of complex social problems. But as evaluators, we also need to show how we will measure impact and keep our work on track - even in highly complex and emergent place-based and systems-change settings. This presents many challenges.
This workshop is an introduction to measurement, evaluation, and learning for complexity settings. It unpacks the challenges of evaluation and shares practical methodologies and tools.
Participants will gain an understanding of systems change and place-based approaches, how they differ from programmatic responses, key practice principles, and guidance on minimum specifications for evaluation. It covers how to use theory of change to set stakeholder expectations and develop shared outcomes frameworks suited to context. Participants will be introduced to participatory tools for: measuring social impact and systems changes (that bring together numbers and stories); tracking progress over time; and supporting short and long cycles of learning for transformational change, adaptive leadership, and innovation.
The workshop involves a mix of presentations, case studies, and interactive practice activities. There will be plenty of opportunities to ask questions, share, and explore how the material can be applied to your own context. Content will draw on leading international theory and practice, lots of real-world examples, and insights from the Australian context.
This workshop is ideal for people who understand the fundamentals of measurement, evaluation and learning, but who are looking to expand their knowledge and skills to evaluate systems change/place-based initiatives. Its content addresses the following components of the Evaluators' Professional Learning Competency Framework:
- Domain 1 - Evaluative attitude and professional practice
- Domain 2 - Theoretical foundations (evaluative knowledge, theory, reasoning)
- Domain 3 - Attention to culture, stakeholders and context
- Domain 4 - Research methods and systematic inquiry
- Domain 7 - Evaluation activities
About the facilitators
Dr. Ellise Barkley is Lead Principal Consultant with Clear Horizon and specialises in evaluating systems change and place-based initiatives. Ellise has over 20 years of experience in program design and evaluation and is a highly skilled facilitator and educator. Ellise played a key role in developing the national Place-based Evaluation Framework (in partnership with Australian and Queensland Governments and Logan Together). She has also been instrumental in developing the content for Clear Horizon’s ‘Evaluating Systems Change and Place-based Approaches’ online course and is the lead learning mentor for the course.
Froukje Jongsma brings over 15 years of experience in social impact work. Since 2016, her focus has been on leading place-based systems change and building the capacity of cross-sector stakeholders including citizens to collaboratively tackle complex social issues in their communities. Froukje is adept at facilitating social impact projects, programs, and processes, including leading measurement, evaluation, and learning (MEL) design and delivery. Froukje also has extensive experience facilitating workshops and learning in community, professional, and education contexts using a combination of techniques and approaches such as experiential learning to accommodate different learning styles and build capacity.
Introduction to surveys for evaluation
presented by Mark Mackay FULL DAY | CATEGORY: A
Evaluators need to collect a variety of data to inform their evaluation work. Surveys provide a convenient mechanism for collecting data from many people relatively quickly.
Despite the plethora of survey tools now readily available, conducting a high-quality survey is not just a matter of selecting a tool! Understanding when to use a survey, how to construct questions, survey logic, survey deployment and data capture are all key, as is knowing how to handle incomplete data and analysis. In short, good survey design is not a trivial task. If you need to understand how surveys can be used for evaluation, then this workshop is for you!
In this interactive workshop, the aim is to equip participants with the knowledge and skills to begin designing their own surveys and to consider how the survey results will subsequently be analysed. Participants will learn:
- A workflow that will take them from the idea of “let’s use a survey” through to analysis of the results
- The importance of understanding the difference between open-ended and closed-ended questions
- The importance of considering how many questions respondents should be asked in a survey
- How to construct survey questions – including considering response options
- The types of responses that can be gained from surveys and how some response types can be used to help overcome language barriers
- The value of using a survey to collect open-ended responses to better understand some aspects of the subject matter that is being evaluated
- Understanding survey logic as part of the survey design
- Data storage considerations
- The need to pilot surveys before widescale deployment
- The challenges involved in getting responses
- The advantages and disadvantages of using the standard survey reports that now accompany online survey tools
- About the other options available for analysis of the data – particularly open-ended responses.
This workshop aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework:
- Domain 1 – Evaluative attitude and professional practice
- Domain 2 - Theoretical foundations (evaluative knowledge, theory, and reasoning)
- Domain 3 - Attention to culture, stakeholders and context
- Domain 4 - Research methods and systematic inquiry
- Domain 5 - Evaluation project management
- Domain 7 - Evaluation activities.
This workshop is aimed at individuals who work in evaluation and are wanting to develop a better understanding of the use of surveys as an evaluation tool.
About the facilitator:
Adj Professor Mark Mackay is well qualified (BSc(Hons), BEc, BComm, Grad Cert (Higher Educ), PhD) and has extensive experience teaching postgraduate students in Australia and overseas, including in data collection methods such as surveys. As well as teaching postgraduate students, Mark has convened international conferences on the use of design and data in health, and was a keynote speaker at the Festival of Evidence Conference held in the UK in 2014.
As a designer of postgraduate courses and a passionate educator, Mark is well versed in workshop facilitation methods for adult learning. He is committed to ensuring that participants at his workshops gain real practical skills, as well as useful knowledge that can be applied in the workplace.
As well as teaching, Mark has been a director with Complete the Picture Consulting for 13 years where he has been involved in the design, deployment and analysis of surveys for clients.
Mark has more than 30 years of experience working in industry and academia, much of that time working with data. He has undertaken reviews of organisations and evaluations of programs, services and products across a variety of sectors, including health, aged care, education, energy and retail. Surveys form part of the toolkit Mark employs in evaluations.
Designing credible and useful impact evaluations
presented by Brad Astbury and Andrew Hawkins FULL DAY | CATEGORY: B
This workshop focuses on the principles and logic associated with the selection, design, and application of different kinds of impact evaluations, ranging from randomised control trials to theory-driven and qualitative approaches to causal inference. Different approaches to impact evaluation will provide more or less accurate (as well as precise) answers to different questions. For example:
- Did the program have an impact?
- What was the size of the program’s impact?
- What was it about the program that had impact?
- In what situations and for which people did the program activities have an impact?
- Should we continue the program, scale it up, modify it, better target it, or do something different to maximise future impact?
A single approach or combination of approaches may be more or less appropriate depending on the questions you are seeking to answer and other contingencies, such as the nature of the evaluand and the evaluation setting (e.g. time, budget, data availability, stakeholder information needs, degree of uncertainty that can be tolerated and evaluator expertise).
Practical aspects of delivering high-quality and useful impact evaluations will be explored in this workshop through case applications. Participants will learn about:
- Different understandings of impact evaluation, including:
  - Differences between causal descriptions and causal explanations
  - Differences between causal inference and effect sizes
  - The way that systems approaches deal with causality and attribution
- Approaches to classifying quantitative and qualitative impact evaluation designs
- Ways in which impact evaluation designs differ and what this means for practice
- Beyond the hierarchy of methods: dangers involved in relying too heavily on any one approach to conducting impact evaluation
- A contingent approach: considerations for selecting and combining impact evaluation designs and methods based on situational analysis.
This workshop aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework:
- Domain 1 – Evaluative attitude and professional practice
- Domain 2 – Theoretical foundations (evaluative knowledge, theory, reasoning)
- Domain 4 – Research methods and systematic inquiry
The workshop is designed for both new and experienced evaluators and commissioners of evaluation.
About the facilitators:
Brad Astbury is a Director at ARTD Consultants. He has two decades of experience in evaluation and applied social research and considerable expertise in combining diverse forms of evidence to improve both the quality and utility of evaluation. He has managed and conducted needs assessments, process and impact studies and theory-driven evaluations across a wide range of policy areas for industry, government, community and not-for-profit clients. Prior to joining ARTD in 2018, Brad worked for over a decade at the University of Melbourne, where he taught and mentored postgraduate evaluation students.
Brad is passionate about innovation in evaluation theory and practice and publishes regularly in leading journals. Some examples include articles on the contribution of leading evaluation theorists in Evaluation, the importance of program theory in the American Journal of Evaluation, the principles of using co-design to generate health service improvements in BMC Public Health, thoughts on the Future of Evaluation, and a discussion on criminal justice evaluation for the Advances in Program Evaluation book series. His latest work addresses the practice of realist evaluation and non-experimental approaches to impact evaluation. Brad is also an active contributor to professional societies for evaluators in Australia, Europe, Canada and America.
Andrew Hawkins is a Partner at ARTD Consultants, where he has worked for the last 15 years. He is passionate about the philosophy of science and the logic of evaluation. Like Brad, he often works with a realist understanding of causality, but often looks first to quantitative methods. His passion for credible and useful impact evaluation led him to combine experimental methods with a realist understanding of causality. More recently, Andrew has taken a pluralist approach, considering that different theories of causality are more or less useful in different situations. Ever focused on practical application and the fair price of causal information, Andrew may have his head in the clouds, but his feet and hands are firmly rooted in the practice of everyday evaluation.
Conflict resolution skills: A toolbox for evaluators
presented by Ruth Pitt FULL DAY | CATEGORY: A
King and Stevahn (2005) argue that “…to experience a program evaluation is to experience conflict”. Conflict resolution skills such as mediation, facilitation and negotiation are essential for evaluators to manage the interpersonal and political aspects of their work.
This workshop is an evaluation-specific, practical introduction to conflict resolution, providing participants with a toolbox of skills, useful resources, and directions for future competency development.
By the end of the workshop, participants will be able to:
- Explain the importance of conflict resolution skills for conducting and managing evaluations
- Describe the five conflict resolution styles, assess the appropriate style for a given situation, and identify their stronger and weaker styles
- Distinguish between positions and interests, and construct collaborative problem statements based on interests
- Describe the ‘diamond of participatory decision making’ and other useful frameworks for facilitation.
The workshop includes presentations on frameworks and tools, and opportunities to practise skills through individual and group activities.
The workshop is suitable for participants who have little or no training in conflict resolution skills, with any level of evaluation experience. Participants will be asked to reflect on evaluation projects, but those new to evaluation will be able to draw on experience in other contexts. The workshop will support evaluators to develop competency in professional competency standards from Domain 5 (Project management) and Domain 6 (Interpersonal skills).
About the facilitator:
Ruth Pitt has worked in diverse evaluation roles across government, not-for-profit and consultancy, in Australia and overseas. Ruth has had the opportunity to see the challenges of managing and conducting evaluation projects from many different perspectives. As a result, Ruth is a firm believer that conflict resolution skills are a key competency for evaluators.
Ruth is an Evaluation Specialist at Collective Action, a social impact consultancy; and Research and Evaluation Lead at Health Justice Australia. Her qualifications include a Graduate Certificate in Conflict Resolution and a Certificate IV in Training and Assessment.