CES - SCÉ 2019
Sunday 26/05/2019
9:00 am - 4:00 pm French-language session: Pivot Tables: The Must-Have Tool for Data Analysis and Interpretation
Alla Mitriashk, Ministère de l'Énergie et des Ressources naturelles, Ministère des Forêts, de la Faune et des Parcs
THIS WORKSHOP WILL BE DELIVERED IN FRENCH ONLY.
LEVEL - INTERMEDIATE
Prerequisites
Intermediate or advanced knowledge of Excel is preferred, along with a minimum of one year of experience in the evaluation of public programs. The workshop may also interest graduate students in program evaluation.
 
When evaluating the implementation or revision of a program, the evaluator is often confronted with a multitude of operational data. How can one navigate this sea of information without being submerged? Workshop participants will be taught a five-step process for designing and using a pivot table in Excel for data interpretation. They will learn how to organize the project, identify goals and indicators, specify measure parameters, build the database and mine it by building “eloquent” tables linked to evocative diagrams. Real-life examples drawn from two case studies will be examined. One involves the revision of a commercial forest work investment program, while the other deals with the evaluative summary of the implementation of a new research support program in the mining sector.
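The workshop builds its pivot tables in Excel, but the same "mine the database" step can be sketched in code. A minimal illustration using pandas, with entirely invented program data (all column names and values are hypothetical, not drawn from the case studies):

```python
import pandas as pd

# Hypothetical operational data for a program evaluation
# (columns and values invented for illustration only).
records = pd.DataFrame({
    "region":    ["North", "North", "South", "South", "South", "North"],
    "year":      [2017, 2018, 2017, 2018, 2018, 2018],
    "projects":  [12, 15, 8, 11, 9, 14],
    "funding_k": [240, 310, 150, 220, 180, 290],
})

# The pivot-table step: summarize indicators by region (rows)
# and year (columns), summing within each cell.
pivot = pd.pivot_table(
    records,
    index="region",
    columns="year",
    values=["projects", "funding_k"],
    aggfunc="sum",
)
print(pivot)
```

Once the database is organized this way, swapping the row, column, or value fields is a one-line change, which is what makes the technique well suited to exploratory questions.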
9:00 am - 4:00 pm Trauma-informed and De-colonizing Evaluation Practices
Martha Brown, RJAE Consulting
LEVEL - INTERMEDIATE
Prerequisites:
Basic knowledge of qualitative evaluation methods

In this experiential workshop, participants will learn how to recognize the effects of trauma and intergenerational trauma, and how to conduct evaluations through a trauma-informed lens. Skills learned will strengthen evaluators' capacity to conduct culturally responsive evaluations without replicating colonizing and oppressive practices that cause harm or trigger trauma responses. Participants will learn how to keep a Circle in a variety of evaluation contexts. Attendees will actively participate in Circles, discussions and role-plays and will have the opportunity to practice keeping a Circle. Circles will focus on building trust, connections, establishing group values and solving problems. Attendees will benefit from improved trauma-informed methods and practices that facilitate listening, empathy, voice, trust, connectedness, and understanding.
9:00 am - 4:00 pm Developmental Evaluation Master Class
Jamie Gamble, Imprint Consulting Inc.
Kate McKegg, Director, The Knowledge Institute Ltd
Nan Wehipeihana, Director, Research Evaluation Consulting Ltd.
LEVEL - ADVANCED
Prerequisites:
Clear understanding of DE concept, purpose and principles. Practical experience in the design and delivery of at least one developmental evaluation. Familiarity with innovation, complexity and systems thinking.

The aim of this workshop is to explore the technical, situational and relational issues and insights that arise in the implementation of DE. This session will be co-led by three leading developmental evaluators: Kate McKegg, Jamie Gamble and Nan Wehipeihana.

As Developmental Evaluation (DE) becomes more widespread, there is a growing range of applications and experiences. DE is increasingly an accompaniment to social innovation labs, social justice initiatives, co-design, community change efforts such as Collective Impact, and Indigenous development. This workshop will examine the implications of DE in these contexts, and practical strategies and considerations for evaluation design and implementation.

The workshop will also explore design, implementation and utilization issues in DE as they relate to the eight DE principles introduced in Michael Quinn Patton's Developmental Evaluation Exemplars.
9:00 am - 12:00 pm Embedding Fun to Facilitate Evaluation Use
Wendy Tackett, Ph.D., iEval
Paul Tackett
LEVEL - INTERMEDIATE
Prerequisites:
All evaluators and evaluation recipients can learn something from this workshop. It would be ideal if participants bring an evaluation report they are currently working on to use in one of the activities. Knowledge of Microsoft Excel is helpful for another of the activities, but it is not necessary.
 

Sometimes evaluators get so focused on the evaluation contract and performing the work agreed upon in that contract that we forget the ultimate purpose of our evaluation work: to provide an evaluation process that is useful and findings that are used. How, as evaluators, can we bridge this gap, using processes that lend credibility to our work while also involving clients strategically to ensure that our work is meaningful, useful, and used to guide decision-making and improve outcomes?

Through this workshop, you will learn how to embed fun into evaluation and think creatively to facilitate the use of evaluation findings. We will walk this journey together by participating in a paired-comparisons activity (parsing an evaluation report into digestible and actionable chunks), learning how to stage your own evaluation camp, and practicing color scripting (a technique for understanding the immediate impact of a training).
9:00 am - 12:00 pm Communication Strategies: Bridging the Abyss between Evaluation Reports and Evidence-informed Action
Erica McDiarmid, DPRA Canada Inc.
LEVEL - BEGINNER

Effective communication is an essential competency in evaluation, given that 'how' and 'when' evaluation results are communicated can influence how stakeholders are able to respond and react. This interactive workshop will present best practices in effective communication strategies for using evaluation for action. Participants will be presented with communication strategies that can be utilized across the project life-cycle and provided with tips and tricks for communicating the 'good' and 'not so good' evaluation results in a way that increases stakeholder buy-in and the likelihood of action. By understanding how stakeholders can be engaged in the evaluation process, participants will see more opportunities to turn results into action through evidence-informed decision-making. Through this workshop, participants will develop their own plan for bridging their evaluation results with evidence-informed action.
9:00 am - 12:00 pm Introduction to Sustainability-Ready Evaluation
Andy Rowe, ARCeconomics
LEVEL - ADVANCED
Prerequisites:
Participants require familiarity with evaluation methods, designing evaluations and with mixed methods evaluations. They should also have experience working with programs and evaluation commissioners to articulate the program logic and negotiate evaluation issues. They need to be open to or at least interested in stretching conventional evaluation boundaries and see evaluation and evaluators as contributors to change.
 

The importance of sustainability is rising and undeniable, yet sustainability is largely missing from evaluation. This workshop will introduce important sustainability concepts, practices and techniques for evaluators to identify and introduce sustainability to their evaluations. Targeting more experienced evaluators and evaluation commissioners with capacities in mixed methods, the workshop will facilitate participants' understanding of coupled human and natural systems, how to introduce sustainability to theories of change, and how to manage the politics that will arise. Participants will also learn important differences between human and natural systems, including temporal and spatial scales, social and natural science methods, promoting use, and the importance of boundary spanning. Participants will be asked to contribute to enhancing a recently published checklist for sustainability-ready evaluation.

Discussion will be the primary format for the workshop with short introductory lectures and stories and small group work.
9:00 am - 12:00 pm Designing Quality Survey Questions
Sheila Robinson, Custom Professional Learning, LLC
LEVEL - BEGINNER

Floyd Fowler claims, "Poor [survey] question design is pervasive, and improving question design is one of the easiest, most cost-effective steps that can be taken to improve the quality of survey data" (1995, p. vii).

Surveys are revered for their ease of use and promise of reaching many, but evaluators must be able to craft rich, concise, and targeted questions that yield useful data. We must understand cognitive processes respondents use to answer questions with accuracy and candor. With rich examples and an interactive approach, the facilitator will demonstrate why we must engage in a respondent-centered survey design process to craft high quality questions. Participants will learn the survey design process through a series of activities, developing an understanding of the cognitive aspects of survey response and question design. They will increase their ability to craft high quality survey questions, and leave with resources for further skill development, including a copy of the facilitator's survey question checklist, published in her book.
12:00 pm - 1:00 pm Lunch will not be provided for Sunday Workshop participants. A list of restaurant options within walking distance of the venue will be available at the Registration desk.
1:00 pm - 4:00 pm Reflective Practice: The Bridge to Innovation
Gail Barrington, Barrington Research Group, Inc.
LEVEL - INTERMEDIATE
Prerequisites:
Participants can be evaluation practitioners at any level of experience but should have field experience, an awareness of the problems and challenges that can be encountered in practice, and an understanding of the importance of reflecting, reassessing, and developing fresh, new solutions.
 
Reflective practice is double-loop learning in action. If we think creatively about better outcomes, we can allow ourselves to experiment, innovate, and refocus, expanding our skills and adding value to our work. This workshop will explore why we should incorporate reflection more fully into our working lives. What barriers and issues stand in the way of innovation? What questions should we be asking? What reflective strategies can we use, and if we use them, what are the implications? Three reflective strategies will be described, and participants will have an opportunity to experiment with them and discuss their experiences. We will reflect on the links between reflection, innovation, and action and will leave with a personal strategy to incorporate into practice.
1:00 pm - 4:00 pm Making Theories of Change Meaningful: Troubleshooting & Technology
Elaine Stavnizky, Harry Cummings & Associates
Paula Richardson, Salanga
LEVEL - INTERMEDIATE


A program Theory of Change (ToC) can provide an important basis for evaluation by making explicit the results an initiative aims to achieve, why and how that change will happen, and key underlying assumptions. ToCs are a key tool to bridge understanding between organizations and their clients or donors. Despite this, developing a robust ToC can be challenging, and operationalizing and evaluating it even more so. This engaging, hands-on workshop will build on participants' existing knowledge of ToCs and reinforce their "bridge" by increasing their ability to analyze a ToC to ensure it has the key elements for meaningful monitoring and evaluation of programs. Participants will be asked to bring a current ToC and work with it throughout the workshop. Focusing on participants' pain points, the workshop will work through common challenges in working with ToCs, including making assumptions explicit, developing robust indicators, and fitting ToCs into logic models or frameworks. Participants will also learn how technology can help them design and track their outcomes.
1:00 pm - 4:00 pm Front-end Evaluation Planning: Using an Exploratory Evaluation Approach
Jacqueline Singh MPP, PhD, Qualitative Advantage, LLC
LEVEL - BEGINNER

This workshop introduces front-end evaluation planning using an exploratory evaluation approach useful for formative, summative, and developmental evaluations. Evaluability assessment (EA) should be undertaken at the front end to assess whether a program is ready for future evaluation, not whether it can be evaluated. EA clarifies goals, inputs, activities, outputs, and intended short-term, intermediate, and long-term outcomes. Through real-time examples and a hands-on activity, participants learn about EA and its practical utility for emerging or existing programs to: a) clarify stakeholders' expectations; b) focus alignment between program components, goals, and objectives; c) identify mental models and implicit assumptions; d) articulate purpose and the "right" questions for assessment, evaluation, research, and/or performance measurement; e) determine how best to approach evaluation design; and f) collect data useful for meaningful decision-making. Although not necessary, participants may bring a description of a program or other type of intervention they are working on.
1:00 pm - 4:00 pm Bridging Qualitative Data Analysis and Quality Control
Benoit Gauthier, Circum Inc., Carleton University, École nationale d'administration publique
Simon Roy, Goss Gilroy Inc.
LEVEL - INTERMEDIATE
Prerequisites:
Participants must have basic knowledge in qualitative analyses, as well as basic knowledge of how typical spreadsheet software operates.
 

Qualitative data analysis and quality control in program evaluations often lack rigour, leading to qualitative findings being discredited or used inappropriately. In this context, the workshop will present intermediate-level techniques to increase both the quality and usefulness of qualitative data. With exercises and examples of good and poor practice, the following will be covered: systematization of data coding and analysis using accessible spreadsheet software; matrix and narrative analysis techniques; pros and cons of proportion analyses; developing meaningful, streamlined summaries; combining multiple lines of evidence; and key quality control techniques throughout qualitative data planning, collection, analysis and reporting. While the techniques will be applicable to most qualitative methods, the workshop will focus on semi-structured interviews.
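Two of the techniques named above, a systematized coding matrix and proportion analysis, can be sketched in a few lines. A minimal illustration in pandas with invented interviews and theme codes (nothing here comes from the workshop materials):

```python
import pandas as pd

# Hypothetical coding matrix: one row per interview, one column per
# theme, cell = 1 if the theme was raised (all data invented).
coding = pd.DataFrame(
    {
        "access_barriers": [1, 1, 0, 1, 1],
        "staff_turnover":  [0, 1, 1, 0, 0],
        "funding_gaps":    [1, 0, 0, 1, 0],
    },
    index=[f"interview_{i}" for i in range(1, 6)],
)

# Proportion analysis: the share of interviews in which each theme
# appears. (The workshop also covers the pros and cons of reporting
# such counts for qualitative data.)
proportions = coding.mean().sort_values(ascending=False)
print(proportions)
```

Keeping the coding in a matrix like this is what makes quality control checks (double-coding comparisons, spot audits of cells against transcripts) straightforward.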
4:00 pm - 6:00 pm Free Time
6:00 pm - 9:00 pm Opening Reception, Silent Auction and Poster Presentations
Kick off the conference in a relaxed atmosphere as you participate in a silent auction and mingle with fellow delegates. Proceeds benefit the Canadian Evaluation Society Educational Fund (CESEF). A ticket is included in your conference registration. If you wish to donate an item for the auction, please contact Mariane Arsenault (marsenault@univeralia.com). For more information about CESEF or to make a cash donation, please visit http://cesef.memberlodge.org/.

Wednesday 29/05/2019
1:00 pm - 4:00 pm Dicey Digits: When Statistics are Misunderstood
Elaine Stavnizky, Harry Cummings & Associates
LEVEL - INTERMEDIATE
 
Prerequisites:
It is expected that participants will have previously taken a basic statistics course. Ideally, they will have had some exposure to or experience with designing sample sizes, conducting surveys and doing some quantitative analysis, but they may have done only descriptive analysis to date.
 

In this age of information and technology, donors are looking for more evidence of impact and proof that approaches are evidence-based. With the tens of thousands of dollars spent on data collection, we should be able to measure impact more reliably and accurately. Most evaluation consultants know enough about statistics to determine a typical sample size and conduct basic descriptive analysis; in other words, just enough to be dangerous. Few describe the quality of the resulting data by reporting confidence intervals, speaking to statistical significance, or presenting effect sizes. Few do the due diligence of conducting proper statistical testing before drawing conclusions from the results. The aim of this workshop is to help participants build on their existing statistical knowledge and use it to bridge the data results and their meaning for clients in ways that maintain the integrity of the findings.
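The quantities the description says are rarely reported, confidence intervals and effect sizes, are cheap to compute once the data are in hand. A minimal sketch with invented survey scores for two program sites, using only the Python standard library (the normal-approximation interval is a simplification; a t critical value would be more exact for samples this small):

```python
import math
import statistics

# Invented survey scores for two program sites (illustration only).
site_a = [72, 75, 70, 78, 74, 69, 77, 73]
site_b = [68, 66, 71, 65, 70, 67, 69, 64]

mean_a, mean_b = statistics.mean(site_a), statistics.mean(site_b)
sd_a, sd_b = statistics.stdev(site_a), statistics.stdev(site_b)
n_a, n_b = len(site_a), len(site_b)

diff = mean_a - mean_b

# Approximate 95% confidence interval for the difference in means
# (normal approximation with z = 1.96).
se = math.sqrt(sd_a**2 / n_a + sd_b**2 / n_b)
ci = (diff - 1.96 * se, diff + 1.96 * se)

# Cohen's d: difference in means divided by the pooled standard
# deviation, one common effect-size measure.
pooled_sd = math.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                      / (n_a + n_b - 2))
cohens_d = diff / pooled_sd

print(f"difference = {diff:.2f}, "
      f"95% CI approx. ({ci[0]:.2f}, {ci[1]:.2f}), d = {cohens_d:.2f}")
```

Reporting the interval and the effect size alongside the raw difference is exactly the kind of "quality of the resulting data" framing the workshop advocates.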
1:00 pm - 4:00 pm Evaluative Thinking to Bridge Inquiry, Evidence, and Learning
Thomas Archibald, Virginia Tech
Jane Buckley, JCB Consulting
LEVEL - BEGINNER

How does one "think like an evaluator"? How can program implementers learn to think like evaluators? Recent years have witnessed an increased use of the term "evaluative thinking," yet this particular way of thinking, reflecting, and reasoning is not always well understood. Patton warns that as attention to evaluative thinking has increased, we face the danger that the term "will become vacuous through sheer repetition and lip service" (2010, p. 162). This workshop can help avoid that pitfall. Drawing from our research and practice in evaluation capacity building, in this workshop we use discussion and hands-on activities to address: (1) what evaluative thinking (ET) is and how it pertains to your context; (2) how to promote and strengthen ET among the individuals and organizations with whom you work; and (3) how to use ET to identify assumptions, articulate program theory, and conduct evaluation with an emphasis on learning and adaptive management.
1:00 pm - 4:00 pm The Success Case Method: Evaluation as a Bridge to Program Impact
Daniela Schroeter, Western Michigan University
LEVEL - BEGINNER

The Success Case Method (SCM) is a theory-driven, utilization-focused, participatory approach to impact evaluation that uses mixed-methods questionnaires and interviews to identify high- and low-impact examples, with the ultimate goal of informing improvement of an intervention's processes and results. This hands-on workshop introduces the SCM and presents opportunities to practice elements of the method. By the end of the workshop, participants will understand the steps involved in applying the SCM, create a theory-driven impact model based on a case scenario, draft question sets suitable for identifying high- and low-success cases via web-based questionnaires, and develop possible interview questions for documenting stories of success and opportunities for improvement. The workshop concludes with a discussion of strengths, limitations, and opportunities associated with using the SCM in a range of evaluation contexts.
1:00 pm - 4:00 pm Plotting a Dynamic Journey: Intermediate Excel to Master Pivot Tables and Conditional Formatting for Quicker Thematic and Data
Carolyn Hoessler, Ryerson University
LEVEL - INTERMEDIATE
 
Prerequisites:
Participants need to be able to confidently enter numbers into Excel, create averages, and copy and paste in Excel. They also need to be able to interpret data tables.
 

In many projects, I find myself in a room of stakeholders discussing the data. As questions emerge, it becomes evident that they are interested in just-in-time data.

It takes skill as an evaluative leader to guide the discussion through increasingly complex datasets and even more complex interpersonal and power dynamics: being able to utilize and adapt pivot tables, charts, and conditional formatting can inform evidence-based change through engaging data discussions with stakeholders. This workshop is intended for individuals familiar with data entry in typical Excel datasheets with rows and columns.

Through this workshop you will develop technical practice competencies related to analyzing and interpreting data (new competency 2.8) and group facilitation skills related to data discussions (new competency 4.5). Recognizing and planning for the use of pivot data tables and charts that auto-update also has implications for effectively using human, financial and technical resources (new competency 4.3).
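The auto-updating pivot tables and conditional formatting described above are Excel features, but the underlying pattern, summarize then flag values crossing a threshold, translates directly to code. A minimal pandas sketch with invented feedback data and an arbitrary threshold (nothing here reflects the workshop's own datasets):

```python
import pandas as pd

# Hypothetical course-feedback responses (all values invented).
responses = pd.DataFrame({
    "department": ["Arts", "Arts", "Science", "Science", "Arts", "Science"],
    "term":       ["Fall", "Winter", "Fall", "Winter", "Winter", "Fall"],
    "rating":     [4.2, 3.1, 4.6, 3.8, 2.9, 4.4],
})

# Pivot: average rating by department and term. Re-running this on
# updated data is the scripted analogue of an auto-updating pivot table.
pivot = responses.pivot_table(index="department", columns="term",
                              values="rating", aggfunc="mean")

# "Conditional formatting": flag cells below a threshold chosen
# just-in-time for the stakeholder discussion.
flags = pivot < 3.5
print(pivot.round(2))
print(flags)
```

In a live data discussion, changing the threshold or swapping the grouping column takes seconds, which is what supports the just-in-time questions the blurb describes.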
1:00 pm - 4:00 pm CE Application Clinic
Natalie Kishchuk, CES Vice-President, Canadian Evaluation Society
Marthe Hurteau, Université du Québec à Montréal
This half-day clinic is intended to help applicants for the CES Credentialed Evaluator professional designation advance and complete their applications. The clinic will be free for those who have registered their application. Participants should bring a laptop and Word versions of all necessary documentation. Two bilingual Credentialing Board members will provide guidance and tips on preparing and completing the application.