The following is a schedule of EERS conference presentations and activities, in its most current form, with sessions added as they are confirmed and scheduled. This page is designed to allow attendees to follow the program on a mobile device.
Note that presentations listed for the same time in the same location are grouped as a concurrent session. They will be presented in the order listed during their scheduled time slot.
Using Rapid Cycle Learning to Infuse a Culture of Continuous Learning
How do we go about setting and answering key learning questions that will both build real-time evaluation capacity and feed into program implementation and outcomes achievement? Rapid cycle learning (RCL) has been used in places as varied as the military, the private sector, emergency response, and healthcare. As evaluators we can use RCL to institutionalize […]
Doing the Right Thing: Community Engagement and Empowerment
Evaluators are motivated individually and professionally to do the right thing. We want to make sure every voice is heard and that we tell a story that is true and empowering. Our traditional tools such as focus groups and interviews have limitations and often require the evaluator to choose between breadth and depth. We believe […]
Hospitality Suite Networking
An opportunity to join EERS board members and conference attendees for informal socializing and networking, with snacks and refreshments.
Continental Breakfast
Self-serve breakfast buffet for all participants.
Opening Breakfast Plenary – Data, Evidence, and Evaluation: Balancing Opportunities and Risk
Sherry Glied is featured in this official welcome and formal kick-off of the EERS annual conference. The Commission on Evidence-Based Policymaking released its report in September 2017, and both Congress and the Executive branch have taken steps to put into place several of its recommendations. This presentation will summarize the Commission’s work and address some […]
Effective Communication Skills While Facilitating Interviews and Focus Groups
Ouen Hunter, M.S., M.S.W., Western Michigan University; Jeffrey Hillman, M.P.A., Western Michigan University
This interactive session will use hands-on exercises to emphasize how verbal and non-verbal communication skills will help evaluators gather and tell the stories of their clients and organizations, based on the presenters’ experiences in clinical social work and health care administration.
Weaving Communication into the Fabric of Evaluation
Miriah Russo Kelly, Evaluation Specialist, University of Connecticut; Teresa McCoy, Assistant Director of Evaluation & Assessment, University of Maryland Extension
This presentation seeks to inspire participants to think differently about their evaluation communication efforts, and will arm participants with practical ways to integrate appropriate communication techniques into their practice.
Making Success: Evaluating a School District’s Integration of Making into its Middle and High School
Keith Trahan, PhD, Associate Director; Stephanie Maietta Romero, EdD, Research Associate; Renata Almeida Ramos, Graduate Student Researcher; Cynthia Tananis, EdD, Director, Center for Evaluation and Assessment of Capacity, School of Education, University of Pittsburgh
This presentation will report on year one of a two-year National Science Foundation (NSF) funded project working to uncover and describe […]
Comprehensively Evaluating an Urban District’s Growing English Learners Program
Jill Hendrickson Lohmeier, Ph.D., University of Massachusetts Lowell; Shanna Thompson, Ed.D., University of Massachusetts Lowell; Nadine Ekstrom, C.A.G.S., University of Massachusetts Lowell
This presentation will describe an evaluation of a rapidly growing urban district’s English Learners program.
Using Evidence to Evaluate the Barriers to On-Time Kindergarten Registration and Inform Policy and Process
Kristyn Stewart, Senior Research Associate, The School District of Philadelphia; Ryan Fink, Research Specialist, Consortium for Policy Research in Education
This session will discuss the implications of late kindergarten registration for a large urban school district and the collaborative evaluation of the registration process to identify barriers to inform and influence policy and practice.
Embracing DEI as an Organizational Value: What We’ve Learned on this Journey with our Team and Clients
More than ever, evaluators must embrace principles and practices that allow them to advance diversity, equity, and inclusion (DEI) within their organizations – whether independent firms, academic centers, or evaluation teams. Evaluators also play an important role in helping clients address DEI issues. By engaging in equitable approaches that are actionable in their organizations, evaluators […]
Soup-to-Nuts: Understanding the Challenges of Generating and Using Evidence in Government
The Executive and Legislative Branches of government share common challenges in successfully implementing evidence initiatives. This panel will highlight lessons learned about the generation and use of evidence in the Executive Branch during the Bush and Obama administrations. A second presentation will highlight the institutional challenges in the Congress for using evidence successfully across the […]
Embracing Evidence: Lessons from Supporting Federal Evidence-Based Grant Making Programs
Michael Long, Principal, ICF; Katelyn Sedelmyer, Associate, ICF
This presentation offers insight into how federal grantees are being encouraged to incorporate evidence in a variety of ways, and the implications for evaluators.
Tracking Longitudinal Studies
Matthew Von Hendy, Principal, Green Heron Information Services
This presentation will cover some of the publicly available longitudinal studies being done in the US and Europe, provide resources to locate additional longitudinal studies, and suggest ways to locate longitudinal studies in the peer-reviewed and grey literatures.
Mapping as a Policy Evaluation Tool: Lessons from Pennsylvania’s Tobacco Prevention and Control Program (PADOH)
Alexandra Ernst, Project Manager, Public Health Management Corporation (PHMC); Sue McLain, Public Health Program Manager, Pennsylvania Department of Health; Laura McCann, Project Manager, Public Health Management Corporation; Mark Modugno, Public Health Program Administrator, Pennsylvania Department of Health; Duane Barksdale, Research Assistant, Public Health Management Corporation; Jennifer Dickson Keith, Deputy Director, R&E Group
As part of […]
Lunch Keynote – Equity, Ethics and Evidence: What’s new in AEA and the field?
Join current AEA president Leslie Goodyear to learn about recent AEA initiatives related to Equity, Ethics and Evidence, and engage in dialogue about important issues facing the field of evaluation. Includes a catered sit-down lunch. What’s going on in the American Evaluation Association? And how can you get involved? Leslie’s talk will connect the EERS […]
Issues of Equity, Ethics, and Evidence in a Big Messy Evaluation
This panel will discuss struggles to uphold equity and ethics, and provide evidence in a big messy evaluation of an education initiative. This large-scale evaluation targeted teacher and leader professional development with the ultimate goal of improving student achievement. This three-study evaluation aimed to study the initiative’s theory of action and examine the implementation and […]
Is it Completely Dead, or Can We Save it? An Examination of a Real-world Evaluation Dilemma
Cheryl M. Ackerman, Director of Evaluation, Delaware Environmental Institute, University of Delaware; Jennifer Gallo-Fox, Assistant Professor, Human Development and Family Sciences, University of Delaware; Danielle Ford, Associate Professor, School of Education, University of Delaware; Susan McGeary, Associate Professor, Department of Geological Sciences, University of Delaware
This session will explore how an opportunity to examine methodological […]
Common Methodological Issues Related to Faulty Evidence
Stacey S. Merola, President and Principal Scientist, Merola Research LLC
This presentation will engage the audience in a discussion of common methodological problems that have led to the generation, and possible subsequent widespread dissemination, of faulty evidence.
Working with Assumptions to Enhance Program Effectiveness
Apollo M. Nkwake, Associate Research Professor of International Affairs, The George Washington University’s Institute for Disaster and Fragility Resilience, Elliott School of International Affairs
In this presentation, the critical validation of assumptions in evaluation is discussed as essential to effective and ethical evaluations and the achievement of equitable growth. This presentation will discuss a typology […]
Better Programs through Evaluation Capacity-Building: A State Level Example
Patricia Moore Shaffer, Ph.D., President, Shaffer Evaluation Group; Kristi E. Wagner, Ed.D., Senior Research Associate, Shaffer Evaluation Group
The session presents a case study of a capacity building project with a state level agency seeking to increase the evaluation capacity of their grantees through training and technical assistance, as well as building their own evaluation […]
Balancing Standardized and Unique Quantitative Indicators of Progress
Robert Roach, Senior Consultant, Equal Measure; Laiyi Wei, Consultant, Equal Measure
We will present how we process and analyze varied indicators that collective impact partnerships use to track their progress toward their goals; how we have presented this information to various audiences; and how stakeholders have leveraged this data to demonstrate the effectiveness of work […]
Student Award Presentation – Extending the Golden Spike to include Evidence and Practice Mapping of Boy Scouts of America
Rachael Doubledee, Doctoral Research Assistant, Montclair State University; Derek Morgan, Graduate Research Assistant, Montclair State University; Anna Maria Gilgar, Research Assistant, Montclair State University; Lauren Gama, Research Associate, Montclair State University; Johanna S. Quinn, Postdoctoral Researcher, Montclair State University; E. Danielle Roberts, Postdoctoral Researcher, Montclair State University; Miriam R. Linver, Professor, Montclair State University; Jennifer […]
A Collective Impact Approach: Implications for Network Development and Evaluation Capacity Building
Lorien E. MacAuley, PhD, Instructor, Department of Agricultural, Leadership, and Community Education, Virginia Tech; Kim L. Niewolny, PhD, Associate Professor, Department of Agricultural, Leadership, and Community Education, Virginia Tech; Thomas G. Archibald, PhD, Assistant Professor, Department of Agricultural, Leadership, and Community Education, Virginia Tech
This presentation describes how our collective impact approach with Virginia beginning […]
Turning Practice into Theory: Helping Grassroots Organizations Develop Theories of Change & Logic Models
Andrew MacDonald, Senior Associate, ICF; Michael Long, Principal, ICF
This presentation outlines practical strategies researchers can use to help grassroots organizations develop their first theory of change and logic model and prepare for an evaluation.
Community-Driven Tool Development: A Catalyst for Evaluation Success
Michelle Munsey, Evaluator at Partnerships for Health, LLC; Sarah Lewis, Program Coordinator at Maine Access Immigrant Network
This presentation will focus on employing a community-driven approach to developing evaluation tools to ensure the incorporation of community voice into the data collection process and to promote the collection of quality data.
From Practice to Praxis: Orienting New Evaluators to the Intersection between Theorists and Practitioners
Michelle Mitchell, Executive Director at Partnerships For Health, LLC; David Bell, Professor at Clark University
Evaluation is a relatively young field of practice with practitioners and theorists emerging from many different disciplines. The convergence of disciplines has resulted in a rich and eclectic field of practice and a multitude of distinct and sometimes disparate theories […]
Student Award Presentation – Dalit and Janjati Women Breaking the Cycle of Poverty: Remote Design M&E for ETC
Jacqueline Toppin, M.S. in International Affairs, The New School; Tim Sughrue, M.A., The New School
This presentation will explore how a remotely designed monitoring and evaluation toolkit could augment Educate the Children’s ability to provide innovative services to alleviate the impact of poverty on Dalit and Janjati women’s lives and improve their health and well-being […]
Is Feminist International Development Policy Effective?
Marje Aksli
This presentation will explore equity, ethics, and evidence in assessing Canada’s feminist international development policy.
Women Leadership and Empowerment: Understanding Context and Culture for Ethical Data Collection
Angelo Gamarra, Monitoring, Evaluation and Learning Specialist, Vital Voices Global Partnership
This presentation offers valuable lessons in understanding the context and inequities of women leaders in Kenya, India and Argentina while creating an ethical and safe space to gather evidence.
Increasing the Visibility of Public Health: Integrating Evaluation and Communications
Ashley Tetreault, Evaluator, Partnerships for Health
This presentation will discuss how evaluation may hold the key to telling the story of public health, making it more concrete and visible through collaboration between communication experts and evaluators to articulate results effectively and efficiently.
Applying a Collective Impact Model to Evaluate an Ongoing Urban Postsecondary Achievement Initiative
Ishwar Bridgelal, Research Associate, CUNY Center for Advanced Study in Education; Deborah Hecht, Interim Director and Senior Research Scientist, CUNY Center for Advanced Study in Education
This presentation describes an attempt to devise a collective impact model as an evaluative framework for organizational harmony in a large urban postsecondary achievement initiative entering its fourth year […]
Measuring Capacity Strengthening: Lessons from the Disaster Resilience Leadership Program
Apollo M. Nkwake, The George Washington University’s Institute for Disaster and Fragility Resilience; Ky Luu, The George Washington University’s Institute for Disaster and Fragility Resilience; Deborah B. Elzie, The George Washington University’s Institute for Disaster and Fragility Resilience; Eric Corzine, The George Washington University’s Institute for Disaster and Fragility Resilience; Courtni Blackstone, The George Washington […]
Building Evaluation Capacity through the Partnerships for Advancing Character Program Evaluation Project
Though youth character education programs are internationally popular, a lack of formal evaluation, compounded by limited time, funding, evaluation knowledge, and capacity, makes their impact unclear. The PACE Project offers an innovative solution through evaluation capacity building, matching 8 evaluators with 32 program staff from across the country in a multi-faceted, year-long program. PACE prepares […]
What You See is Not What You Get: Strategies for Overcoming Bias in Evaluation
Kirsten Pagan, Student Engagement Platform Coordinator, Office of Student Affairs Assessment, Binghamton University; Miriam Bartschi, Assessment Consultant, Office of Student Affairs Assessment, Binghamton University
The session will begin by defining the problem of bias in assessment. Participants will explore this issue and identify actionable strategies to mitigate its effects. The presenters will facilitate dialogue among […]
Assessment and Improvement of a Data System: Strategies and Lessons Learned from a Non-profit
Elena Pinzon O’Quinn, Learning and Evaluation Director, Latin American Youth Center; Charles Riebeling, Deputy Director of Accountability, Carlos Rosario Intermediate Public Charter School
This session will outline the steps taken by a non-profit organization to assess a data and evaluation system and decide how the system could be improved. We will present our lessons learned […]
Using Formative Evaluation to Improve Cultural Responsiveness to Measure Program Impact
Katie Lu Clougherty, Program Data Coordinator, DC SCORES; Lorena Palacios, Latino Engagement Coordinator, DC SCORES
This presentation focuses on how an after-school non-profit uses critical reflexivity and formative evaluation internally to improve the cultural responsiveness of its program impact evaluation and advance educational equity.
Using Evidence to Inform and Evaluate Implementation of a District-Wide Early Literacy Improvement Strategy
Kristyn Stewart, Senior Research Associate, The School District of Philadelphia; Katie Mosher, Research Specialist, The School District of Philadelphia
This session will focus on the evaluation of the implementation and outcomes of Early Literacy improvement strategies in The School District of Philadelphia, including the research behind the strategy; the collection of data across 150 schools; […]
Crossing Boundaries to Assess Young Children’s Financial Literacy
Daniel Light, Research Scientist, Education Development Center; Marion Goldstein, Senior Research Associate, Education Development Center; Ashley Lewis Presser, Research Scientist, Education Development Center; Tiffany Maxon, Research Associate, Education Development Center; Elizabeth Pierson, Research Associate, Education Development Center; Min-Kyung S. Park, Research Associate, Education Development Center
This presentation will discuss efforts, challenges, and opportunities involved in […]
Graduate Student Network Opportunity: Career Talk with the Experts
Are you a new evaluator or at a decision point in your career? Get advice from seasoned professionals in a relaxed setting. Several members of the EERS Board of Directors will be available to share their experiences and talk with you informally about your aspirations and how to chart your course.
Student Evaluator Poster Presentations and Networking Reception
Another opportunity for networking, the poster session for student evaluators is a longtime tradition at the EERS conference. This year’s snack reception includes the following poster presenters, selected from submissions by EERS Board reviewers:
Evaluating the Impact of Short-Term Study Abroad Program on Intercultural Competence
Meng Fan, Doctoral Student in Evaluation, Measurement and Statistics, University […]
Hospitality Suite Networking
An opportunity to join EERS board members and conference attendees in an informal social and networking setting, with snacks and refreshments.
Featured Speaker Breakfast – “We are the Ones We’ve Been Waiting For…”: Research, Equity and Valuing the Collective
Join conference attendees for a breakfast buffet and discussion led by Patrice Fenton. Research is often accepted as incremental in the way it cultivates change. So how do we use research to interrupt systemic inequity in a manner that is both efficient and impactful? This talk will illuminate why equity-based research agendas should take an identity-responsive approach to valuing and serving the collective. The public educational […]
Show Me the Money: Using A Systematic Research Approach to Finding RFPs (US Contracting Opportunities)
Matthew Von Hendy, Principal, Green Heron Information Services
This session will provide an overview of a systematic approach to finding the right Request for Proposal (RFP) opportunities. Topics to be covered include: searching FedBizOpps for federal evaluation RFPs, using commercial RFP-finding services for state and local government RFPs, negotiating terms of agreements, general search techniques […]
Preparing Novice Evaluators: A Comparison of Pedagogical Models of Reflective Practice and Learning
David Bell, Associate Professor, Department of International Development, Clark University
This presentation will problematize the range of teaching-and-learning methods used in the preparation of novice evaluators, particularly around issues of ethics and equity.
What’s the Problem Represented to be? Problem Definition Analysis as a Tool for Evaluative Thinking
Thomas Archibald, Assistant Professor, Virginia Tech
The lack of problem problematization in evaluation is problematic; this presentation puts forward Carol Bacchi’s ‘What’s the Problem Represented to Be?’ approach as a useful tool to focus evaluative thinking on the frequently tacit step of problem definition.
(Not) Localizing International Evaluation
Mark Johnson, Assistant Professor of Practice, Graduate Program in International Affairs, The New School
This presentation will discuss how the international donor community’s “localization of aid” commitment to funding and capacity building of national and local organizations has not trickled down to most small organizations, whose M&E continues to be conducted by external evaluators rather […]
Black and Latino Males Challenging Archetypal Masculinity in an Out of School Time Program
Mellie Torres, Ph.D., Visiting Assistant Professor, The Graduate Center at the City University of New York
This presentation highlights a mixed-methods evaluation of an out of school time program focused on manhood development and violence prevention for Black and Latino high school males.
Using an Implementation Science Framework to Evaluate a Center of Excellence for Transgender Healthcare
Anne Gadomski, MD, MPH, Bassett Healthcare Network, Research Institute, Cooperstown, NY; Christopher Wolf-Gould, MD, Bassett Healthcare Network, Gender Wellness Center, Oneonta, NY; Carolyn Wolf-Gould, MD, Bassett Healthcare Network, Gender Wellness Center, Oneonta, NY; Justine Woolner-Wise, LMSW, Bassett Healthcare Network, Gender Wellness Center, Oneonta, NY; Patti Noon, LMSW, Bassett Healthcare Network, Gender Wellness Center, Oneonta, NY […]
Virtuoso Logic Modeling: Making “Condition Models” Sing and Dance
Kirk Knestis, PhD, CEO, Hezel Associates
This skill-building session will share the presenter’s “condition model” logic mapping approach, designed to knock down barriers inherent in typical modeling strategies to effectively guide planning and implementation in instances involving multifaceted interventions; complex mediating/moderating variables; outcomes for multiple, related stakeholder groups; or multi-site, multi-level implementation models. Participants will […]
An Urban Library’s “RedBox:” Engaging the Machine to Make it Part of the Community
Rekha Shukla, Research Associate, Barbara Goldberg & Associates, LLC
This presentation illustrates the case of an urban library, where the project planners’ intentions to provide a valuable resource to a community fell short, and how evaluation helped community voices reach administrators so that the project would be more accepted and effective.
Working Smarter Not Harder: Setting Better Targets with Proactive Evaluation
Lauren Rice, Data Quality Coordinator, The Urban Alliance; Susan Andrzejewski, Program Evaluation Coordinator, The Urban Alliance
Urban Alliance evaluators share a target-setting tool that instigates data-driven decision making among frontline staff to improve programming for youth.
Redefining Implementation Context
Ashley Lewis Presser, Ph.D., Research Scientist, Education Development Center; Maggie Scalli, Research Assistant, Middle Grades Career Mentors Project
This presentation will focus on the formative evaluation of an intervention that seeks to improve middle grade students’ knowledge of, and interest in, pursuing STEM-related Career and Technical Education; in particular, how the implementation context changed from […]
When the Evidence is Hiding in Plain Sight: Using Adaptive Evaluation to Tell the Story of BHS
Jill Scheibler, PhD, Senior Research Analyst, Carson Research Consulting, Inc.; Sheila Matano, MPH, Senior Research Analyst, Carson Research Consulting, Inc.
During this session, we will discuss our experiences combining traditional and unconventional evaluation techniques to show the impact and tell the story of a nonprofit organization tackling maternal health inequities in Baltimore.
Student Award Presentation – Including Underrepresented Stakeholder Voices in Evaluation: Yes, We Mean ALL of Them!
Lisa M. Chauveron, RYTE Institute Doctoral Candidate, Montclair State University
Using both process and mixed methods findings, this presentation will discuss strategies and benefits of enhancing the role of stakeholders in evaluation designs.
Is Evaluator-focused Meta-evaluation Occurring or Just Theorized?
Michael A. Harnar, Ph.D., Assistant Professor, Western Michigan University; Juna Snow, Innovated Consulting, LLC; Tara Lightner, Western Michigan University; Jeffrey Hillman, Western Michigan University
This session demystifies the concept of internal meta-evaluation and brings a more manageable description of practice improvement that provides a useful tool for practitioners, recognizing the very real possibility that evaluators […]
The Polysemy of Impact at a Research 1 Land Grant University
Elli Travis, Economic Development Specialist, Virginia Tech
This presentation will be an exploration of methods, successes, failures and ethics of evaluating the impact of research and extension efforts on the economy.
The Logic Model Repair Shop: Why Most Logic Models are Broken and How We Can Fix Them
Three smart people talk about logic models. Logic models are broken. We talk about how to fix them. But the fixes are broken as well. Crazy new idea introduced. Magic happens. The audience is astonished because their conceptual reality of logic models is challenged. There are hushed whispers, doubting nods, and joyful exclamations of ‘YES!’ […]
Buffet Luncheon and Round Table Discussion
A wrap-up event for the conference, with buffet lunch. A panel of EERS Board members will facilitate a round robin discussion about the conference theme: equity, ethics, and evidence. What did we learn? What tensions continue to perplex us? Have we found new methods or tools to address these challenges?
EERS Board Meeting
All conference participants are welcome.