The Eastern Evaluation Research Society (EERS) is the oldest professional society for program evaluators in the United States. Founded in 1978, EERS remains a vibrant contributor to program evaluation practice and theory.

2017: Evaluation Looking Forward: Transitioning from Past to Future

Sun, 04/02/2017 to Tue, 04/04/2017


The 40th Anniversary EERS Annual Conference will be convened April 2-4, 2017

The Eastern Evaluation Research Society (EERS) is pleased to announce that the organization's 40th Annual Conference will return to its historical home at the Seaview Resort and Spa in Absecon, New Jersey, on April 2-4, 2017. Visit this site regularly for updated information about the conference and how evaluation professionals can be involved in this anniversary event. Consistent with this milestone in EERS history, the theme of this year's gathering is Evaluation Looking Forward: Transitioning from Past to Future. Conference registration is now open online.


Workshop Cancellation

Unfortunately, we have had to cancel the Text Analysis workshop originally scheduled for presentation by Tony Fujs. Attendees who pre-registered for that session should have been contacted regarding their options. If you have not been, please email Kirk Knestis for assistance.


Conference Presentation Proposals

Thank you to everyone who submitted proposals for individual sessions, panels, and ignites, and particularly to students who proposed to present at the EERS conference. Proposals have been reviewed and notices sent to successful presenters.


EERS Conference Sponsors

The EERS annual conference has in past years enjoyed gracious financial support from sponsors related to the evaluation industry. If you are interested in getting involved in that capacity, please email Tom Archibald.


Support EERS Conference Student Participation

EERS takes pride in encouraging the next generation of evaluators by supporting students whose presentation proposals are accepted with stipends and with free or subsidized meals and conference registration fees, a commitment exceeding $4,000 annually. You can make tax-deductible donations securely through PayPal or with a credit card, with all proceeds going to support costs for student participants. Click the button below to make a one-time $10 donation, and thank you for supporting our students!



Pre-Conference Workshops

EERS has a long history of providing popular evaluation practitioner workshops immediately prior to the formal opening of the annual conference. The 2017 pre-conference trainings are listed at the bottom of this page, with links to additional information about sessions and presenters.


The Eleanor Chelimsky Forum

In its 5th year for the 2017 conference, the Chelimsky Forum on Evaluation Theory and Practice focuses on the conference theme through the eyes of specially nominated figures in the evaluation field. Supported again by the Robert Wood Johnson Foundation, this year's Forum Plenary Presentation features longtime AEA and EERS contributors George Grob, President of the Center for Public Program Evaluation, and Melvin Mark, of Pennsylvania State University.


Technology and Social Media for the EERS Conference

News about this year's EERS conference is being tweeted at #EERS17.

Technology hardware and support will again be provided for the EERS conference, with specific guidance provided for conference presenters and conference attendees.


Conference Materials

Conference materials, including session presentations, will be uploaded and available below. Some presentation materials may have to be omitted due to restrictions imposed by funders or the need to protect evaluation participants' confidentiality.

2017 EERS Conference Call for Proposals (453.54 KB)
2017 EERS Call for Student Proposals (109.45 KB)
2017 EERS Conference Proposal Procedures (354.84 KB)
2017 EERS Conference Schedule-at-a-Glance Rev. 17 March (383.87 KB)
2017 EERS Conference Full Program (692.27 KB)
2017 EERS Conference Program Changes (207.61 KB)
CHELIMSKY FORUM: Grob - Evaluation Theory and Practice (1.07 MB)
CHELIMSKY FORUM: Mark - Discussant Response (693.98 KB)
Bernstein - Monitoring and Evaluating Inclusive Program Practices (1.15 MB)
Barnow - Evaluating Partnerships in Workforce Development Programs (789.12 KB)
Rog - Designing and Managing Multi-site Evaluations (1.54 MB)
Wholey - Evaluating Performance Partnerships (327.35 KB)
Samtani et al - Evaluating the Evaluators (1.38 MB)
STUDENT AWARD: Ulysse - Extending the Evaluator's Reach (1.03 MB)
STUDENT AWARD: Cerosaletti - Data-based Faculty Persona Development (773.13 KB)
Gunning - So Happy Together (1022.19 KB)
Weissenberg - Electronic Medical Records and the Health System (691.13 KB)
Hyde - They Say a Secure Future Requires a Diversified Portfolio (790.77 KB)
Robles & Nerino - Bringing the Community Back In (1.38 MB)
Zandniapour - The Social Innovation Fund National Assessment (1.11 MB)
Mitchell & Tetreault - Expanding the Evaluation Toolbox (1.36 MB)
Travis & Kuhn - Got 10 Weeks? (1.39 MB)
Chen - Interrupted Time Series Analysis for the Evaluation of Healthy Schools Program (666.91 KB)
Wang - Retention and Completion: A Meta-analysis of MOOC Interventions (725.87 KB)
Herrling - Tracking Youth for Evaluation on the South Side of Chicago (332.14 KB)
Powers - Tracking New York City Students from High School into College with StudentTracker (951.82 KB)
Weissenberg - Enhancing Participant Tracking Strategies (1.91 MB)
Gadomski et al - Evolving from Collaboration to Productivity (2.67 MB)
Roach & Laiyi - An Innovative Approach to Evaluating a National Network of Collective Impact Initiatives (555.2 KB)
Herman et al - Social Capital and Evaluation Design (1.61 MB)
PANEL: The Future of Place-based Evaluation (554.82 KB)
Matano et al - Beyond the 'EASY' Button (2.17 MB)
Matano et al - Beyond the 'EASY' Button Handout (1.55 MB)
IGNITE: Edmunds - Spicing It Up (764.65 KB)
IGNITE: Stroble - You Want Me to Measure What?! (2.28 MB)
IGNITE: Mortillaro - What to Expect when Collecting Survey Data at Multi-language Programs (1.5 MB)
IGNITE: Smith - Weighing Our Options (494.71 KB)
IGNITE: Turner - Evaluation of Summer Learning Loss Prevention (1.52 MB)
IGNITE: Glassman - The Echo of Values and Discipline in Evaluation (1.47 MB)
Mark - Belts & Suspenders, O Rings, and Plan B (1.1 MB)
Long & MacDonald - Effective Approaches for Recruiting Hard-to-reach Research Subjects (411.84 KB)
Hamilton - The Advantages of Adding a Matched Group to an Existing Experimental Design (358 KB)
Villamil - Building Internal M&E Capacity to Better Evaluate Women Leadership's Outcomes (182.63 KB)
McBride - Imagine: Evaluation Sparking Social Change (1.65 MB)
McBride - Culturally Responsive Evaluation Handout (514.39 KB)
Johri & Hewitt - Going to BAT: Demonstrating the Interoperability of Blackboard Assessment Tools (2.89 MB)
Scuello & Zhu - Designing an Effective Well-matched Comparison Group Study (238.96 KB)
PANEL: Shifting Course to Stay on Course: A Learning Organization Reflects (3.05 MB)
PANEL: Rapid Cycle Learning: From Program Design to Implementation (2.88 MB)
Grim - Less is More: What Evaluation Can Learn from Essentialism and Mindfulness (14.59 MB)
Archibald - Looking Back to Move Forward: Campbell's Experimenting Society Reconsidered (611.14 KB)
Zandniapour - Introducing the Impact Evaluability Assessment Tool (1.19 MB)
PANEL: Developing Organizational Learning around Emerging M&E Approaches (2.86 MB)
Blanc & Uzwiak - Developing Clarity about Process and Outcomes in a STEM Dissemination Center (823.49 KB)
Pirozzi & Siblini - Measuring the Un-quantifiable: Advisory Services & Policy Advice (747.24 KB)
STUDENT AWARD: Woolston - Bridging the Divide between Research & Practice (1.78 MB)

Trainings at this Conference

Connect with EERS

EERS on Twitter

Register for the EERS Email List

Visit the EERS YouTube Channel

Tweet the 2017 Conference!