Sun, 04/14/2013 to Tue, 04/16/2013

2013 EERS Conference 

The video series produced for the inaugural Eleanor Chelimsky Forum, with support from the Robert Wood Johnson Foundation, is available for viewing on the EERS YouTube Channel. Additional details about the Forum and videos are available here.

The 2013 conference was by all accounts a success. Thank you to all of the organizers, presenters, and attendees for your enthusiasm and support for this great event. The EERS board would like to extend special thanks to the industry sponsors that supported the conference, helping to fund student activities and publication of the program: Carson Research, ICF International, and Research for Better Schools.

The final Conference Program is still available via the link at the bottom of this page, as are copies of presentations from the conference sessions.

EERS Conference Event Honors Eleanor Chelimsky

The Eastern Evaluation Research Society (EERS), through the generous support of the Robert Wood Johnson Foundation, was able to initiate a new, groundbreaking offering. The Eleanor Chelimsky Forum honored Eleanor Chelimsky, one of the most insightful, influential, and respected program evaluators of our era.

The Forum premiered at the EERS 2013 Annual Spring Conference (April 14-16). In future years, the Chelimsky Forum will feature luminaries in the field of program evaluation who will address and interact with conference participants on prominent challenges in the fit between theory and practice.

The admiration for Ms. Chelimsky was evident when both Michael Quinn Patton and Thomas Schwandt, two of the field's most prominent contributors, immediately accepted the EERS board's invitation to be the inaugural speakers at the Forum.

More details about the 2013 Forum are available: Patton and Schwandt Present at Inaugural Eleanor Chelimsky Forum

 

The Eastern Evaluation Research Society held its 36th Annual Conference April 14-16, 2013, at its now-traditional home, the Seaview Resort and Spa in Absecon, New Jersey. EERS is pleased to continue its longstanding support for professional evaluators with high-quality offerings:

  • A forum for learning, networking, and sharing expertise in an intimate and welcoming atmosphere
  • The chance to learn new skills at multiple levels
  • Initiatives to nurture new evaluators and inspire seasoned evaluators
  • Opportunities for cross-disciplinary discussions about evaluation in both formal and informal contexts

As in past years, the conference provided a range of activities promoting interaction and exchange of ideas, as well as skill-building sessions, great keynote speakers, pre-conference workshops, the EERS Invited Authors Awardee program, and hosted meals and social activities.
 

Conference Activities

  • Sunday evening dessert reception
  • Monday continental breakfast
  • Monday keynote luncheon
  • Monday evening networking reception
  • Tuesday full breakfast buffet
  • Tuesday buffet luncheon
  • Snacks and beverages during conference breaks
     

Plenary Speakers

  • Susan Fuhrman (Keynote Speaker), President of Teachers College, Columbia University, and President of the National Academy of Education (NEW: Video of Susan Fuhrman's talk is now available online)
  • Jody Fitzpatrick, Associate Professor in the University of Colorado Denver School of Public Affairs, and American Evaluation Association President-Elect
  • Rakesh Mohan, Director of the Office of Performance Evaluations, Idaho State Legislature, and Recipient of the 2011 AEA Alva and Gunnar Myrdal Government Evaluation Award

 

Pre-Conference Workshops

The EERS conference provides opportunities for evaluators to gain hands-on skills. The 2013 conference again offered two pre-conference workshops, held on Sunday before the welcome dessert reception. Attendees could sign up for workshops online when registering for the conference. This year's pre-conference workshops included:

  • Using Logic Models to Support Implementation Fidelity: A Hands-on Workshop
    Jennifer Hamilton, Senior Study Director, Westat

    This workshop will provide a step-by-step guide to logic modeling and its importance in program evaluation. In this highly interactive workshop, participants will create their own logic models and explore how they can be used as a tool for measuring implementation fidelity. Jennifer Hamilton, Ph.D., is a Senior Study Director at Westat and specializes in evaluation methodology. She has lectured at the University of Georgia on measuring implementation fidelity and has a chapter on evaluation methodology forthcoming in a major textbook.
     
  • Communicating Data Clearly through Effective Graphs
    Naomi Robbins, Principal, NBR

    This workshop will help participants better understand the principles of presenting data through clear, concise, and accurate graphs; emphasize how to avoid common mistakes that produce confusing or misleading graphs; and examine the appropriate use of the scales along which data are graphed. Naomi Robbins, Ph.D., is a consultant and seminar leader who specializes in helping individuals and organizations present data effectively through graphs. She is the author of Creating More Effective Graphs and blogs for Forbes.

 

Conference Forms and Handouts

This year's conference featured a variety of plenary speakers, invited authors, individual sessions, panels, and skill-building sessions. Details of the workshop training sessions are provided at the bottom of this page. Slides from presentations are available below, in the order they appear in the conference program.

Attachment (Size)
2013 EERS Conference Program (3.86 MB)
CHELIMSKY FORUM - Introduction (26.48 KB)
CHELIMSKY FORUM - M. Q. Patton (855.77 KB)
CHELIMSKY FORUM - T. Schwandt (265.63 KB)
Gaspar & Henderson - Making the Most of Matching (107.63 KB)
Henderson, et al. - Balancing Rigor, Relevance, and Real-World Constraints (57.94 KB)
Karakus, et al. - Hone, Learn, and Fine Tune (275.15 KB)
Evergreen - 10 Steps to Effective Data Communication (141.62 KB)
Evergreen - 10 Steps to Effective Data Communication Handout (141.62 KB)
Franco, et al. - Using Student Growth Measures in Ohio's Educator Evaluation Systems (210.23 KB)
Merriman - Increasing Teacher Quality (304.87 KB)
Honadle - The Mirror Has Two Faces (493.82 KB)
Sloan - Procuring Buy-In to the Evaluation Process (1.7 MB)
Jennings & Newell - Slicing and Dicing the Results (222.05 KB)
McKie - Closing the Evaluation Loop (669.41 KB)
Fuhrman - Lessons Learned about Connecting Research to Policy (1.63 MB)
Burton - Using Pivot Tables and Pivot Charts in Excel (1.06 MB)
Wasbes - Causal Loop Diagrams (1.63 MB)
Griswold - A Novel Reverse Logic Mapping Approach (885.33 KB)
Bernstein - Government Evaluation and Performance Measurement (141.41 KB)
Creel & Szoc - Serving Two Masters (466.76 KB)
Shipman (Invited Author) - The Role of Context in Valuing Federal Programs (95.85 KB)
Feighan & Kirtcheva - Serving Clients in Education when "Findings" are Elusive (447.74 KB)
Kaplan et al. - Data for Change in the Whirlwind World of Policy Advocacy & Program Implementation (627.12 KB)
Goldberg & Shukla - Stakeholders as Advocates for Evaluation (1.06 MB)
Merola et al. - Rethinking How the ABCs are Used to Target Dropout Prevention Programs (587.55 KB)
Uekawa - High School Dropouts (535.17 KB)
Passa & Porowski - Pregnancy and Dropout (573.34 KB)
Fabian - The Final Report (84.47 KB)
Leicht et al. - Child Welfare Information Gateway's Collaborative Evaluation (865.27 KB)
Valado - Facilitating the Use of Evaluation Data for Continuous Quality Improvement (1.02 MB)
Barra & Coffey - Using Formative Evaluation Findings for Continuous Program Improvement (1.17 MB)
Cober - Sharing Results with Community Partners (1.75 MB)
Lewis Raymer - Evaluating Education (1.53 MB)
Clearfield & Trahan - Moving Beyond Limitations (1.75 MB)
Bisgaier & Sernyak - Information Roadblocks on the Health Care Reform Highway (697.65 KB)
Clarke & Kwong - Medicare Part D (317.51 KB)
Johns (Invited Author) - Evaluating NYC Smoke Free Parks and Beaches Law (199.46 KB)
Mohan - Making the Most of Evaluation Findings by Managing the Politics of Evaluation (411.5 KB)
Clarke & Long - Cognitive Interviewing (476.41 KB)
Marks, Barrett, Higgins Lynn & Campbell - Use of Mixed Methods in Formative Feedback for NC's RttT (837.42 KB)
Matano - Taking the Goldilocks Approach (995.7 KB)
Karras-Jean Gilles - What Factors Predict Research Participant Engagement (516.58 KB)
Gurvey - Cutting Through the Data Fog (373.88 KB)
Rae - Narrowing the Evaluation Focus on a Myriad of MillionTreesNYC (889.8 KB)
Waldhoff & Pichert - Using Early Evaluation to Maximize Evaluation Impact on Government Programs (273.43 KB)
Culnane - Dynamics of College Peer Mentoring (536.19 KB)
Saleem & Medina - The Advocacy Progress Planner (1.19 MB)
Engelman & McKlin - Social Network Analysis for the Novice (1.13 MB)
LaChenaye - "Tu ne connais pas rien!" (539.85 KB)
Archibald - Evaluative Thinking (1.93 MB)
Byrne et al. - Results of a Mixed-Methods Analysis of Evaluation Capacity Building (1.11 MB)
Arafeh & Quenoy - Evaluation and Data in a Learning Organization (602.77 KB)
Segal - Useful Dissertation (573.2 KB)