

2019 Fall Exchange: Driving Change

September 27, 2019 | Athens, OH

The theme for this year’s conferences is Driving Change, which was identified as a natural evolution from last year’s theme of Effective Storytelling. If you participated in the spring exchange, one of the many messages you may have taken away is that using evaluation to drive change is not an ancillary task but is essential to demonstrating the value of what we do.

In many ways the true power and promise of evaluation is determined by the extent to which findings are used. Because of this critical function, the conference was intentionally designed to create space to learn, engage, and reflect on how we can catalyze the use of evaluation knowledge.

We thank all of our presenters for their willingness to share their expertise, and we thank our attendees for their participation.

We hope you had a wonderful day! 

 - Lana Rucks, The Rucks Group, OPEG President


Presentation resources are available through the Members Only Content area.

Welcome: Marsha Lewis, PhD, Head Associate Dean at the Voinovich School of Leadership and Public Affairs

Marsha Lewis currently teaches in the MPA program and manages applied research projects related to education and public sector strategy development. She also serves as a senior data analyst for research and evaluation projects. Marsha helped develop the Ohio University Executive Leadership Institute and served as the Institute’s managing director for five years. She holds a Ph.D. in educational research and evaluation with concentrations in statistical analysis and psychometrics. Marsha is one of Ohio University's lead researchers in the Ohio Education Research Center. 

Introduction: Lana Rucks, PhD, The Rucks Group LLC & OPEG President

Lana Rucks, PhD, brings to her work more than two decades of professional history including over 15 years of research and evaluation experience. She has extensive professional and educational knowledge of designing and implementing research and program evaluation efforts. She has led dozens of evaluation initiatives funded by various federal agencies such as the Centers for Disease Control and Prevention, Department of Labor, National Institutes of Health, and National Science Foundation. Dr. Rucks has taught at Sinclair Community College, the University of Dayton, and Wright State University’s School of Professional Psychology. Dr. Rucks is a member of AEA and OPEG. She earned a doctorate and master’s degree in social psychology with a concentration in quantitative methods from The Ohio State University. She also earned a master’s degree in experimental psychology from the University of Dayton. At Ohio Wesleyan University she earned a bachelor of arts degree in psychology with a concentration in chemistry.

Keynote Speaker

Using a Mixed Methods Perspective & Large Datasets to Address the Tensions between Internal & External Validity: Implications for Program Evaluation in Rural Schools | John Hitchcock, PhD, Principal Associate, Social & Economic Policy 

Abstract: This address will present a continuation of aspirational thinking around the use of large datasets and the mixed methods paradigm to yield new ways to engage in program evaluation and, more broadly, social science research. The address will begin with a primer on broad tensions between internal and external validity within the context of the evidence-based intervention movement in education. It will then shift to an overview of how different federal, state, and privately funded datasets might begin to be merged and analyzed using visual approaches to develop a deep understanding of context before, during, and after an evaluation. Such inquiry is conceptualized via a mixed methods perspective to take advantage of different ways in which social scientists engage in causal inference and generalize and transfer findings. Finally, to establish the potential practicality of these ideas, their possible use will be described with respect to evaluations conducted in the context of rural K-12 schools.

John H. Hitchcock is a principal associate in the Social and Economic Policy division of Abt Associates. He has held tenured faculty appointments at Ohio University and at Indiana University. At each university, he also served as a director of a research center. He earned a PhD in educational psychology, a certificate of advanced study in applied research focusing on program evaluation, and a master’s degree in educational psychology and statistics from the University at Albany, State University of New York. He earned a bachelor’s degree in psychology from the University of Evansville. Hitchcock focuses on developing mixed methods research applications, evaluating interventions designed for children with special learning needs, and research syntheses. To date, he has served as a co-principal investigator on four randomized controlled trials (RCTs), helped the U.S. Department of Education develop standards for evaluating the causal validity of single-case research designs, coauthored more than 50 scholarly works (peer-reviewed journal articles, books, national technical reports, and book chapters), and presented research at conferences more than 125 times. He is currently a co-PI on three RCTs, each of which is testing interventions for children with special learning needs, and serves as a co-lead of the What Works Clearinghouse review of interventions designed to yield supportive learning environments. Hitchcock has coauthored grant and contract applications that have led to more than $10 million in funding from federal, state, philanthropic, and international agencies. He has served as an associate editor or board member for School Psychology Review since 2011 and is currently co-editor in chief of the newly developed International Journal of Multiple Research Approaches.

Presentations

Principles of Cost-Benefit Analysis: Measuring Social Bottom Line | Rob Moore, MPP, Scioto Analysis 

Abstract: Cost-benefit analysis is a key tool for assessing the value of a program. By quantifying a program’s costs and benefits, evaluators can communicate its value to quantitatively minded financiers and can uncover and catalogue its impacts. For these reasons, cost-benefit analysis is a powerful tool in the program evaluator’s toolkit.

This session will cover the basic best practices of cost-benefit analysis. It will provide a broad overview of monetization of direct and indirect costs and benefits, measurement against a baseline, disclosure of assumptions, discounting, and sensitivity analysis. Treatment of the topic will focus on program evaluation, with discussion of the differences between public- and private-sector evaluation.

The session will also touch on cost-effectiveness analysis, cost-benefit analysis’s cousin focused on more narrowly defined outcomes. Cost-effectiveness analysis can be a good substitute for benefit-cost analysis when focusing on specific outcomes, particularly in health evaluation.

In addition, the session will incorporate examples so participants can see cost-benefit analysis in practice.
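For readers less familiar with the discounting and sensitivity-analysis steps mentioned above, the sketch below illustrates one common way they are calculated. It is a minimal example with invented figures and discount rates; it is not drawn from the session materials.

```python
# A minimal sketch of discounting and a simple sensitivity analysis within a
# cost-benefit analysis. All figures and rates here are hypothetical.

def npv(net_benefits, rate):
    """Discount a stream of annual net benefits (monetized benefits minus
    costs, measured against a baseline) back to present value."""
    return sum(nb / (1 + rate) ** t for t, nb in enumerate(net_benefits))

# Year 0 reflects upfront program costs; later years reflect monetized benefits.
annual_net_benefits = [-100_000, 30_000, 40_000, 45_000, 45_000]

# Sensitivity analysis: recompute the net present value across several
# plausible discount rates and report each result.
for rate in (0.03, 0.05, 0.07):
    print(f"Discount rate {rate:.0%}: NPV = ${npv(annual_net_benefits, rate):,.0f}")
```

A disclosed assumption here is the choice of discount rate; reporting results across a range of rates, as the loop does, is the simplest form of sensitivity analysis.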

Rob Moore is the principal for Scioto Analysis. Rob has worked as an analyst in the public and nonprofit sectors and has analyzed diverse issue areas such as economic development, environment, education, and public health. He is currently a board member for Gross National Happiness USA, a national grassroots organization promoting multidimensional measurement of well-being in society, and contributes a regular local politics column to Columbus Alive. Rob maintains memberships with the Association for Public Policy Analysis and Management and the Society for Benefit-Cost Analysis.

Before becoming an analyst, Rob was a community organizer in Omaha, Nebraska. He holds a Master of Public Policy from the University of California Berkeley’s Goldman School of Public Policy and a Bachelor of Arts in Philosophy from Denison University. He is also a registered parliamentarian with the National Association of Parliamentarians. In his free time, Rob enjoys travel, film, and synthpop and is active in Columbus’s improvisational comedy scene.

Using the Systems Iceberg as an Evaluation Tool to Explore Structure, Complexity, and Leverage Points for Change | Janice Noga, MEd, Pathfinder Evaluation and Consulting

Abstract: Systems thinking helps us as evaluators to understand the world in all its diversity in ways that are practical, comprehensive, and wise. Inspiring to say, but how do we as evaluators go about accomplishing such a goal? This demonstration session is for those of you who are interested in diving more deeply into the use of the systems iceberg as a tool and template for systems-informed evaluation practice. Using a real-life example drawn from K-12 evaluation, the presenter will step through the process of using the systems iceberg to explore structure and purpose, identify complex interactions and points of emergence, identify levers for change, analyze the results, and tell the story of the program in ways that help clients and stakeholders better understand the complex adaptive nature of the systems in which they are engaged.

Jan Noga is an independent evaluation consultant based in Cincinnati. She holds a bachelor's degree from Stanford University in developmental and counseling psychology with specialization in early and middle childhood and a master's degree from the University of Cincinnati in instructional design and technology. Ms. Noga has worked in the non-profit and public sectors in human services and education for more than 30 years in roles spanning teaching, research, policy, and program planning and evaluation. As a program evaluator, Ms. Noga has planned and conducted both large- and small-scale evaluations and provided organizational consulting and capacity-building support to clients. She has also taught courses and workshops on such topics as systems thinking, research methods and techniques, program planning and development, and survey design and analysis. Ms. Noga is a frequent presenter at the annual meetings of the American Evaluation Association (AEA) and the American Educational Research Association (AERA). She is particularly interested in the use of systems approaches as a foundation for planning, implementation, and evaluation of change efforts in the human service and education arenas.

Mini-Presentations

Driving Change Through Collective Impact Evaluation | Elizabeth Pafford, MA, Sheri Chaney-Jones, MA, Measurement Resources Company

Abstract: Collective impact is garnering a great deal of attention, growing in scale and number globally. It is attractive because it provides social sector organizations with opportunities to draw on each participating organization’s unique strengths and resources to tackle society’s most complex issues. Evaluators play an important role in collective impact. The key tasks of the evaluator are developing a shared measurement framework and using data to frame important conversations and attract resources. Without examining data through an equity lens, the development, implementation, and impact of a collective impact initiative will not be as successful at producing meaningful change in a community. In this session, participants will interact with a shared measurement framework through content presentation and individual application.

Elizabeth provides assessment and consultation services to an array of health and human service agencies, local governments, state departments, universities and coalitions and collective impact groups. Areas of expertise include impact evaluation, mental health treatment, HIV/AIDS prevention and treatment, substance abuse treatment, offender reentry, youth development, arts, homelessness and community engagement.

Elizabeth has a Master of Arts in Public Affairs from the University of Missouri in Columbia, Missouri, and a Bachelor of Arts in Sociocultural Studies from Bethel University in Saint Paul, Minnesota. Elizabeth sees the importance of evidence-based decision making in the social sector and is passionate about helping social sector leaders define and measure success.

With this academic and professional background, Elizabeth has the unique ability to help agencies develop meaningful, actionable strategic plans, benchmark performance against key organizational goals, and demonstrate impact.

Assessing Needs and Measuring Impact in Your Community | Amanda Klein, EdD, Structured Solutions Educational Consulting, LLC

Abstract: So often, practitioners who excel at service provision have not been trained in evaluation methods or data analysis, and they find themselves struggling to measure their own successes and areas of growth. Similarly, many new organizations or programs may know from experience that there are needs to address within communities, but they are unsure of how to “prove” those needs to funders or administrators. Fortunately, many trusted strategies for data collection, analysis, and evaluation are easy to administer and do not require advanced training. Moreover, the same strategies can be used in different ways for different types of evaluations, maximizing their utility. This workshop is designed for practitioners seeking to begin or enhance evaluation efforts in their community or organization to identify needs and determine the impact of interventions.

This workshop will provide participants with practical tips for getting started with assessment and evaluation in their communities. Participants will learn how many of the strategies used to assess community needs, such as surveys and qualitative interviews, can also be used to measure the success or impact of a community-based program. During this workshop, participants will learn how to use community mapping and publicly-available data sources to identify needs and assets in their communities, as well as how to utilize surveying and interviewing as tools to support all of their evaluation initiatives.

Amanda Klein is the owner of Structured Solutions Educational Consulting, LLC, established in 2015. She brings over 10 years of firsthand experience with students, families, school leaders, and district offices to her work supporting schools, districts, and non-profit organizations with engagement. In addition to her experience as a teacher, community school coordinator, and district support staff member, Amanda Klein has spent numerous years building data collection and tracking systems, conducting trainings on data usage in engagement and attendance, and helping schools and organizations incorporate evaluation practices into their work. Through her academic pursuits, Dr. Klein has conducted original quantitative and qualitative research on a variety of issues related to students, teachers, families, and schools. Dr. Klein holds two degrees from Johns Hopkins University and a Doctor of Education in K-12 Leadership and Policy from Vanderbilt University.

Think Tank: OPEG’s Mentoring Initiative | Seema Mahato, MBA, MPA, Ohio University, Deborah Wasserman, PhD, COSI Center for Research and Evaluation, Lana Rucks, PhD, The Rucks Group LLC 

Abstract: The growth of evaluation as a profession depends on a continuous inflow of qualified new evaluators. Beyond training and preparation in graduate school or “accidental opportunities” at work, mentoring by experienced evaluators is an added benefit that new evaluators can gain through membership in local AEA affiliates. OPEG wishes to develop a mentoring program that will provide an enriching professional experience for new evaluators. At the same time, this program will offer additional avenues for experienced professionals to engage with OPEG and to help shape the future of the profession by mentoring new evaluators.

This session is designed as a Think Tank in which the presenters will share initial insights on OPEG's mentoring program (an assimilation of insights gathered from OPEG members), followed by a collaborative discussion on defining the characteristics of the program.

Developing the structure of OPEG’s mentoring initiative is a desired outcome of this session.

Seema Mahato is a program evaluator and analyst in the Office of Instructional Innovation and a Ph.D. candidate in educational studies, both at Ohio University. Her research interests encompass evaluation capacity building, community-embedded research, and social impact assessment.

It's complicated: Evaluating a hybrid collective impact-networked improvement community project | Caitlin Howley, PhD, Johnavae Campbell, PhD, Kimberly Cowley, PhD, ICF International

Abstract: In this presentation, we discuss an approach to evaluating a National Science Foundation-funded network focused on improving the persistence of rural, first-generation college students in science, technology, engineering, and math (STEM) programs in West Virginia. Entitled the First2 Network, this project combines the collaborative change elements of collective impact efforts (such as developing a common vision, establishing partnerships, coordinating communication and building leadership, sharing metrics, and planning for sustainability) with the processes of improvement science (such as using Plan-Do-Study-Act cycles to test improvements at participating universities). To first make sense of, and then evaluate, the First2 Network, our evaluation team focuses on four interrelated levels of activity and outcome: 1) the context in which the project operates, 2) Network structures and processes, 3) outcomes in the systems that Network efforts target (e.g., STEM programs at West Virginia universities, public schools, informal STEM education sites, etc.), and 4) impact on student outcomes. Thus organized, the evaluation then employs a range of data collection methods to investigate each level. These include policy and media scans to keep apprised of contextual dynamics; quarterly surveys about how Network workgroups are using improvement science tools and processes; annual surveys and follow-up interviews about the value of the Network to participants (progressing developmentally from the value of developing new relationships to, ultimately, creating changes in policy and practice); social network analysis; and review of persistence data over time.

Caitlin Howley has more than 20 years of experience providing education research, program evaluation, and technical assistance services to federal, state, local, and foundation clients. She currently directs the Appalachia Regional Comprehensive Center and evaluates K-12 and postsecondary programs for students in the Appalachian region. Howley’s specialties include rural education, STEM education, youth engagement, and organizational capacity building.

Strategies for Overcoming Data Collection Challenges | Maria Green Cohen, MA, JD, Kayla Galloway, PAST Foundation 

Abstract: Formative program evaluation involves ongoing adaptation of strategies for collecting meaningful qualitative and quantitative data, conducting analysis, and presenting a coherent and insightful story of program implementation and outcomes. Working with the project implementation team, evaluators are often presented with significant challenges in coordinating data collection with phases of implementation. In this context, evaluators can provide instrumental support for implementation in ways that can enhance reporting. How do you conduct evaluation in situations where data sources are difficult to capture? Dealing with reluctant informants and identifying and communicating with research participants during phases of research can present challenges when deadlines loom. At the PAST Foundation, we’ve experienced such situations, and as a result we have developed several effective strategies for overcoming data collection challenges in K-12 education to gain rich and robust data. Challenges can intensify with multiple ongoing projects and limited staffing hours. For projects outside of Ohio and with multi-state programs, on-site data collection can be limited by time and funding. During this brief presentation, we will share several case studies in which we’ve creatively expanded our capacity to collect relevant data, producing effective reporting that provides impactful stories for clients to demonstrate the value of their projects.

The PAST Foundation is based in Columbus and works nationally in supporting K-16 transition to STEM Education. Maria Green Cohen, Assistant Director of Research, joined PAST in 2007. A graduate of Oberlin College, Maria received her M.A. in Folklore from the Folklore Institute at Indiana University and a J.D. from Indiana University. Maria joined PAST as an ethnographer for the study of the Metro Early College High School, the first STEM high school in Ohio. She is Knowledge Capture’s Assistant Director of Research, has contributed to more than fifty PAST Foundation research studies and publications, has experience as an attorney, and has worked with City Lore: The New York Center for Urban Folk Culture.

Kayla Galloway, Research Assistant, joined PAST in 2014 as a member of the Knowledge Capture team, PAST’s ethnographic research division, after completing a one-year KC internship as an Ohio State University undergraduate. Kayla received her B.A. in Anthropology with a focus in Archaeology and Cultural Anthropology from The Ohio State University in 2015. Since that time, Kayla has contributed to major research projects evaluating Ohio STEM education programs as well as program implementation in multiple states. Kayla applies her ethnographic skills in creative ways, developing her work to focus on analysis and graphic presentation of evaluation data. She has contributed to dozens of evaluation reports during her career at the PAST Foundation.

Thank you to our Sponsors



"We prepare leader-educators, practitioners, and human service professionals who share our commitment to lifelong learning and serving society responsibly as change agents in meeting diverse human and social needs."

 - www.ohio.edu

“Ohio University’s Voinovich School of Leadership and Public Affairs blends real-world problem solving and government, nonprofit and industry partnerships with education to find research-based solutions to challenges facing communities, the economy and the environment.”

- www.ohio.edu/voinovichschool



“Whether you’re creating sales collateral, corporate identity sets, newsletters, or personalized direct mail, you need a partner you can trust. We have deep experience in both digital and offset printing techniques, as well as design principles and finishing touches. Today, Oregon is the leading print and marketing solutions provider serving businesses in Dayton, Ohio and beyond.”

- www.oregonprinting.com

“We are a research and evaluation firm that gathers, analyzes, and interprets data to enable our clients to measure the impact of their work.”

- www.therucksgroup.com

 INNOVATION through REVELATION



