Evaluation 101 for Human Rights Organizations: Designing Evaluation Studies

A recording of the webinar and the webinar slideshow are available for download.


This event focused on the key elements of designing a program evaluation. The presenters discussed how to define the purpose and utility of an evaluation, the relationship between evaluation questions and evaluation designs, and their experience designing and implementing evaluations using an array of approaches.

Presenter: Kelly Skeith, Senior Design, Monitoring and Evaluation Specialist, Freedom House
Kelly Skeith is the Senior Design, Monitoring and Evaluation Specialist at Freedom House, where her duties include oversight of the design, monitoring and evaluation of all of the organization’s international human rights, democracy and governance programs. From 2010 to 2015, Skeith worked as an M&E Specialist at Social Impact, most recently as Deputy Director of Social Impact’s Performance Evaluation practice. Skeith provided management and technical oversight for all Social Impact evaluation, assessment, and performance monitoring activities for USAID and the Department of State, and facilitated evaluation trainings for donors and other program staff around the world. Her technical work focused on the use of both qualitative and quantitative approaches to capture output-, outcome-, and impact-level data to improve program and policy design and effectiveness in transitional, fragile, and conflict-affected areas. Skeith has also worked at USAID’s Office of Conflict Management and Mitigation (CMM), where she supported conflict assessments and the development of the Religion, Conflict and Peacebuilding toolkit. She holds an undergraduate degree in international business from James Madison University and a master’s degree in economic and political development from Columbia University.

Moderator: Linda Stern, Director of Monitoring, Evaluation & Learning, National Democratic Institute
Linda Stern is the director of Monitoring, Evaluation & Learning (MEL) at NDI. An applied anthropologist by training, she is an expert in action research methods and participatory evaluation, often acting as lead evaluator on NDI's internal evaluations. As director of a small technical team, Stern provides NDI staff with the capacity to monitor and evaluate their programs, and equips NDI teams with the appropriate tools and guidelines for collecting, analyzing and reporting on programmatic data. Stern is responsible for ensuring that NDI has a strong body of evidence for the quality, effectiveness and impact of its democracy assistance programming. Stern joined NDI in 2007 with over 20 years of experience in the social justice sector, working in the U.S., Latin America, Central & Eastern Europe, Asia and Africa. Before joining NDI, Stern worked as an advocate for the rights of immigrants and political asylum seekers, managed a U.S. congressional demonstration project on community-coalition building for the Centers for Disease Control, and served as Catholic Relief Services' regional advocacy advisor in the Balkans and Caucasus from 1999 to 2004. She has held joint positions at Georgetown University's Center for Social Justice and the Center for New Directions in Learning and Scholarship. Stern currently teaches at George Washington University's Elliott School of International Affairs, where she developed the school's first graduate professional skills course in monitoring and evaluation.

This was the second in a series of four workshops/webinars, titled "Evaluation 101 for Human Rights Organizations: How do we know if we are making a difference?", designed to provide foundational information on four topics:

1. Insights into how change comes about and how to measure it
2. Study designs employing such measures
3. Data collection and analysis, Part I
4. Data collection and analysis, Part II

This workshop/webinar series is a project of the AAAS Science and Human Rights Coalition. The team organizer is Oliver Moles, Ph.D.