
RealWorld Evaluation
Working Under Budget, Time, Data, and Political Constraints

Third Edition


July 2019 | 568 pages | SAGE Publications, Inc.
RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints addresses the challenges of conducting program evaluations in real-world contexts, where evaluators and their clients face budget, time, data, and political constraints. The book is organized around the authors’ seven-step model, tested in workshops and practice settings, which helps evaluation implementers and managers make the best choices when faced with real-world constraints. The Third Edition adds a new chapter on gender equality and women’s empowerment and new discussion of digital technology and data science.
 
List of Boxes, Figures, and Tables
 
List of Appendices
 
Foreword by Jim Rugh
 
Preface
 
Acknowledgments
 
About the Authors
 
PART I • THE SEVEN STEPS OF THE REALWORLD EVALUATION APPROACH
 
Chapter 1 • Overview: RealWorld Evaluation and the Contexts in Which It Is Used
1. Welcome to RealWorld Evaluation

 
2. The RealWorld Evaluation Context

 
3. The Four Types of Constraints Addressed by the RealWorld Approach

 
4. Additional Organizational and Administrative Challenges

 
5. The RealWorld Approach to Evaluation Challenges

 
6. Who Uses RealWorld Evaluation, for What Purposes, and When?

 
Summary

 
Further Reading

 
 
Chapter 2 • First Clarify the Purpose: Scoping the Evaluation
1. Stakeholder Expectations of Impact Evaluations

 
2. Understanding Information Needs

 
3. Developing the Program Theory Model

 
4. Identifying the Constraints to Be Addressed by RWE and Determining the Appropriate Evaluation Design

 
5. Developing Designs Suitable for RealWorld Evaluation Conditions

 
Summary

 
Further Reading

 
 
Chapter 3 • Not Enough Money: Addressing Budget Constraints
1. Simplifying the Evaluation Design

 
2. Clarifying Client Information Needs

 
3. Using Existing Data

 
4. Reducing Costs by Reducing Sample Size

 
5. Reducing Costs of Data Collection and Analysis

 
6. Assessing the Feasibility and Utility of Using New Information Technology (NIT) to Reduce the Costs of Data Collection

 
7. Threats to Validity Relating to Budget Constraints

 
Summary

 
Further Reading

 
 
Chapter 4 • Not Enough Time: Addressing Scheduling and Other Time Constraints
1. Similarities and Differences Between Time and Budget Constraints

 
2. Simplifying the Evaluation Design

 
3. Clarifying Client Information Needs and Deadlines

 
4. Using Existing Documentary Data

 
5. Reducing Sample Size

 
6. Rapid Data-Collection Methods

 
7. Reducing Time Pressure on Outside Consultants

 
8. Hiring More Resource People

 
9. Building Outcome Indicators Into Project Records

 
10. New Information Technology for Data Collection and Analysis

 
11. Common Threats to Adequacy and Validity Relating to Time Constraints

 
Summary

 
Further Reading

 
 
Chapter 5 • Critical Information Is Missing or Difficult to Collect: Addressing Data Constraints
1. Data Issues Facing RealWorld Evaluators

 
2. Reconstructing Baseline Data

 
3. Special Issues Reconstructing Baseline Data for Project Populations and Comparison Groups

 
4. Collecting Data on Sensitive Topics or From Difficult-to-Reach Groups

 
5. Common Threats to Adequacy and Validity of an Evaluation Relating to Data Constraints

 
Summary

 
Further Reading

 
 
Chapter 6 • Political Constraints
1. Values, Ethics, and Politics

 
2. Societal Politics and Evaluation

 
3. Stakeholder Politics

 
4. Professional Politics

 
5. Political Issues in the Design Phase

 
6. Political Issues in the Conduct of an Evaluation

 
7. Political Issues in Evaluation Reporting and Use

 
8. Advocacy

 
Summary

 
Further Reading

 
 
Chapter 7 • Strengthening the Evaluation Design and the Validity of the Conclusions
1. Validity in Evaluation

 
2. Factors Affecting Adequacy and Validity

 
3. A Framework for Assessing the Validity and Adequacy of QUANT, QUAL, and Mixed-Method Designs

 
4. Assessing and Addressing Threats to Validity for Quantitative Impact Evaluations

 
5. Assessing Adequacy and Validity for Qualitative Impact Evaluations

 
6. Assessing Validity for Mixed-Method (MM) Evaluations

 
7. Using the Threats-to-Validity Worksheets

 
Summary

 
Further Reading

 
 
Chapter 8 • Making It Useful: Helping Clients and Other Stakeholders Utilize the Evaluation
1. What Do We Mean by Influential Evaluations and Useful Evaluations?

 
2. The Underutilization of Evaluation Studies

 
3. Strategies for Promoting the Utilization of Evaluation Findings and Recommendations

 
Summary

 
Further Reading

 
 
PART II • A REVIEW OF EVALUATION METHODS AND APPROACHES AND THEIR APPLICATION IN REALWORLD EVALUATION: FOR THOSE WHO WOULD LIKE TO DIG DEEPER
 
Chapter 9 • Standards and Ethics
1. Standards of Competence

 
2. Professional Standards

 
3. Ethical Codes of Conduct

 
4. Issues

 
Summary

 
Further Reading

 
 
Chapter 10 • Theory-Based Evaluation and Theory of Change
1. Theory-Based Evaluation (TBE) and Theory of Change (TOC)

 
2. Applications of Program Theory in Program Evaluation

 
3. Using TOC in Program Evaluation

 
4. Designing a Theory of Change Evaluation Framework

 
5. Integrating a Theory of Change Into the Program Management, Monitoring, and Evaluation Cycle

 
6. Program Theory Evaluation and Causality

 
Summary

 
Further Reading

 
 
Chapter 11 • Evaluation Designs: The RWE Strategy for Selecting the Appropriate Evaluation Design to Respond to the Purpose and Context of Each Evaluation
1. Different Approaches to the Classification of Evaluation Designs

 
2. Assessing Causality Attribution and Contribution

 
3. The RWE Approach to the Selection of the Appropriate Impact Evaluation Design

 
4. Tools and Techniques for Strengthening the Basic Evaluation Designs

 
5. Selecting the Best Design for RealWorld Evaluation Scenarios

 
Summary

 
Further Reading

 
 
Chapter 12 • Quantitative Evaluation Methods
1. Quantitative Evaluation Methodologies

 
2. Experimental and Quasi-Experimental Designs

 
3. Strengths and Weaknesses of Quantitative Evaluation Methodologies

 
4. Applications of Quantitative Methodologies in Program Evaluation

 
5. Quantitative Methods for Data Collection

 
6. The Management of Data Collection for Quantitative Studies

 
7. Data Analysis

 
Summary

 
Further Reading

 
 
Chapter 13 • Qualitative Evaluation Methods
1. Design

 
2. Data Collection

 
3. Data Analysis

 
4. Reporting

 
5. Real-World Constraints

 
Summary

 
Further Reading

 
 
Chapter 14 • Mixed-Method Evaluation
1. The Mixed-Method Approach

 
2. Rationale for Mixed-Method Approaches

 
3. Approaches to the Use of Mixed Methods

 
4. Mixed-Method Strategies

 
5. Implementing a Mixed-Method Design

 
6. Using Mixed Methods to Tell a More Compelling Story of What a Program Has Achieved

 
7. Case Studies Illustrating the Use of Mixed Methods

 
Summary

 
Further Reading

 
 
Chapter 15 • Sampling Strategies for RealWorld Evaluation
1. The Importance of Sampling for RealWorld Evaluation

 
2. Purposive Sampling

 
3. Probability (Random) Sampling

 
4. Using Power Analysis and Effect Size for Estimating the Appropriate Sample Size for an Impact Evaluation

 
5. The Contribution of Meta-Analysis

 
6. Sampling Issues for Mixed-Method Evaluations

 
7. Sampling Issues for RealWorld Evaluation

 
Summary

 
Further Reading

 
 
Chapter 16 • Evaluating Complex Projects, Programs, and Policies
1. The Move Toward Complex, Country-Level Development Programming

 
2. Defining Complexity in Development Programs and Evaluations

 
3. A Framework for the Evaluation of Complex Development Programs

 
Summary

 
Further Reading

 
 
Chapter 17 • Gender Evaluation: Integrating Gender Analysis Into Evaluations
1. Why a Gender Focus Is Critical

 
2. Gender Issues in Evaluations

 
3. Designing a Gender Evaluation

 
4. Gender Evaluations With Different Scopes

 
5. The Tools of Gender Evaluation

 
Summary

 
Further Reading

 
 
Chapter 18 • Evaluation in the Age of Big Data
1. Introducing Big Data and Data Science

 
2. Increasing Application of Big Data in the Development Context

 
3. The Tools of Data Science

 
4. Potential Applications of Data Science in Development Evaluation

 
5. Building Bridges Between Data Science and Evaluation

 
Summary

 
Further Reading

 
 
PART III • MANAGING EVALUATIONS
 
Chapter 19 • Managing Evaluations
1. Organizational and Political Issues Affecting the Design, Implementation, and Use of Evaluations

 
2. Planning and Managing the Evaluation

 
3. Institutionalizing Impact Evaluation Systems at the Country and Sector Levels

 
4. Evaluating Capacity Development

 
Summary

 
Further Reading

 
 
Chapter 20 • The Road Ahead
1. Conclusions

 
2. Recommendations

 
 
Glossary of Terms and Acronyms
 
References
 
Author Index
 
Subject Index

Supplements

Instructor Teaching Site
study.sagepub.com/bamberger3e

Password-protected Instructor Resources include the following:
  • Editable, chapter-specific Microsoft® PowerPoint® slides offer complete flexibility in creating a multimedia presentation for your course.
  • Lecture Notes, including Outline and Objectives, which may be used for lectures or as student handouts.
  • Case studies from SAGE Research Methods, accompanied by critical thinking/discussion questions.
  • Tables and figures from the printed book, available in an easily downloadable format for use in papers, handouts, and presentations.
Open-access Student Resources include case studies from SAGE Research Methods, accompanied by critical thinking/discussion questions, and an appendix of more than 250 pages of tables, figures, text, and case studies providing more detailed documentation for most of the chapters.
 

“This book moves the study of evaluation from the theoretical to the practical, so that evaluators can improve their work. It deals with most of the real issues that evaluators face, particularly at the international level.”

John Mathiason
Cornell Institute for Public Affairs

“This is one of the most practical textbooks in the field of evaluation that I have encountered. Its recognition of the limitations that affect program evaluation provides students with a realistic understanding of the difficulties in conducting evaluations and how to overcome these difficulties.”

David C. Powell
California State University, Long Beach

“RealWorld Evaluation moves forward from where other evaluation textbooks stop. RWE challenges the evaluator to ask the difficult questions that can impact the design, implementation, and utilization of the evaluation. RWE then leads the reader through finding efficient solutions to minimize these constraints.”

Karen McDonnell
Milken Institute School of Public Health

“RealWorld Evaluation is a must-read for students of program evaluation; the framework and emphasis on practical constraints make it an invaluable tool for learning the art and science of public policy.”

Amanda Olejarski
West Chester University

“This is an invaluable resource for both novice and experienced evaluators. It contains a variety of tools and recommendations to successfully design and implement effective evaluations for any size and type of program.”

Sebastian Galindo
University of Florida

“Any research class focusing on real-world evaluation should start with this text; it is comprehensive, well-organized, well-written, and thoroughly practical.”

Jeffrey S. Savage
Cornerstone University

An update of a previous edition that was already a standard reference in the field of public policy evaluation, and one that remains a reference manual on the subject. The approach is strongly problem-focused, well organized, and objective in its content, with strong practical utility that never loses its rigorous theoretical framework. A true “all-terrain toolbox” for addressing the practical, real-world challenges of public policy evaluation.

Professor Sérgio Vital Braz Caramelo
Business Administration, Instituto Universitário de Lisboa (ISCTE-IUL)
December 17, 2022

I chose to stick with Reginald York's Social Work Research Methods: Learning by Doing because it is a more hands-on approach to practice and program evaluation than RealWorld Evaluation. The course for which this is needed is a master's level research-based evaluation course. Therefore, I need to integrate actual statistical familiarity (SPSS) alongside learning about evaluation and its techniques, which York's text does nicely.

Dr Christopher Shar
Psych/Soc/Social Work, Angelo State University
October 16, 2019


SAGE Research Methods is a research methods tool created to help researchers, faculty, and students with their research projects. SAGE Research Methods links over 175,000 pages of SAGE’s renowned book, journal, and reference content with truly advanced search and discovery tools. Researchers can explore methods concepts to help them design research projects, understand particular methods or identify a new method, conduct their research, and write up their findings. Since SAGE Research Methods focuses on methodology rather than disciplines, it can be used across the social sciences, health sciences, and more.

With SAGE Research Methods, researchers can explore their chosen method across the depth and breadth of content, expanding or refining their search as needed; read online, print, or email full-text content; utilize suggested related methods and links to related authors from SAGE Research Methods’ robust library and unique features; and even share their own collections of content through Methods Lists. SAGE Research Methods contains content from over 720 books, dictionaries, encyclopedias, and handbooks; the entire “Little Green Book” and “Little Blue Book” series; two Major Works collating a selection of journal articles; and specially commissioned videos.