

Program Evaluation
Embedding Evaluation into Program Design and Development

Second Edition


June 2025 | SAGE Publications, Inc
This text aims to build evaluation capacity by increasing knowledge about evaluation and improving the skills needed to conduct evaluations. The book’s embedded approach uses program theory to understand the relationships between activities and objectives, logic modeling to represent the program’s theory, and an evaluation matrix to structure the evaluation within the program. The approach is systematic and focused on continuous improvement. The Second Edition adds topics suggested by users of the book, incorporates content the author has added to her own classes, and covers areas that have emerged in evaluation since the first edition, such as artificial intelligence and equity in evaluation. A companion website at http://edge.sagepub.com/Giancola2e offers instructor resources, including editable PowerPoint slides and assignments.
 
Preface
Acknowledgments
Digital Resources
About the Author
About the Contributors

Section I: Introduction

Chapter 1: Evaluation Matters
1.1 What Is Evaluation?
1.2 Why Evaluate?
1.3 Values and Standards in Evaluation
1.4 Types of Evaluation
1.5 Internal and External Evaluation
1.6 Embedding Evaluation Into Programs
1.7 Textbook Organization
1.8 Chapter Summary
Key Terms

Chapter 2: History of Evaluation
2.1 The Evolution of Evaluation
2.2 The History of Ethics in Research and Evaluation
2.3 Common Threads and Current Issues in Evaluation
2.4 Chapter Summary
Key Term

Chapter 3: Evaluation Ethics
3.1 Ethics Defined
3.2 Research Ethics Guidelines and Legislation
3.3 IRB Protocols and Amendments
3.4 Ethical Responsibilities of Organizations
3.5 Ethical Responsibilities of Evaluators
3.6 Additional Considerations
3.7 Chapter Summary
Key Terms

Chapter 4: Evaluation Ideologies and Approaches
4.1 Inquiry and Ideology
4.2 Evaluation Ideology
4.3 Evaluation Design
4.4 Evaluation Approach
4.5 Embedded Evaluation
4.6 Chapter Summary
Key Terms

Section II: Embedded Evaluation – Planning and Design

Chapter 5: Define, Part 1
5.1 Embedded Evaluation
5.2 Understanding the Program
5.3 Delineating Goals and Strategies
5.4 Explaining the Program Theory
5.5 Determining Contextual Conditions
5.6 Program Theory and Other Theories
5.7 Considering Alternative Theories
5.8 Chapter Summary
Key Terms

Chapter 6: Define, Part 2
Chapter in Context
6.1 What Is a Logic Model?
6.2 Creating the Logic Model
6.3 Using the Program’s Logic Model
6.4 More on Logic Models
6.5 Logic Model Cautions
6.6 Chapter Summary
Key Terms

Chapter 7: Plan, Part 1
7.1 Creating Evaluation Questions
7.2 Overarching Evaluation Questions
7.3 Embedding Evaluation Questions Into the Logic Model
7.4 Determining What Data to Collect
7.5 Creating the Evaluation Matrix
7.6 Chapter Summary
Key Terms

Chapter 8: Plan, Part 2
8.1 Attribution
8.2 Evaluation Design
8.3 Evaluation Methods and Tools
8.4 Evaluation Matrix: Identifying Data Sources
8.5 Chapter Summary
Key Terms

Section III: Embedded Evaluation – Implementation and Use

Chapter 9: Implement, Part 1
Chapter in Context
9.1 Informed Consent
9.2 Collecting the Data
9.3 Organizing Quantitative Data
9.4 Organizing Qualitative Data
9.5 Special Considerations for Mixed Methods
9.6 Chapter Summary
Key Terms

Chapter 10: Implement, Part 2
Chapter in Context
10.1 Quantitative Data Analysis: Descriptive Statistics
10.2 Quantitative Data Analysis: Inferential Statistics
10.3 Quantitative Data Analysis: Advanced Statistical Methods
10.4 Qualitative Data Analysis
10.5 Mixed Method Integrative Analysis
10.6 Managing the Unexpected and Unintended
10.7 Chapter Summary
Key Terms

Chapter 11: Interpret
11.1 The Home Stretch
11.2 Examining Results
11.3 Interpreting Results
11.4 Communicating Evaluation Results
11.5 Enhancing Reporting and Communication
11.6 Chapter Summary
Key Terms

Chapter 12: Inform and Refine
12.1 Purpose of Evaluation
12.2 Pre-Evaluation: Efforts to Promote Utilization
12.3 During Evaluation: Ongoing Utilization Efforts
12.4 Post-Evaluation and Data Dissemination
12.5 Some Final Thoughts
12.6 Chapter Summary
Key Terms

Section IV: Resources

Chapter 13: Case Study Applications
13.1 LEND Evaluation
13.2 ACCEL Evaluation
13.3 YAP Evaluation

Chapter 14: Logic Model Examples
14.1 Birth-3 Early Intervention Screening Program
14.2 Gender Equity in STEM Academic Professions Program
14.3 Graduate Pipeline to Diversify the STEM Workforce Program
14.4 K–3 Cybersecurity Awareness Program
14.5 Higher Education Cybersecurity Program
14.6 Mental Health Services Program

Appendices: Special Topics

Appendix A: An Integrated MERLA (Monitoring, Evaluation, Research, Learning, and Adapting) Framework for Evidence-Based Program Improvement
Appendix B: Community Needs Assessment Among Latino Families in an Urban Public Housing Development
Appendix C: Leveraging Artificial Intelligence to Advance Implementation Science: Potential Opportunities and Cautions
Appendix D: How Mixed-Methods Research Can Improve the Policy Relevance of Impact Evaluations
Appendix E: Learning, Unlearning, and Sprinkling In: Our Journey with Equitable Evaluation

References



Paperback
ISBN: 9781071918289
£114.00