Evaluation in Today’s World
Table of Contents
Veronica G. Thomas. Evaluation in Today’s World
Praise for the Book
Evaluation in Today’s World
Brief Table of Contents
Detailed Table of Contents
Preface
Who We Are
Contents of the Book and Challenging Assumptions
Organization and Pedagogical Features
Teaching Resources
About the Authors
Acknowledgments
Chapter 1 Evaluations of the Future: Inclusive, Equity-Focused, Useful, and Used
Introduction
An Overview of the Book
Structure of the Book
Chapter Content
An Overview of Evaluation
Definitions of Evaluation
Evaluation Characteristics
Activity: Defining Evaluation
Evaluative Thinking
Activity: Applying Evaluative Thinking
Identify and Challenge Assumptions & Assertions
Seek Out Blind Spots
Capture Musings & Learning Questions
Race, Racism, Social Justice, and a Racialized Perspective
Other Social Justice Issues
Objectivity and Bias. Objectivity
Bias. Explicit Bias
Reflect and Discuss: My Biases
Implicit Bias
Activity: Take the Implicit Association Test (Optional)
Voices From the Field
Reducing Bias
Activity: Reflections on Working With White People and Antiracism
Culture, Cultural Competence, and Cultural Responsiveness
The Impact of Politics
Case Studies: Evaluation Results vs. Politicians. 21st Century Community Learning Centers
Project DARE (Drug Abuse Resistance Education)
The Current Climate
Activity: Validating “Truth”
Summary
Supplemental Resources. Bruner Foundation: Effectiveness Initiatives
Equitable Evaluation Initiative
Evaluation Resource Hub: Evaluative Thinking
National Parenting Education Network: Introduction to Evaluation
Chapter 2 Evaluation Ethics and Quality Standards
Introduction
Reflect and Discuss: What Ethics Means to You
A Brief Historical Perspective on Research Ethics
The Nuremberg Code of 1947
The Tuskegee Syphilis Study of 1932–1972
Reflect and Discuss: The Tuskegee Timeline
The Study Begins
What Went Wrong?
The Study Ends and Reparation Begins
Guiding Questions
The Radiation Studies of 1940–1960
The HeLa Story: 1950s and Beyond
Beyond Medical Studies and Physical Harm: The Milgram Study of 1963
Reflect and Discuss: Ethical Considerations and Authority Figures
The National Research Act of 1974
The Continuing Importance of Research Ethics
Ethics in Evaluation
Case Study: Identifying Hidden Agendas and Ethical Land Mines
The Case of the Sensitive Survey
Sources of Ethical Thinking
Reflect and Discuss: Evaluator Sources for Ethical Thinking
Case Study: Moving Beyond Past Experience
Cultural Competence as an Ethical Imperative
Reflect and Discuss: Self-Exploration
Ethical Dimension of Racial Bias
Case Study: Raising the Issue of Racism in Evaluation of a Program
Mathison’s Description of the Case
Ethical Sensitivity and Dilemmas
Activity: Ethical “Blind Spots” in Evaluation
Case Study: The Compromised Evaluator?
Sources of Ethical Dilemmas
Handling Ethical Dilemmas
Case Study: Revising the Evaluation Report
Scenario
What is the ethical course of action?
Reflect and Discuss: You Didn’t Hear It From Me!
The Situations
Questions for Discussion
Ethics and Conflicts of Interest
Ethical Challenges and Dilemmas Across the Evaluation Process
Case Study: Ethical Challenges Commonly Reported by Evaluators. Entry/Contracting Phase
Designing the Evaluation Phase
Data Collection Phase
Data Analysis and Interpretation Phase
Communication of Results Phase
Utilization of Results Phase
Ethical Principles and Standards for Evaluators and Evaluations
The Evaluators’ Ethical Guiding Principles
Case Study: Application of the Evaluators’ Ethical Guiding Principles
The Program Evaluation Standards
Voices From the Field
Evaluation Corruptibility and Fallacies
Evaluator Role, Power, Politics, and Ethics
Interplay of Politics and Ethics
Case Studies: Political Power Plays in Evaluation With Ethical Ramifications. Political Power Plays Engaged in by Evaluatees
Political Power Plays Engaged in by Evaluators
Political Power Plays Engaged in by Other Stakeholders
Summary
Supplemental Resources. Practical Strategies for Culturally Competent Evaluation
The Belmont Report
“Human Subjects”
Protection of Human Subjects in Research
Human Subjects Research (HSR)—CITI Program
Web Links to Ethical Principles and Quality Standards
Descriptions of Images and Figures
Chapter 3 Historical Evolution of Program Evaluation Through a Social Justice Lens
Introduction
History of Evaluation Through a Social Justice Lens
Evaluation Prior to Modern Times of the 20th Century
Intersection Between Education and Evaluation Pre–20th Century
Early Social Experiments
Overview of Evaluation in the 20th Century
Evaluation in the First Half of the 20th Century: 1900–1950s
Evaluation During the New Deal, Wartime, and Economic Growth: 1930s–1950s
The Cambridge-Somerville Youth Program Evaluation
Case Study: Example of Experimental Evaluation in the 1930s: The Cambridge-Somerville Youth Study
Program Description
Method and Procedures
Selected Results
Reflect and Discuss
Sputnik’s Impact on the Growth of Evaluation
Prominent Influencers and Users of Evaluation During the 20th-Century Early Years: 1930s–1950s
Kurt Lewin
Alva and Gunnar Myrdal
Ralph W. Tyler
Hidden Figures and Histories in Early-20th-Century Evaluation
Voices From the Field
Ambrose Caliver
Reid E. Jackson
Rose Butler Browne
Aaron A. Brown
Leander L. Boykin
Journal of Negro Education and Founding Editor Charles H. Thompson
Reflect and Discuss: Pre–Brown v. Board of Education (1954) Evaluation Publications in the Journal of Negro Education
Evaluation in 1960–2000
Federal Legislation and Great Society Programs
The Professionalization of the Field
Growth of Evaluation Scholarship
Establishment of Professional Societies in Evaluation
Graduate Training and Professional Development in Evaluation
Establishment of Standards and Codes of Conduct
Methodological Approaches and Paradigm Wars
Two Influential Scholars’ Contributions to Methodological Approaches of the 1960s–1970s
Donald T. Campbell
Lee J. Cronbach
Rethinking the Role of Evaluation
Reflect and Discuss: The Perry Preschool Project
Brainstorming Questions
Influential Women in Evaluation: 1970s–1990s
Carol H. Weiss
Yvonna S. Lincoln
Eleanor Chelimsky
Floraline I. Stevens
Lois-Ellin Datta
Laura Leviton
Beatriz Chu Clewell
Influential 20th-Century Evaluator: An Activity
Activity: Research an Influential Person or Event
21st-Century Evaluation: Expanding the Focus
Strengthening Evaluation at the Federal Level
Case Study: Evaluation Activities of the Centers for Disease Control and Prevention
Shift in the Quantitative–Qualitative Debate
Increased Emphasis on Social Justice and Diversity
Support for Capacity Building
Summary
Supplemental Resources. Major Evaluation Journals
CDC Evaluation Resources
University Programs in Evaluation
Descriptions of Images and Figures
Chapter 4 Evaluation Paradigms, Theories, and Models
Introduction
The Value of Scientific Paradigms and Theories in Evaluation
The Nature of Scientific Paradigms
Reflect and Discuss: Paradigms Shaping Your Everyday Worldview
Theories for Guiding and Improving Evaluation Practice
Social Science Paradigms and Theories. Social Science Paradigms
Application of Social Science Paradigms in Evaluations
Case Study: Evaluating the Same Project Using Different Social Science Paradigms. Project Description
Positivism/Postpositivism
Interpretivism/Social Constructivism
Pragmatism
Critical Theory
Activity: Applying Paradigms to Evaluation Study
Project Rationale and Overview
Social Science Theories
Activity: Interface Among Social Science Theory, Social Programming, and Evaluation
Program Theory of Change
Activity: School Violence Reduction Program Theory of Change
Evaluating Program Theory
Evaluation Theories, Models, and Approaches
Why Should We Care About Evaluation Theory?
Voices From the Field
Distinguishing Evaluation Theories, Models, and Approaches
Classifying Evaluation Approaches and Theories
Five-Category Classification
Activity: Categorizing Evaluation Studies
The Evaluation Tree
Voices From the Field
Mertens and Wilson’s Four-Branch Tree of Evaluation Approaches
Evaluation Theories Within a Cultural Context
Evaluation Approaches and Theories: A Summary Description of Selected Examples
Summary
Supplemental Resources. Theory of Change: A Practical Tool for Action, Results, and Learning
Using Theory of Change to Design and Evaluate Public Health Interventions: A Systematic Review
Theory of Change
Descriptions of Images and Figures
Chapter 5 Social Justice and Evaluation: Theories, Challenges, Frameworks, and Paradigms
Introduction
Social Justice
Definitions of Social Justice
Reflect and Discuss: Defining Social Justice
Marginalized Groups
Impacts of Marginalization
Fixing the Group vs. Fixing the System
Activity: Finding My Privilege and Oppression
Theories Providing Context for Social Justice Evaluations
Critical Race Theory
Feminist Research and Theory
Case Study: Being Heard
Queer Theory
Disability Theory
Case Study: The Jerry Lewis Muscular Dystrophy Telethon
Reflect and Discuss: Medical Condition or Culture or Both
Race, Racism, and Evaluation
Challenges to Social Justice and Evaluation. Traditional Definitions of Rigor
Deficit Models
Cultural Conflict of Interest
Efforts to Reduce the Impact of Racism on Evaluation
Cultural Competence and Cultural Responsiveness
Reflect and Discuss: What Would You Do?
Evaluation Models and Social Justice
Social Justice–Oriented Evaluation Frameworks and Paradigms
Transformational Evaluation
Voices From the Field
Reflect and Discuss: Applying a Transformational Paradigm
Empowerment Evaluation
Case Studies: Empowerment Evaluation Around the World
Feminist Evaluation
Participatory Evaluation
Case Study: Students as Evaluators
Deliberative Democratic Evaluation
Collaborative Evaluation
Equity-Focused Evaluation
Case Study: Equity-Focused Evaluation Case Studies
Voices From the Field
Summary
Supplemental Resources. Resources for Culturally Responsive Evaluation
EvalIndigenous
BetterEvaluation
Social Justice Theory Resources. Critical Race Theory
Feminist Theory
Queer Theory
Disability Theory
Descriptions of Images and Figures
Chapter 6 Evaluation Types With a Cultural and Racial Equity Lens
Introduction
Classifying Evaluations
An Overview of Formative and Summative Classification
Reflect and Discuss: Optimal Circumstances for a Formative vs. Summative Evaluation
Distinguishing and Coupling Formative and Summative Evaluations
Other Evaluation Classifications
Different Types of Evaluations
Voices From the Field
Formative and Implementation Evaluations
Needs Assessments
Case Study: Example of a Needs Assessment Statement of Purpose
Delivery System Reform Incentive Program Brief Description
Purpose of Needs Assessment
Evaluability Assessments
Reflect and Discuss: Six-Step Process for Conducting an Evaluability Assessment
Questions for Discussion
Process Evaluations
Case Study: Process Evaluation of Teen Pregnancy and Parenting Program
Progress Evaluations
Activity: Planning a Formative Evaluation
Questions for Discussion
Summative, Outcome, and Impact Evaluation Types
Outcome Evaluations
Reflect and Discuss: Outcome Evaluation of Teen Pregnancy and Parenting Program
Case Study: Strong Through Every Mile Program Examples of Short-Term, Intermediate, and Long-Term Results
Impact Evaluations
Efficiency Evaluations
Reflect and Discuss: Importance of Detecting Cost Benefits
The Cost of Saving Lives: An Example
Question for Discussion
Case Study: CDC Cost-Effectiveness Evaluations. Intervention Is More Effective and Less Costly
Childhood Vaccination Program Example
Intervention Is More Effective and More Costly
Sexually Transmitted Diseases Treatment Example
Alternative Types of Evaluations
Rapid Evaluations
Case Study: Rapid Evaluations
Rapid Evaluation Example I
Rapid Evaluation Example II
Activity: Rapid Evaluation and You
Metaevaluations
Developmental Evaluations: Another Alternative to Formative–Summative
Case Study: Developmental Evaluation of Leadership Program
Putting It All Together
Activity: Putting It All Together: What Would You Do?
Summary
Supplemental Resources. W. K. Kellogg Foundation Step-by-Step Guide to Evaluation: How to Become Savvy Evaluation Consumers
Evaluability Assessment Checklist
Descriptions of Images and Figures
Chapter 7 Social Programming, Social Justice, and Evaluation
Introduction
Understanding Social Problems and Social Programs Through a Social Justice Lens. Wicked Problems
Case Study: Global Food Insecurity as a Wicked Problem
Activity: Wicked Problems You See
Social Problems: Definition, Description, and Theoretical Underpinnings
Objective Element of Social Problems
Subjective Element of Social Problems
Sources of Social Problems
Social Problems’ Fluid Nature
Reflect and Discuss: Major Social Problems in Your Community
Social Programs Through a Social Justice and Transformative Lens
Reflect and Discuss: Social Programming and Evaluation Through a Social Justice and Transformative Lens
Voices From the Field
Power, Political, and Economic Nature of Social Problems
Activity: Research the Literature: Who Suffers and Who Benefits
Equity-Based Social Programming
Structural Racism, Social Programming, and Evaluation
Integrating Program Planning and Evaluation Planning
Social Program Evaluations vs. Social Project Evaluations: Distinctions and Implications
Key Program/Project Components Every Evaluator Must Understand
Program Mission
Program Goals
Program Objectives
Activity: Making Objectives SMART
Program Activities
Program Resources
Putting It All Together: Program Mission, Goals, Objectives, Activities, and Resources
Case Study: The Components of a Sample Project
Mission Statement
Goal Statements
SMART Objective Statements
Activities
Resources
Activity: Design Your Own Project
Logic Models: Linking Program Components
Benefits of Logic Models for Program Planning
Limitations of Logic Models
Logic Models and Evaluation Planning
Components of Logic Models
Types and Looks of Logic Models
Nested Logic Models
Activity: Develop Your Own Draft Logic Model
Hypothetical Project
Beyond Traditional Linear Logic Models. Fuzzy Logic Models
Circular Logic Models
Culturally Relevant Logic Models
Afrocentric-Centered Logic Model Approaches
Logic Models From an Indigenous Framework
Summary
Supplemental Resources. Writing SMART Objectives
Readings on Wicked Problems
Logic Model Resources. Enhancing Program Performance With Logic Models
CDC Evaluation Documents, Workbooks, and Tools: Logic Models Series
Developing a Logic Model or Theory of Change
Logic Models: Templates, Examples, and Bibliography
Descriptions of Images and Figures
Chapter 8 Responsive Stakeholder Engagement and Democratization of the Evaluation Process
Introduction
Who Are Stakeholders?
Valuing Stakeholders and Diverse Stakeholder Engagement
Identifying and Classifying the Right Stakeholders
Stakeholder Classifications
Case Study: Classifying Stakeholders
Real Context and Hypothetical Project Description
Activities
Key and Hidden Stakeholders
Reflect and Discuss: Hidden Stakeholders Speak Out
Democratizing the Evaluation Process With Stakeholders
Case Study of Inclusion, Democratic Participation, and Social Justice in Evaluation
Inclusion
Democratic Participation
Social Justice
Relationships, Values, and Stakeholder Engagement
Voices From the Field
Responsive Stakeholder Engagement
The Misuse of Responsive Stakeholder Engagement
Continuum of Stakeholder Engagement: From Nonresponsive to Responsive
Barriers to Responsive Stakeholder Engagement
Case Study: Handling Stakeholder Conflict
The Situation
Activity: Identify Potential Stakeholder Conflict
REACT Description
Benefits of Responsive Stakeholder Engagement
Case Study: Salud America! The Messiness and Value of Stakeholder Engagement
Six-Step Process for Responsive Stakeholder Engagement
Step I: Prepare for Diverse Stakeholder Engagement
Step II: Identify the Full Range of Diverse Stakeholders
Step III: Identify and Prioritize Key Stakeholders
Step IV: Determine Stakeholders’ Values, Needs, Motivations, Interests, and Concerns
Step V: Determine Where Key Stakeholders Fall on the Stakeholder Continuum
Step VI: Create a Process for Ongoing Stakeholder Engagement
Communicating With Stakeholders
Activity: Developing a Stakeholder Engagement Communications Plan
Summary
Supplemental Resources. Identifying and Analyzing Stakeholders and Their Interests
The Value of Engaging Stakeholders in Planning and Implementing Evaluations
The Role of Stakeholders in Educational Evaluation
Stakeholder Involvement in Evaluation: Three Decades of the American Journal of Evaluation
Chapter 9 Planning the Evaluation
Introduction
Dealing With Power Imbalances During Evaluation Planning
Reflect and Discuss: Hidden and Invisible Sources of Power in Our Own Lives
Issues for Reflection and Discussion
Planning for Culturally Responsive and Social Justice–Oriented Evaluations
Voices From the Field
Reflect and Discuss: Hypothetical STEM Project on Indigenous American Reservations
Evaluation Planning Activities
Identifying and Involving Stakeholders in Evaluation Planning
Identifying Stakeholders and Their Potential Role(s)
Involving Stakeholders
Case Study: Talent Development Evaluation Framework: Making Community Connections and Involving Stakeholders in Evaluation Planning
The Case for Planning Urban School Evaluations: A Brief Description
Analysis of the Context
Identifying and Clarifying Project Goals
Reflect and Discuss: Working With Stakeholders to Clarify Project Goals
The Situation
Identifying the Purpose(s) of the Evaluation
Varying Stakeholders’ Perspectives on Evaluation Goals and Priorities
Case Study: Gaining Consensus on Evaluation Purpose Through Engaging Stakeholders
Funder’s Priorities
Project Administrators’ Priorities
Evaluator’s Priorities
Defining Success in Evaluation Planning
The Problem With “Parity” as Success Definition
Beyond Quantitative Definitions of Success
Activity: Defining Success
Identifying Indicators
Understanding Different Types of Indicators
Reflect and Discuss: Social Justice–Related Outcome Indicators
Levels of Indicators
Case Study: Broadening Participation Indicators at Various Levels
Developing Timelines
Identifying Resource Needs
Assembling an Evaluation Team
Identifying the Evaluation Team’s Roles and Responsibilities
Internal vs. External Evaluators
Evaluation Planning and Management Visualization Tools
Gantt Charts
Case Study: Simple Gantt Chart for Planning an Evaluation of a Professional Development (PD) Project
Program Evaluation Review Technique (PERT) Charts
Time and Task Charts or Data Maps
Developing a Written Evaluation Plan
Overcoming Pitfalls in Evaluation Planning
Summary
Supplemental Resources. Community Tool Box
Gantt Project
Project Chart Software for Both Gantt and PERT Charts
Descriptions of Images and Figures
Chapter 10 Evaluation Questions That Matter
Introduction
Reflect and Discuss: Three Critical Functions of Evaluation Questions
Questions for Discussion
Why Evaluation Questions That Matter?
Questions That Matter Meet Information Needs of Diverse Users
Questions That Matter Set the Stage for the Collection of Credible Evidence
Case Study: What Is Credible Evidence?
Questions for Discussion
Power and Privilege Issues in Formulating Evaluation Questions
Characteristics of Good Evaluation Questions: An Overview
Good Questions Align With the Funder’s Requirements
Good Questions Are Useful and Ask About Important Issues
Good Questions Are Tailored and Appropriate to Local Needs
Good Questions Are Clear, Specific, and Well Defined
Good Questions Are Researchable (or Answerable)
Good Questions Are Realistic Considering Contexts and Project Realities
Good Questions Are Reasonable in Number and Scope
Sources of Evaluation Questions
Prioritizing Evaluation Questions for Diverse Audiences
Inclusion/Exclusion Criteria for Prioritizing Evaluation Questions: Two Approaches
U.S. Agency for International Development (USAID) Approach
Centers for Disease Control and Prevention (CDC) Approach
Steps to Identifying, Formulating, and Prioritizing Questions That Matter
Voices From the Field
Activity: Generating and Prioritizing Evaluation Questions
Types of Evaluation Questions
Program Theory Questions
Context Questions
Process Questions
Relevance Questions
Outcomes Questions
Impact Questions
Case Study: FACES Sample Process, Outcomes, and Impact Questions
Sample Process Questions
Sample Outcomes Questions
Sample Impacts Questions
Cost-Benefit and Cost-Effectiveness Questions
Sustainability Questions
Activity: Generating Different Types of Evaluation Questions
Summary of Different Types of Evaluation Questions
Summary
Supplemental Resources. Evaluation Questions Checklist for Program Evaluation
Checklist for Assessing Your Evaluation Questions
Chapter 11 Selecting Appropriate Evaluation Designs
Introduction
Rigor
Bias
Practical Considerations
Theoretical and Cultural Considerations
Reflect and Discuss: What Counts as Evidence?
Voices From the Field
Control and Comparison Groups
Case Study: The Impact of Comparison Groups on Data Interpretation
Ethical Issues
Ethical Use of Control and Comparison Groups
Case Study: Using Control and Comparison Groups
Activity: Justifying Using Control or Comparison Groups
Longitudinal Data
Activity: Using U.S. Census Bureau Data
Activity: Selecting Comparison Data
Evaluation Designs
Experimental Designs
Feasibility of Implementation
Case Study: Opening Doors
Quasi-experimental Designs
Feasibility of Implementation
Pretest/Posttest Designs
Feasibility of Implementation
Retrospective Pretest Designs
Feasibility of Implementation
Case Study: An Accidental Design
Case Studies/Ethnography
Case Study: An Exemplary Example
Feasibility of Implementation
Rival Hypotheses and Threats to Validity
The Best Design for the Question
Activity: Choosing Designs
Summary
Supplemental Resources. “How to Solve U.S. Social Problems When Most Rigorous Program Evaluations Find Disappointing Effects”
Research Methods Knowledge Base: Design
Research Ready: Qualitative Approaches
StudentTracker for Outreach
Chapter 12 Defining, Collecting, and Managing Data
Introduction
Qualitative and Quantitative Data
Sources of Qualitative Data
Interviews
Focus Groups
Observations
Participant-Generated Visual Data
Sources of Quantitative Data. Surveys and Other Structured Questionnaires
Records and Other Archival Data
Strengths and Weaknesses of Qualitative and Quantitative Data
Activity: What Data Would You Collect?
Ensuring Data Quality. Validity
Voices From the Field
Types of Validity
Voices From the Field
Reliability. Reliability of Quantitative Data
Reliability of Qualitative Data
Pilot-Testing
Response Rates
Protection of Human Participants
Activity: Exempt, Expedited, Full Review, or Not Research?
Using Existing Measures or Developing New Ones
Sources of Measures
Assessing Existing Measures
Measuring Complex Concepts
Activity: You Can’t Measure Love, or Can You?
Modes of Data Collection
Activity: Making It Better: Improving Data Collection Techniques
Data Management. Mapping Data Collection to Project Goals and Objectives
Activity: Mapping the Data
Timing
Electronic Controls and Data Cleaning
Privacy
Data Management Plans
Summary
Supplemental Resources. Overview: Data Collection and Analysis Methods in Impact Evaluation
People or Systems? To Blame Is Human. The Fix Is to Engineer
Informed Consent in Human Subjects Research
Code of Federal Regulations Title 45 Public Welfare Department of Health and Human Services Part 46 Protection of Human Subjects
Chapter 13 The Best Analysis for the Data
Introduction
Deductive and Inductive Reasoning. An Overview of Deductive and Inductive Reasoning
Case Study: Generating Hypotheses
Using Deductive and Inductive Reasoning
Activity: Using Inductive or Deductive Reasoning
Quantitative Analysis. Levels of Quantitative Data
Activity: Determining Levels of Data
Activity: Are Attitude Scales Ordinal or Interval Data?
Descriptive Statistics
Case Study: Summarizing Data
Inferential Statistics and Statistical Significance
Parametric and Nonparametric Statistics
Case Study: An Accidental Design: The Analysis
Effect Size
Activity: Analyzing Group and Individual Differences
Decision Error and Statistical Power
Hypothesis Testing
Difference-Based and Relationship-Based Analysis
Case Study: A Simple Social Network Analysis
Disaggregating Data
Activity: Who Am I?
Activity: Disaggregating Data
Qualitative Analysis
Voices From the Field
Sources of Qualitative Data
Coding and Codebooks
Activity: Applying Codes and Coding
Activity: Developing and Using Codes
Sample Qualitative Analysis Models
Activity: Following Williams’s Four Steps
Case Study: A Comprehensive Qualitative Coding Process
Summary
Supplemental Resources. Computational Handbook of Statistics (3rd ed.)
Choosing the Correct Statistical Test in SAS, STATA, SPSS, and R
Developing and Using a Codebook for the Analysis of Interview Data: An Example From a Professional Development Research Project
How to Lie With Statistics
Open Coding
Descriptions of Images and Figures
Chapter 14 Reporting, Disseminating, and Utilizing Evaluation Results
Introduction
Reporting Results
The Full Evaluation Report
Other Reporting Mechanisms. Stand-Alone Summaries
A One-Page Bullet Point Summary
A Summary of Conclusions and Recommendations
Case Study: A Summary of Conclusions and Recommendations. Conclusion
Supporting Data
Feedback Reports
Oral Presentations
Sharing of Raw Data
Developing High-Quality, Accessible Reports and Presentations
Readability
Activity: Testing Readability
Activity: Trying Translation Software
Words Matter
Reflect and Discuss: Hot-Button Words
Images Matter
Reflect and Discuss: Who Likes What and Why
Visually Representing Data
Tables
Activity: Using Word and Excel to Make Tables
Reflect and Discuss: Interpreting Different Tables
Figures
Reflect and Discuss: What Are Your Chart Preferences?
Reflect and Discuss: Comparing Scatter Plots and Descriptive Statistics
Dashboards
Dissemination. Why Disseminate
Dissemination Plans
Reflect and Discuss: Sharing Information
Using Websites and Social Media
Using Mainstream Media
Keep It Simple
Keep It Interesting
Creative Dissemination Modes
Working With Others
Reflect and Discuss: Effective Dissemination Strategies
Using Evaluation Results
Voices From the Field
Reflect and Discuss: Using Evaluation
Case Study: Evaluation Can Make a Difference
Summary
Supplemental Resources. Tableau Public
Potent Presentations Initiative: p2i Tools and Guidelines
AEA Data Visualization and Reporting: Websites and Tools
Gapminder
Edward Tufte
Descriptions of Images and Figures
Chapter 15 Evaluation as a Business
Introduction
Perspectives on Doing Evaluation as a Business
Voices From the Field
Ethics
Conflicts of Interest
Protection of Human Participants
Nondiscrimination
Cultural Respect
Activity
Business Knowledge and Skills. Marketing
Activity
Case Study
Your Rights and Responsibilities as an Evaluation Client
Activity
Preparing a Proposal
Activity
Making a Budget
Activity
Contracts
Contract Components
Intellectual Property
Making a Business Financially Viable
Selecting a Business Entity
Employee
Sole Proprietor
Partner in a Partnership
Limited Liability Company
Corporation/S Corporation
Nonprofit Organization
Activity
Bookkeeping and Record Keeping
Developing a Business Plan
Activity
Summary
Supplemental Resources. AEA Graduate Student and New Evaluators Topical Interest Group (TIG)
AEA Independent Consulting TIG
Small Business Administration (SBA)
System for Award Management (SAM)
Chapter 16 Interconnections and Practical Implications
Introduction
Objectivity and Bias
Impacts of Bias on Evaluation
Voices From the Field
Acknowledging Subjectivity and Reducing Bias
Activity
Case Study
Reflect and Discuss
Building Cultural Competence
Activity
Personalizing a Social Justice Perspective
Reflect and Discuss
Reflective Practice and Evaluative Thinking
Activity
Applying to Practice
Making Biases Explicit
Activity
Infusing Cultural Responsiveness in the Involvement/Engagement of Stakeholders and the Development of Evaluation Questions
Infusing Cultural Responsiveness in Decisions About Evaluation Designs
Infusing Cultural Responsiveness in Decisions About Data Collection and Analysis. Data Collection
Reflect and Discuss
Data Analysis
Infusing Cultural Responsiveness in Decisions About Reports and Presentations
Social Justice Evaluation
Politics and Evaluation
Voices From the Field: Advice for New Evaluators
Activity
A Final Thought
Supplemental Resources. Evaluative Thinking: Strategies for Reflective Thinking in Your Organization
Practical Strategies for Culturally Competent Evaluation
Beyond Rigor: Improving Evaluations With Diverse Populations
Appendix A: American Evaluation Association Evaluators’ Ethical Guiding Principles. A: Systematic Inquiry: Evaluators conduct data-based inquiries that are thorough, methodical, and contextually relevant
B: Competence: Evaluators provide skilled professional services to stakeholders
C: Integrity: Evaluators behave with honesty and transparency in order to ensure the integrity of the evaluation
D: Respect for People: Evaluators honor the dignity, well-being, and self-worth of individuals and acknowledge the influence of culture within and across groups
E: Common Good and Equity: Evaluators strive to contribute to the common good and advancement of an equitable and just society
Appendix B: Joint Committee on Standards for Educational Evaluation Program Evaluation Standards. Utility Standards
Feasibility Standards
Propriety Standards
Accuracy Standards
Evaluation Accountability Standards
Glossary of Terms
References
Index
Excerpt From the Book
Using evaluation in today’s world is ultimately about humanity’s future. The stakes are huge. Evaluation use flows from quality. Quality in today’s world requires diversity. Whether evaluation contributes to a more just, equitable, and sustainable world will be determined by how—and how well—evaluators and stakeholders together meet the challenge to respect, promote, and improve quality, diversity, and use. Our collective future rests on those foundational pillars. This book is an outstanding contribution to the evaluation literature.
—Michael Quinn Patton, Utilization-Focused Evaluation
.....
There can be issues with doing things on faith. Even with the best of intentions, the results of those efforts can be neutral or even negative. For example, one evaluation that I, coauthor Campbell, did found that doing hands-on science activities created by teachers or by after-school leaders caused students to become more stereotyped and limited in their opinions of who could do science. A second program to encourage women to continue on in their engineering programs reinforced rather than overcame stereotypes, with some women in the program coming to feel that it existed because women weren’t as good as men in engineering. As one participant explained: “[Engineering] theory is easier for boys. That is why they put us together [in the special program]” (Campbell & Hoey, 2000, p. 20). No one wants to hear negative outcomes, but problems can’t be fixed unless they are found and acknowledged.
Of great concern is the onset of a so-called post-truth era, complete with alternative facts, disdain for expertise, and a diminishing reliance on facts and analytic thinking in public life (Hamburg, 2019, p. 563). Evaluators, and indeed people in general, are living and working in a time when terms like alternative facts, post-facts, and post-truth are used regularly. As indicated earlier in the chapter, alternative facts have traditionally been defined as falsehoods or the “opposite of reality.” Now, however, some are viewing alternative facts as merely a different perspective (Zimmer, 2017). Post-fact and post-truth both refer to an environment in which objective facts are a thing of the past. In a post-fact society, facts are viewed as irrelevant, and emotional appeals are used to influence public opinion. In 2016, the Oxford University Press named post-truth as its word of the year, defining it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief” (para. 2). The idea that “truth no longer mattered. Facts were not just unimportant, but barriers to be smashed through with rhetoric” (Hollo, 2017, para. 3) is increasingly characteristic of today’s world. Evaluators need to explore how such constructs impact evaluation work. “Citizens are increasingly asserting their values, hopes and opinions without apparent interest in finding a shared understanding of the actual state of things. Without such a shared understanding, those values and hopes cannot rationally be expressed and realized. Observers speak of ‘truth decay,’ dismissal of expertise, and neglect of evidence” (Hoit, 2019, p. 433).
.....