Evaluation Phases and Processes

The program evaluation process goes through four phases — planning, implementation, completion, and dissemination and reporting — that complement the phases of program development and implementation. Each phase has unique issues, methods, and procedures. In this section, each of the four phases is discussed.

Planning

The relevant questions during evaluation planning and implementation involve determining the feasibility of the evaluation, identifying stakeholders, and specifying short- and long-term goals. For example, does the program have the clarity of objectives or transparency in its methods required for evaluation? What criteria were used to determine the need for the program? Questions asked during evaluation planning also should consider the program’s conceptual framework or underpinnings. For example, does a proposed community-engaged research program draw on “best practices” of other programs, including the characteristics of successful researcher-community partnerships? Is the program gathering information to ensure that it works in the current community context?

Defining and identifying stakeholders is a significant component of the planning stage. Stakeholders are people or organizations that have an interest in or could be affected by the program evaluation. They can be people who are involved in program operations, people who are served or affected by the program, or the primary users of the evaluation. The inclusion of stakeholders in an evaluation not only helps build support for the evaluation but also increases its credibility, provides a participatory approach, and supplies the multiple perspectives of participants and partners (Rossi et al., 2004).

Stakeholders might include community residents, businesses, community-based organizations, schools, policy makers, legislators, politicians, educators, researchers, media, and the public. For example, in the evaluation of a program to increase access to healthy food choices in and near schools, stakeholders could include store merchants, school boards, zoning commissions, parents, and students. Stakeholders constitute an important resource for identifying the questions a program evaluation should consider, selecting the methodology to be used, identifying data sources, interpreting findings, and implementing recommendations (CDC, 1999).

Once stakeholders are identified, a strategy must be created to engage them in all stages of the evaluation. Ideally, this engagement takes place from the beginning of the project or program or, at least, the beginning of the evaluation. The stakeholders should know that they are an important part of the evaluation and will be consulted on an ongoing basis throughout its development and implementation. The relationship between the stakeholders and the evaluators should involve two-way communication, and stakeholders should be comfortable initiating ideas and suggestions. One strategy to engage stakeholders in community programs and evaluations is to establish a community advisory board to oversee programs and evaluation activities in the community. This structure can be established as a resource to draw upon for multiple projects and activities that involve community engagement.

An important consideration when engaging stakeholders in an evaluation, beginning with its planning, is the need to understand and embrace cultural diversity. Recognizing diversity can improve the evaluation and ensure that important constructs and concepts are measured.

Implementation — Formative and Process Evaluation

Evaluation during a program’s implementation may examine whether the program is successfully recruiting and retaining its intended participants, using training materials that meet standards for accuracy and clarity, maintaining its projected timelines, coordinating efficiently with other ongoing programs and activities, and meeting applicable legal standards. Evaluation during program implementation could be used to inform mid-course corrections to program implementation (formative evaluation) or to shed light on implementation processes (process evaluation).

For community-engaged initiatives, formative and process evaluation can include assessment of how partnerships are created, maintained, and developed into successfully functioning collaborations.

Completion — Summative, Outcome, and Impact Evaluation

Following completion of the program, evaluation may examine its immediate outcomes or long-term impact or summarize its overall performance, including, for example, its efficiency and sustainability. A program’s outcome can be defined as “the state of the target population or the social conditions that a program is expected to have changed” (Rossi et al., 2004, p. 204). For example, control of blood glucose was an appropriate program outcome when the efficacy of empowerment-based education of diabetes patients was evaluated (Anderson et al., 2009). In contrast, the number of people who received the empowerment education or any program service would not be considered a program outcome unless participation in and of itself represented a change in behavior or attitude (e.g., participating in a program to treat substance abuse). Similarly, the number of elderly housebound people receiving meals would not be considered a program outcome, but the nutritional benefits of the meals actually consumed for the health of the elderly, as well as improvements in their perceived quality of life, would be appropriate program outcomes (Rossi et al., 2004). Program evaluation also can determine the extent to which a change in an outcome can be attributed to the program. If a partnership is being evaluated, the contributions of that partnership to program outcomes may also be part of the evaluation. The CBPR model presented in Chapter 1 is an example of a model that could be used in evaluating both the process and outcomes of partnership.

Once the positive outcomes of a program are confirmed, subsequent program evaluation may examine the long-term impact the program hopes to have. For example, the outcome of a program designed to increase the skills and retention of health care workers in a medically underserved area would not be represented by the number of providers who participated in the training program, but it could be represented by the proportion of health care workers who remain for one year. Reduction in maternal mortality might constitute the long-term impact that such a program would hope to achieve (Mullan, 2009).

Dissemination and Reporting

To ensure that results are disseminated and reported to all appropriate audiences in a comprehensive and systematic manner, a dissemination plan should be developed during the planning stage of the evaluation. This plan should include guidelines on who will present results, which audiences will receive the results, and who will be included as a coauthor on manuscripts and presentations.

Dissemination of the results of the evaluation requires adequate resources, such as people, time, and money. Finding time to write papers and make presentations may be difficult for community members who have other commitments (Parker et al., 2005). In addition, academics may not be rewarded for nonscientific presentations and may thus be hesitant to spend time on such activities. Additional resources may be needed for the translation of materials to ensure that they are culturally appropriate.

Although the content and format of reporting may vary depending on the audience, the emphasis should be on full disclosure and a balanced assessment so that results can be used to strengthen the program. Dissemination of results may also be used for building capacity among stakeholders.
