Evaluation does not end with the writing of a report. To benefit from the evaluation, it is important that its findings are put to use! This section provides suggestions for how you can use evaluation results and provides advice for how to encourage others to help implement desired program changes.
Related Topics: Participatory Evaluation
The Foellinger-Freimann Botanical Conservatory offers field trip programs to local schools.
After noticing that many school groups came unprepared for their field trips, the Conservatory decided to evaluate the effectiveness of the pre- and post-visit materials that it offers teachers. Through surveys of local teachers and school administrators, the Conservatory learned that teachers did not have time to look over the pre-visit materials and that many teachers and administrators were unaware of the Conservatory’s range of programs.
Based on the evaluation, the Conservatory created pre-visit orientation videos for students and teachers, which were shared with school district libraries. Any teacher planning a field trip was also required to take a personal tour of the Conservatory prior to his or her classroom’s visit. Additionally, the Conservatory gave presentations at school district meetings to increase administrators' awareness of its programs. These changes, among others, resulted in smoother field trip experiences, garnered administrative support for the programs, and fostered positive reaction to the Conservatory’s offerings.
Source: Nate Meyer, Regional Extension Educator, Environmental Science Education, University of Minnesota Extension
The Chesapeake Bay Foundation (CBF) works cooperatively with government, business, and citizens to protect and restore the Chesapeake Bay. CBF seeks to reduce pollution, restore habitat and replenish fish stocks, and educate and engage constituents to take action for the Bay.
In 2001, CBF underwent a comprehensive external evaluation of its environmental education programs aimed at K-12 teachers and students. This comprehensive evaluation allowed CBF to examine the scope of twenty-five years of programming, specify what program outcomes to measure, and determine whether desired accomplishments occurred. Results of the evaluation allowed CBF to match its expected outcomes with program design components, compare specific components of its overall program, and determine what design changes could increase program effectiveness.
Results revealed that some CBF education programs were not working synergistically to create the greatest impact.
The Wonders In Nature - Wonders In Neighborhoods (W.I.N.-W.I.N.) program was developed by Denver Zoo and the Colorado Division of Wildlife. Together with more than thirty partners, they provide EE programs for urban students in the Denver metro area through classroom visits, pre- and post-visit activity curricula, field trips, and family days.
In 2000-2001, a W.I.N.-W.I.N. evaluation system was developed under the guidance of a team of external evaluators. The data gathered through this stakeholder-based evaluation suggested that the program was successful overall. One specific finding, however, was that participants and partners were frustrated with the logistics of registering for and coordinating programs.
In light of these results, W.I.N.-W.I.N. switched from a paper-based registration process to an electronic database. The new system provided automatic coordination among partners, schools and the bus company used for field trips, thereby significantly improving communication and reducing registration and scheduling errors.
Source: Chasta Beals, W.I.N.-W.I.N. Logistics Specialist
Denver Zoo’s W.I.N.-W.I.N. program
The Environmental Learning Center (ELC) in Florida teaches themed lessons to 1st, 3rd, and 4th graders. For example, the 4th grade “Lagoon Days” program uses volunteer instructors to lead students through six learning stations covering concepts including benthic ecology, seining, Florida history, canoeing, adaptations, and birding.
The Lagoon Days program has been extensively evaluated using a pre/post questionnaire for students, a teacher survey, and a volunteer survey. One major finding was that the program was only moderately successful in developing understanding of key concepts that correspond to education benchmarks.
ELC has used the pre/post student results to improve the program, with special emphasis on subjects that meet key education benchmarks. The evaluation results have also informed the center’s volunteer training. Efforts have focused on increasing volunteers’ understanding of the concepts and creating key concept teaching aids for the volunteers to use.
Source: Heather Stapleton, Education Coordinator
The Missouri Environmental Education Association (MEEA) works to advance EE in Missouri through professional development and networking opportunities. The Environmental Education and Training Partnership (EETAP) serves as a leader in delivering environmental education training to education professionals. Grant funds from EETAP allowed MEEA to develop a publicly available, statewide EE resource database of individuals, institutions, and businesses/industry interested in enhancing education and the environment.
To evaluate the MEEA Resource Database, subscribers were surveyed regarding their knowledge of and attitudes toward the database, and their intentions to use it as a means to advertise their resources, programs, and services. While 80% of subscribers agreed or strongly agreed that the database was important for promoting their resources, MEEA felt that there was still more to be learned and that additional information could help it refine the database to make it more useful for both current and potential subscribers.
As part of the resource database evaluation, MEEA made several recommendations to both improve the database and inform future evaluations.
Sharing the results of your evaluation can help other environmental educators and program managers design not only more effective educational experiences but also better evaluations of their own programs. You can offer to share your evaluation reports with other MEERA users by submitting them to Dr. Michaela Zint.
MEERA would also appreciate insights into how you have used EE program evaluations. Please forward these to Dr. Michaela Zint.
If your evaluation process was participatory, make sure you continue to involve stakeholders. For example, they should be involved in the development of a plan for implementing the evaluation’s recommendations (see Tip). If the process was not participatory, it is not too late to try to involve others. You are likely to need their support.
At this point, the usability of your evaluation depends on the evaluation questions you chose and the extent to which you selected a participatory approach. Even now, however, there are things you can do to help ensure that the evaluation will benefit your program:
Finalize and distribute your report promptly. This way you will not miss opportunities to influence important decisions; if you delay distribution, your findings may no longer be relevant when they are shared. There are exceptions, however. For example, it may make sense to share your report just before an annual planning meeting or other important decision-making meeting, when participants will have a concrete reason to pay close attention to the findings (RBFF, 2006).
Be strategic about how you share your results. Communicate your findings directly to those you want to use the information, and do so in ways that will appeal to them. For example, rather than disseminating an evaluation report and hoping it will be read, develop tailored presentations of your results for specific individuals or groups. Provide the information that is most relevant to stakeholders’ priorities, suggest ways you plan to address those priorities, and include specific actions stakeholders can take to help implement the recommendations.
Follow up! After sharing your report or recommended changes with intended users, make sure they have a chance to discuss to what extent and how to best implement them. One way you can help ensure the recommendations are acted upon is by coming up with an implementation plan (see Tip) and indicating how you will help to carry out the plan.
All of the above will help to ensure that your evaluation will be noticed and that recommendations are acted on. Here are some additional ideas to think about to get the most from your evaluation:
Work with stakeholders to develop a plan for implementing the evaluation’s recommendations. This plan should include a prioritized list of actions. Identify who is responsible for implementing the actions and for monitoring that they take place within a specified time period. Identify what barriers may be encountered and ideas for how to overcome them. In addition, track what changes occur and how they occur; both provide further evidence of success.
Evaluate the evaluation! At this stage of the process, take some time to reflect on the evaluation. What went well? What would you do differently if you could do it over? Specifically, what would you do differently to help ensure that recommendations will be acted on? Reflecting on the evaluation and its influence will help to improve future evaluations. For a description of things to consider when evaluating the evaluation, refer to:
Document changes to the organization that may have occurred as a result of the evaluation. Not only can the evaluation’s recommendations, when acted on, help to improve your program, but so can the process of conducting the evaluation. A positive evaluation experience can stimulate improvements in organizational culture, teamwork, and relationships. The evaluation process typically increases participants’ understanding of the program and increases their motivation to help the program succeed. The capacity-building effects of the process can lead participants to develop lasting skills and habits in critical thinking, problem-solving, leadership, and the practice of evaluation itself. Documenting benefits to staff and the organization as a whole will help to justify further investments in evaluation.
If you want to learn more about how to use evaluations and ensure that they are used, review:
Pages 86-87 in Chapter 5, “Create Useful Results from Evaluation Data,” describe how evaluations can be used and make the case for monitoring how evaluations are used. Factors that influence the use of evaluations are also described.
Measuring Progress: An Evaluation Guide for Ecosystem and Community-Based Projects (.pdf)
Ecosystem Management Initiative, University of Michigan, 2004.
Stage D, “How Will You Use the Information in Decision-Making?” explains how to use the evaluation to create an action plan for improving your program. The proposed approach is based on the use of “trigger points”– predetermined values of an indicator that signify the need to take action. This section guides you through the process of establishing trigger points for indicators, determining actions to take when trigger points are reached, and specifying who will be responsible for taking these actions.
Worthen, B. R., Sanders, J. R., & Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines. New York: Longman.
Kellogg Foundation. (2006). Using Evaluation Findings. Downloaded September 25, 2006 from: www.wkkf.org/Default.aspx?tabid=90&CID=281&ItemID=
EMI (Ecosystem Management Initiative). (2004). Measuring Progress: An Evaluation Guide for Ecosystem and Community-Based Projects. School of Natural Resources and Environment, University of Michigan. Downloaded September 20, 2006 from: www.snre.umich.edu/ecomgt/evaluation/templates.htm