Debra Davis, PhD
In my two previous posts, I talked about the first two R’s in the 3 R’s model—RELEVANCE and RESPONSE. Today’s post will focus on the third R—the one with which we seem to have the greatest difficulty—RESULTS. It is the hardest one to write, but also the most critical piece of your impact report. In this section, you will describe the changes that occurred in your participants as a result of their involvement in your program. Specifically, you will want to report on the documented changes in attitudes, knowledge, skills and behavior in your clients. Include the following:
- A description of the evaluation and assessment strategies you used to document the change
- A list of the significant results in order of importance
- A summary of the results/impact
Evaluation Strategies—Answers the question “how do you know this change occurred?” Sample statements might include things like:
- Pre- and post-tests were used to assess knowledge changes regarding fruit and vegetable consumption with 608 5th graders who completed the series.
- The physical activity logs of 587 5th graders were analyzed to determine physical activity patterns.
- A survey was conducted at field days to gather benchmark data regarding soybean production practices using a 20-item instrument.
- A retrospective survey approach with 98 participants was used to measure programmatic impact.
- Ten focus groups were conducted to assess changes in attitude regarding science-related careers in 120 8th grade students.
- Department of Education data was reviewed to compare scores on standardized tests.
Order of Reporting Results—You may have multiple results to include in your impact report. Order them from most to least significant: a documented change in behavior comes before intent to change, and intent to change comes before a change in knowledge, skills or attitudes. Photos and testimonials can also strengthen impact reports. It’s okay to include client satisfaction statements, but satisfaction alone is no longer enough. Our extension programs must go beyond that and move clientele from the point where they are satisfied with our programs to the point where they are making behavior changes because of our programs.
In keeping with the examples we used in the previous two blog posts, here are some examples of what RESULTS statements might include:

Behavior Changes
- 84% of the 608 5th grade students who completed the program tasted at least one new fruit and one new vegetable during the program.
- 559 of 608 participants (92%) increased their physical activity by 30 minutes on 5 of 7 days.
- Approximately three-fourths (76%) of soybean producers surveyed adopted at least two new management practices as a result of the program.
- 61% of 430 program participants scored higher on subsequent standardized tests in science mastery.
Intent to Change Behavior
- Over half (58%) of the students in the program indicated they would choose fresh fruit or vegetables over chips for a snack.
- 255 of 274 respondents (93%) indicated they intended to implement at least one new BMP.
- 30% of those who participated in the SET program indicated they would choose a more science-oriented career.
Knowledge, Skills and Attitudes
- 492 of 608 students (81%) learned how many servings of fruits and vegetables they should eat each day.
- 54% of producers rated their understanding of proper spray conditions as good or excellent before the program; 93% rated their understanding as good or excellent after the program.
- 72% of students reported that they were less afraid of science after the SET program than they were before.
Satisfaction (As a reminder, use these sparingly and push yourself and your program to have impacts that far exceed just satisfaction.)
- 100% of respondents stated they were mostly or completely satisfied that the information was easy to understand.
- 579 of 608 respondents (95.23%) stated they were mostly or completely satisfied with the helpfulness of the information in making decisions about their own situation.
- 90.91% of respondents stated they were mostly or completely satisfied with the timeliness of the information.
- 19 of 22 respondents (86.36%) stated they were mostly or completely satisfied with the completeness of the information given on each topic.
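One practical tip for statements like those above: when a result reports both counts and a percentage, compute the percentage from the counts so the two numbers never disagree (e.g., 579 of 608 is 95.23%, not 95.24%). A minimal sketch of that habit in Python (the helper name and phrasing template are hypothetical, not part of any reporting tool):

```python
def result_statement(n: int, total: int, description: str) -> str:
    """Format an impact result as 'n of total (p%) description',
    deriving the percentage from the counts so they always agree."""
    pct = round(100 * n / total)  # whole-number percent, matching the blog's "559 of 608 (92%)" style
    return f"{n} of {total} ({pct}%) {description}"

# Example drawn from the post's physical-activity result:
print(result_statement(559, 608, "participants increased their physical activity"))
# 559 of 608 (92%) participants increased their physical activity
```

Recomputing rather than hand-typing the percentage is a small step, but it keeps an impact report internally consistent when counts are updated late in the writing process.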
Summarizing your impact report and commenting on future plans in the program area will help to bring the 3 R’s together for the reader. For example, a summary and future plans statement for the childhood obesity nutrition program example I have been using might look something like this:
“Results indicate that the Childhood Obesity Program in X Parish met its objectives of increasing the consumption of fruits and vegetables and increasing physical activity in 5th grade students. The greatest behavior changes were associated with trying new fruits and vegetables and becoming more physically active. As we continue to address the childhood obesity issue in this parish, we now know that this program is effective and can be used with other youth to improve overall physical health and reduce the incidence of obesity-related chronic diseases later in life. This program will be expanded next year to include 5th grade students at four additional schools.”
Again, the level of detail you provide in each of these sections will depend on the needs of the audience for which you are writing and the space allowed. For example, some may want a very detailed description of the results, including numbers and percentages. Others may be satisfied to know that participants increased or decreased a certain behavior. Knowing which level of detail your stakeholders prefer will get easier as you analyze their responses to the impact reports you provide.
In summary, tell the extension story…to anyone who will listen. Using the 3 R’s model—RELEVANCE, RESPONSE, RESULTS—is an easy way to do this.
To learn more about the 3 R’s model and writing impact statements, visit the LSU AgCenter’s Impact Reporting Database or contact ODE.