Many excellent educational programs are delivered, yet participant behavioral change and potential substantive impact often go unevaluated, and the results go unreported. This is a significant opportunity missed by many educators. This presentation will explore one approach to evaluating and reporting program impact in terms of participants’ increased knowledge, their behavioral change, and what I refer to as the “so what” of that behavioral change, or program impact. Program impact is the result of participants acting on, or putting into effect, their new knowledge. As described in Claude F. Bennett’s program evaluation work, there is a seven-step hierarchy for program evaluation, with each step increasing the complexity of the evaluative information gathered. The final step in the hierarchy is program impact. Also discussed will be approaches to gathering both “hard” and “soft” evaluative results. Hard evaluative results include such things as increased income, changes in profit or loss, and changes in indexes such as a pollution index. Soft evaluative results include such things as casual perceptions of change, such as a perceived increase in quality of life or environmental quality. Lastly, good program impact evaluation can be beneficial if reported in a creative manner. Reporting substantive impact results can aid educators in garnering grant funds, rallying political or collaborative support for a program effort or organization, and informing the public of the quality educational programming being done by you, your team, and your organization.
|Conference||2008 National Women in Agriculture Educators Conference|
|Presentation Type||30-Minute Concurrent|