Appendix G
Recommended Resources


The following easy-to-access online resources may be useful in designing an evaluation process.

  • Centers for Disease Control and Prevention, National Center for Injury Prevention and Control, Thompson, N. & McClintock, H. Demonstrating Your Program’s Worth: A Primer on Evaluation for Programs to Prevent Unintentional Injury. (reprint 2000)
    The authors show why evaluation is worth the resources and effort involved, how to conduct a simple evaluation, how to hire and supervise consultants for a complex evaluation, and how to incorporate evaluation activities into the work of the injury prevention program itself. By learning to merge evaluation and program activities, managers will find that evaluation does not take as much time, effort, or money as they expected.
    Available online at http://www.cdc.gov/ncipc. [Accessed online November 17, 2011 at http://www.cdc.gov/ncipc/pub-res/dypw/Index.htm]

 

  • Centers for Disease Control and Prevention, MMWR. Framework for Program Evaluation in Public Health. (1999)
    CDC developed the framework for program evaluation to ensure that amidst the complex transition in public health, we will remain accountable and committed to achieving measurable health outcomes.
    Available online at http://www.cdc.gov/mmwr. [Accessed online November 17, 2011 at http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm]

 

  • Centers for Disease Control and Prevention, Evaluation Team. Data Collection Methods for Program Evaluation: Focus Groups. (2008)
    This CDC Evaluation Brief covers focus groups as a data collection method for evaluation, including a basic overview of focus groups, when to use them, how to plan and conduct them, and their advantages and disadvantages.
    Available online at http://www.cdc.gov. [Accessed online November 17, 2011 at http://www.cdc.gov/healthyyouth/evaluation/pdf/brief13.pdf]

 

  • Centers for Disease Control and Prevention, Office of the Director, Office of Strategy and Innovation. Introduction to Program Evaluation for Public Health Programs. (2006)
    This how-to guide for planning and implementing evaluation activities is based on CDC’s Framework for Program Evaluation in Public Health. It is intended to assist state, local, and community managers and staff of public health programs in planning, designing, implementing, and using the results of comprehensive evaluations in a practical way. This strategy will help assure that evaluations meet the diverse needs of internal and external stakeholders, including assessing and documenting program implementation, outcomes, efficiency, and cost-effectiveness of activities, and taking action based on evaluation results to increase the impact of programs. This version has been customized by the National Center for Infectious Diseases, with examples from appropriate antibiotic use programs, but the principles remain universal.
    Available online at http://www.cdc.gov. [Accessed online November 17, 2011 at http://www.cdc.gov/getsmart/program-planner/downloads/Manual_04062006.pdf]

 

  • Fall Prevention Center of Excellence. Alkema, G. & Liebig, P. Evaluation Basics for Fall Prevention Coalitions and Programs.
    The purpose of this Technical Issue Brief #5 is to provide an overview of evaluation and describe the specific elements necessary for completing an effective evaluation report.
    Available online at http://www.stopfalls.org. [Accessed online November 17, 2011 at http://www.stopfalls.org/grantees_info/files/Brief5-Alkema.pdf]

 

  • Family Health International. Qualitative Research Methods: A Data Collector's Field Guide (2005)
    This how-to guide covers the mechanics of data collection for applied qualitative research. It is appropriate for novice and experienced researchers alike and can be used as both a training tool and a daily reference manual for field team members.
    Available online at http://www.fhi.org. [Accessed online November 17, 2011 at http://www.fhi360.org/en/rh/pubs/booksreports/qrm_datacoll.htm]

 

  • Free Management Library. Basic Guide to Program Evaluation. 
    This document provides guidance on planning and implementing an evaluation process for for-profit or nonprofit programs. Many kinds of evaluations can be applied to programs, e.g., goals-based, process-based, and outcomes-based.
    Available online at http://managementhelp.org. [Accessed online November 17, 2011 at http://managementhelp.org/evaluation/program-evaluation-guide.htm]

 

  • National Council on Aging/Falls Free®. Falls and Fall Related Injuries among Older Adults: A Practical Guide to State Coalition Building to Address a Growing Public Health Issue. Step Nine: Evaluating the Coalition and its Activities. (2007)
    This full report is an online compendium of tools, resources, strategies, and lessons learned in effectively creating and sustaining statewide coalitions to address falls and fall-related injuries and deaths among older adults, organized in nine steps. The focus of Step Nine is evaluation: “Evaluation – the systematic investigation of the merit, worth, or significance of an effort – is an important step in providing feedback about the progress of the coalition.”
    Available online at http://www.coalitions.fallsfree.org. [Accessed online November 17, 2011 at http://www.coalitions.fallsfree.org/index.cfm?pageId=406&page=4]

 

  • National Science Foundation. User Friendly Handbook for Project Evaluation. (2002)
    The Handbook was developed as a basic guide to the evaluation of educational programs. It is aimed at people who are learning what evaluation can do and how to do an evaluation. The Handbook discusses quantitative and qualitative evaluation methods, suggesting ways in which they can be used as complements in an evaluation strategy. Program managers will gain an increased understanding of the evaluation process and the knowledge to help them communicate with evaluators and manage an actual evaluation.
    Available online at http://www.nsf.gov. [Accessed online November 17, 2011 at http://www.nsf.gov/pubs/2002/nsf02057/nsf02057.pdf]

 

  • Robert Wood Johnson Foundation Evaluation Series. Ottoson, J. & Martinez, D. An Ecological Understanding of Evaluation Use: A Case Study of the Active for Life Evaluation. (2010)
    This case study illustrates the parameters of use (e.g., for whom? when? how?), as well as what constituted the use of the evaluation of the Robert Wood Johnson Foundation (RWJF) program Active for Life®: Increasing Physical Activity Levels in Adults Age 50 and Older. RWJF commissioned the University of South Carolina Prevention Research Center to conduct an external process and outcome evaluation of Active for Life from 2003 to 2007.
    Available online at http://www.rwjf.org. [Accessed online November 17, 2011 at http://www.rwjf.org/pr/product.jsp?id=71148&cid=xtw_rwjf]

 

  • San Diego City College Government and Military Education Department. Kirkpatrick Levels of Evaluation. (no date)
    An overview of Donald L. Kirkpatrick’s learning and training evaluation theory, which has become a widely used model for evaluating training and learning. The four levels of the Kirkpatrick evaluation model essentially measure:
    • Level 1 – Reaction: How those who participate in the program react to it; also called customer satisfaction.
    • Level 2 – Learning: The extent to which participants change attitudes, improve knowledge, and/or increase skill.
    • Level 3 – Behavior: The extent to which change in behavior has occurred because the participant attended the training program.
    • Level 4 – Results: The final results that occurred because the participants attended the program.
    Available online at http://www.mysdcc.sdccd.edu. [Accessed online November 17, 2011 at http://www.mysdcc.sdccd.edu/Staff/Instructor_Development/Content/HTML/Donald_Kirkpatrick_Page1.htm]

 

  • University of Kansas. The Community Tool Box. (2011)
    The Community Tool Box is a service of the Work Group for Community Health and Development at the University of Kansas. The Tool Box promotes community health and development by connecting people, ideas, and resources. Its How-to Guidance section (“Table of Contents”) offers 46 chapters of practical, step-by-step guidance in community-building skills. The Tool Box also contains a series of Toolkits (“Do the Work”) for getting a quick start on key activities in community work; among its 16 topics are Creating and Maintaining Coalitions and Partnerships and Evaluating the Initiative.
    Available online at http://ctb.ku.edu. [Accessed online November 17, 2011:
    How-to “Table of Contents” at http://ctb.ku.edu/en/tablecontents/index.aspx;
    Toolkits “Do the Work”: Creating and Maintaining Coalitions and Partnerships at http://ctb.ku.edu/en/dothework/tools_tk_17.aspx; and
    Evaluating the Initiative at http://ctb.ku.edu/en/dothework/tools_tk_12.aspx]

 

  • University of Kentucky – College of Agriculture. Rennekamp, R. A. & Nall, M. A. Using Focus Groups in Program Development and Evaluation. (2010)
    This document provides a seven-page overview of focus groups and a one-page checklist for conducting focus group interviews.
    Available online at http://www.ca.uky.edu. [Accessed online November 17, 2011 at http://www.ca.uky.edu/agpsd/focus.pdf]

 

  • W.K. Kellogg Foundation. Evaluation Handbook. (1998)
    This handbook is guided by the belief that evaluation should be supportive and responsive to projects, rather than become an end in itself. It provides a framework for thinking about evaluation as a relevant and useful program tool. It is written primarily for project directors.
    Available online at http://www.wkkf.org. [Accessed online November 17, 2011 at http://www.ojp.usdoj.gov/BJA/evaluation/links/WK-Kellogg-Foundation.pdf]