Successful Grantsmanship: Infusing Evaluation Best Practices into Proposals

Glenn Israel, Jaclyn D. Kropp, David C. Diehl, Conner Mullally, and Sebastian Galindo

Introduction

Proposals with sound, adequately funded evaluation plans are typically stronger and have a greater chance of being funded. The evaluation plan, in all its variations, is a critical component of every grant proposal. Notices of Funding Opportunities, Requests for Proposals, and Requests for Applications (NOFOs, RFPs, and RFAs, respectively) include detailed instructions about the evaluation plan expected in submitted proposals, as well as the focus and quality expected of that plan. Within the main federal funding agencies targeted by UF/IFAS researchers, several programs include a specialized evaluation review panel as part of the regular review process. The evaluation components are sometimes reviewed first, and proposals that do not meet the expectations for this component are triaged without further review. Given the importance of the evaluation plan, funding agencies have made valuable resources available to enhance the quality of this component for their programs. Examples include USAID's Evaluation Toolkit (https://usaidlearninglab.org/evaluation-toolkit), the National Science Foundation's (NSF) User-Friendly Handbook for Project Evaluation (https://www.purdue.edu/research/docs/pdf/2010NSFuser-friendlyhandbookforprojectevaluation.pdf), and the CDC's Evaluation Resources (https://www.cdc.gov/eval/resources/index.htm). A few agencies also have specialized evaluation resources for some of their programs (e.g., NSF's Alliance for Graduate Education and the Professoriate; https://www.nsfagep.org/evaluation-resources/).

This publication shares what we learned during a series of meetings with federal agency program officers and evaluators about best practices for grant proposals. The practices fall into two broad categories: 1) incorporating evaluation expertise into the project team and 2) building a sound project rationale and evaluation plan. By adopting these practices, you can enhance the quality of your proposals, increase the likelihood of securing extramural funding, and elevate the visibility and impact of programs within your organization.

Part 1: Add evaluation expertise to the project team

One dominant recommendation from program officers and evaluators was to involve people with evaluation expertise in developing the project proposal, from the start of the process through its completion. Including evaluation expertise on the project team can lead to more clearly articulated program outcomes, refined logic models, and well-designed evaluation instruments and data collection protocols. All too often, evaluators are engaged near the end of a project and must work with whatever data are available, which makes rigorous evaluation difficult, if not impossible. Based on this recommendation and our own experience, we offer three guidelines.

Guideline #1: Recruit an evaluation champion to your proposal writing team

It is important to include a person with expertise or experience in evaluation on the proposal writing team. Doing so helps ensure that evaluation is seamlessly integrated into the project from the beginning. More importantly, it brings evaluative thinking into team discussions about the project design. This person will also be skilled in drafting a logic model or a theory of change and in specifying objectives and outcomes (as noted by an evaluator with USDA Planning, Accountability, and Reporting Staff [USDA-PARS]). In addition, this is a good time to connect with an external evaluator if one is expected or required (as suggested by a program officer at NSF). Contacting the UF/IFAS Strategic Collaborative for Engagement and Evaluation (SCENE) can help identify a skilled evaluator who meets your project's needs.

Guideline #2: Recruit local talent in the project's service area for evaluation work

When a project is planned for another country or outside the institution's normal operating area, having people who understand the local culture and possess the technical skills needed for the evaluation can be critical to preventing problems and ensuring that useful data are collected for monitoring and evaluating the project (as mentioned by a program officer at USAID). Partnering with local universities or NGOs is one strategy for recruiting personnel with the needed skills.

Guideline #3: Include evaluation expertise on the external advisory committee

Many large-scale projects, research centers, and institutes have external advisory committees (EACs). The usual practice is to recruit experts with subject-matter expertise related to the project's key objectives or aims. Including a member with evaluation expertise on the EAC brings additional evaluative thinking to the review and advising process. For example, the University of Florida Clinical and Translational Science Institute's EAC has included an evaluator from a similar institute. This external evaluator shares information about practices at the other institute and helps focus attention on outcomes and impacts.

Part 2: Build a sound project rationale and evaluation plan

Our discussions with agency program officers and evaluators led to six guidelines for writing the project proposal.

Guideline #4: Details matter—follow evaluation guidelines in solicitations

Although this guideline may seem simple, our discussions with agency professionals revealed that many proposals fail to follow all of the requirements in funding solicitations, both for the budget and for the elements required in the body of the proposal. We recommend designating a team member, grant specialist, or evaluation champion to be responsible for checking that the proposal meets all of the guidelines.

Guideline #5: Draft a logic model to clarify the design for the research, education, and extension/outreach components

Many, but not all, solicitations for proposals require a logic model. There are a few different types of logic models, but the main benefit of creating one is that a model can help clarify thinking about what a project's outcomes are, how a project should operate, and how operations connect with specific outcomes (see Israel, 2001; 2010; Kellogg Foundation, 2004). When a proposal team creates a logic model diagram, the give-and-take discussion can help everyone get on the same page. The discussions can also reveal gaps in the project design or critical assumptions that could threaten the project's intended outcomes. One additional benefit of creating a logic model is that it can succinctly communicate the major components of the project to the proposal's reviewers. The logic model also provides an outline for drafting the proposal narrative and thus should be completed early in the proposal process. Finally, developing a logic model lays the foundation for further elaborating the project's theory of change (Valters, 2014) or program theory (Rossi, Lipsey, & Freeman, 2004; Wholey, Hatry, & Newcomer, 2010) when a funded proposal is translated into actionable steps.
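
To illustrate (this is a hypothetical example, not drawn from any specific solicitation), a logic model for an Extension-style nutrition education project might list: inputs (faculty time, grant funds, community partners); activities (developing a curriculum, delivering workshops); outputs (number of workshops held, number of participants reached); short-term outcomes (knowledge and skills gained by participants); medium-term outcomes (changes in dietary behavior); and long-term outcomes (reduced rates of diet-related illness in the community). Reading across the columns makes explicit the assumption that each set of activities is expected to produce the outcomes to its right, which is exactly the chain of reasoning reviewers look for.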

Guideline #6: Emphasize outcomes over outputs

Nearly every agency program officer and evaluator we met with reported that more attention is needed to identifying and measuring project outcomes, while less effort might be given to outputs. Recall that outputs are the activities conducted and products created by project personnel, while outcomes are the changes that occur among people and conditions outside the project (e.g., new knowledge gained by participants in training programs, changes in behaviors, adoption of new technologies, improvements in social, economic, or environmental conditions). While outputs can be measured using project monitoring data, outcomes can only be measured using a credible evaluation design. Solicitation guidelines or the funding agency itself may provide guidance as to what constitutes a credible design or which approaches are preferred over others.

For example, program personnel at NIOSH told us they wanted more data on behavior change and on the adoption of regulations or policies (i.e., intermediate outcomes), not just reach and information use. Similarly, those at USAID said there was too much focus on outputs and not enough on outcomes, while USDA-PARS personnel said to show why the project matters (for example, does it unlock new knowledge to address an issue, or does it affect US agriculture and the broader economy?). Finally, program officers at NSF said they want to see a focus on the discovery of new knowledge rather than on the activities to be completed.

Guideline #7: Address the funding agency's key metrics

Because agencies must be accountable and report back to their stakeholders, proposals that provide information on the metrics agencies need are more likely to be funded. For example, USAID is mandated to address gender disparities and environmental impacts. If a proposal to USAID includes plans to address and measure outcomes related to gender disparities and/or environmental conditions, along with other key aspects, it is likely to be reviewed more favorably. Likewise, proposals that emphasize evidence-based practice and comparative effectiveness research align with NIH priorities for its research portfolio, which are then reported to its stakeholders.

Guideline #8: Emphasize data quality and use in monitoring and evaluation plans

Having accurate data is critical to making informed decisions, so designing and implementing a sound monitoring plan can help project directors manage activities. When monitoring information is integrated into day-to-day project procedures, the resulting data help drive decisions on adjustments that subsequently improve impacts (as mentioned by personnel at USAID). In short, collecting monitoring information facilitates adaptive management.

A second aspect of data quality involves the degree of rigor in the design used to assess outcomes. More rigorous designs, such as randomized controlled trials (RCTs), are favored by several agencies (e.g., USAID and DOE-IES). In addition to using a rigorous design, having an adequate sample size is important for demonstrating a well-designed evaluation plan. Although program officers and evaluators emphasized the importance of quantitative data for measuring impacts, several at USAID noted that qualitative and mixed methods provide a useful complement, or alternative design, for understanding how and why a project did (or did not) generate specific outcomes.
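
As a rough illustration of why sample size matters (the numbers below are a generic statistical example, not agency guidance): detecting a difference of one-half of a standard deviation between a treatment group and a comparison group, with 80% power and a 5% significance level in a two-sided test, requires roughly 64 participants per group, while detecting a difference of one-quarter of a standard deviation requires roughly 250 per group. Including a power calculation of this kind in the evaluation plan signals to reviewers that the proposed sample can actually support the claimed analysis.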

Guideline #9: Budget adequate resources for evaluation

Credible, rigorous evaluation is not free or easy, so it is important to plan an adequate budget for the work. Program staff at IFAS indicated that 10% of the non-personnel budget was an appropriate amount to allocate for impact evaluation, and the Food for Progress program is one example of a program that expects rigorous evaluation and requires a minimum of 3% of the budget for that component. For complex programs using a contribution analysis methodology to assess impact, 1–2 persons would need to be budgeted for 12 months (per evaluators at NIOSH). In general, we recommend that project budgets allocate, on average, 10% of direct costs for evaluation work; depending on the funder's guidance and the complexity and intensity of the work, this percentage could range from 5% to 15%.
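
As a hypothetical illustration of this guidance (the dollar figures are invented for the example): for a project with $400,000 in direct costs, the 10% rule of thumb would set aside $40,000 for evaluation, and the 5%–15% range would correspond to $20,000–$60,000. Checking the evaluation line items against a target like this early in budget development helps avoid discovering, after the rest of the budget is fixed, that the evaluation plan cannot be carried out as written.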

Summary

This publication suggests a number of guidelines for preparing successful funding proposals. Broadly speaking, these involve adding evaluation expertise to the project team and building a sound evaluation plan within the project's overall design. The latter includes creating a logic model focused on project outcomes, using rigorous evaluation designs and measurement, and having an adequate budget.

References

Israel, G. D. (2001). Using Logic Models for Program Development. AEC360. Gainesville: University of Florida Institute of Food and Agricultural Sciences. Available at: https://edis.ifas.ufl.edu/wc041

Israel, G. D. (2010). Logic Model Basics. WC106. Gainesville: University of Florida Institute of Food and Agricultural Sciences. Available at: https://edis.ifas.ufl.edu/wc106

Kellogg Foundation. (2004). Logic model development guide. Available at: https://wkkf.issuelab.org/resource/logic-model-development-guide.html

Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage.

Valters, C. (2014). Theories of change in international development: Communication, learning, or accountability? The Justice and Security Research Programme (JSRP) - Paper 17, London: LSE.

Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2010). Handbook of practical program evaluation (3rd ed.). San Francisco: Jossey-Bass.

Peer Reviewed

Publication #AEC687

Release Date: September 12, 2023



About this Publication

This document is AEC687, one of a series of the Department of Agricultural Education and Communication, UF/IFAS Extension. Original publication date January 2020. Revised September 2023. Visit the EDIS website at https://edis.ifas.ufl.edu for the currently supported version of this publication.

About the Authors

Glenn Israel, professor and graduate coordinator, Department of Agricultural Education and Communication; Jaclyn D. Kropp, associate professor, Food and Resource Economics Department; David C. Diehl, associate professor, Department of Family, Youth and Community Sciences; Conner Mullally, assistant professor, Food and Resource Economics Department; and Sebastian Galindo, research assistant professor, Department of Agricultural Education and Communication; UF/IFAS Extension, Gainesville, FL 32611.
