
Elaborating Program Impacts through Data Analysis

Glenn D. Israel

The analysis of program impacts involves several phases (Israel 2021). Initially, a preliminary data analysis should be conducted to identify coding errors in the data. The preliminary analysis also should examine the distribution of each variable in the data set in order to assess whether the data are suitable for the planned statistical methods. In the next phase of the analysis, change in the impact indicators is assessed. This phase of the analysis addresses the questions, "Have the outcome variables changed? Have these variables changed in the expected direction?" Although a negative answer to these questions might be interpreted as evidence of the program's failure, other contextual factors may have suppressed or hidden the impact of the program.
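
For readers who carry out these preliminary checks with statistical software, the short sketch below (in Python with the pandas library) illustrates the idea. The data file name and the variable names participated, adopted, and education are hypothetical placeholders rather than part of the original example.

    import pandas as pd

    # Load the evaluation data set (hypothetical file name)
    df = pd.read_csv("survey.csv")

    # Frequency distribution of each variable to spot coding errors
    for col in ["participated", "adopted", "education"]:
        print(df[col].value_counts(dropna=False))

    # Flag values outside the expected codes (assuming 0 = No, 1 = Yes)
    bad = df[~df["participated"].isin([0, 1])]
    print(len(bad), "records with out-of-range participation codes")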

The third phase consists of analyzing the changes. Bivariate statistics, such as correlation coefficients and the chi-square statistic, can be used to answer the question, "Are program variables associated with changes in the outcome variables?" In other words, is participation in a workshop or demonstration related to knowledge gained or to the use of recommended practices? Multivariate statistical methods can then be used to assess how those associations are affected by contextual factors. For example, are changes in people's behavior influenced by participation in an Extension program after accounting for the effects of another organization's program? The utility of multivariate methods lies in further clarifying the relationship between program variables and outcome variables. This paper illustrates how relationships can change when other variables are included in the analysis. A discussion of how evaluation findings can be affected is also included.
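
As an illustration of the bivariate step, the sketch below computes a chi-square test of association between participation and adoption using Python's scipy library; the variable names are the same hypothetical placeholders used above.

    import pandas as pd
    from scipy.stats import chi2_contingency

    df = pd.read_csv("survey.csv")  # hypothetical data file

    # Cross-tabulate participation by adoption and test the association
    table = pd.crosstab(df["participated"], df["adopted"])
    chi2, p, dof, expected = chi2_contingency(table)
    print("chi-square =", round(chi2, 2), " p =", round(p, 3))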

Elaborating Program Impacts

As Patton (1982) notes, one objective of the analysis is to clarify the relationship between program activities (variables) and outcome variables. Rosenberg (1968) and others refer to this process of clarifying relationships as elaboration. When a relationship between a program variable and an outcome variable is observed, the evaluator often is interested in determining how that relationship changes when a contextual variable is introduced. Examination of contextual variables may lead to rejection of the initial interpretation of the program's impact, reinforce the initial interpretation, or lead to its revision (Rosenberg 1968). Quantitative multivariate analysis can increase our confidence in a finding of program impact and specify the conditions under which the program works best. These methods also can help identify ways to improve a program.

The basic method for elaborating the relationship between the program variable and the outcome variable is to stratify by a relevant contextual variable and examine the contingent associations (Rosenberg 1968). This means that the data set is divided into groups (e.g., males and females; low and high education; participants and nonparticipants in a competing program). Then the relationship between the program variable and the outcome variable is examined for each group.
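
A minimal sketch of this stratification step, again assuming the hypothetical variable names participated, adopted, and education, computes the participation-by-adoption percentages separately within each education group:

    import pandas as pd

    df = pd.read_csv("survey.csv")  # hypothetical data file

    # Examine the contingent association within each education level
    for level, group in df.groupby("education"):
        print("\nEducation =", level)
        print(pd.crosstab(group["participated"], group["adopted"],
                          normalize="index"))  # row percentages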

In addition, specifying the time order of the program and contextual variables helps to further clarify the effect of the program on the outcome variable. Does the contextual variable occur prior to the program (i.e., is it antecedent), or does the contextual variable "intervene" between the program and outcome variables? Four types of relationships are shown below to illustrate how initial interpretations can be affected by the process of elaboration.

Spurious Relationship

In this situation, the impact of the program disappears when a contextual variable is added to the analysis. As shown in Table 1a, the initial relationship shows program participants to be more likely to adopt practice Y than nonparticipants (70% vs. 50%, respectively). When an antecedent contextual variable, education, is used to stratify the association, the relationship between participation and adoption disappears at each level of education (Table 1b). The initial relationship between program participation and adoption of practice Y is due solely to the "marginal" relationship of education with both variables. That is, the initial conclusion that participation causes adoption of practice Y must be revised to account for the greater rate of program participation among highly educated persons compared with persons with low education (75% and 25%, respectively).

Spurious relationships are perhaps the greatest threat to the validity of evaluation studies. If the effects of relevant contextual variables are not accounted for, then the evaluation risks drawing incorrect conclusions about program impacts. On the other hand, if the relationship between the program variable and the outcome variable remains unchanged when a contextual variable is included in the analysis, our confidence that the program's impact is real increases.
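
One way to carry out such a check with multivariate methods is to compare a model with and without the contextual variable. The sketch below uses logistic regression from the statsmodels library and the same hypothetical variable names; if the participation coefficient shrinks toward zero once education is added, the bivariate relationship may have been spurious.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey.csv")  # hypothetical data file; adopted coded 0/1

    # Participation effect before and after controlling for education
    bivariate = smf.logit("adopted ~ participated", data=df).fit()
    adjusted = smf.logit("adopted ~ participated + education", data=df).fit()
    print(bivariate.params["participated"], adjusted.params["participated"])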

Antecedent Conditional Relationship

In this situation, the effect of the program is revealed to be greater for one group than another when a contextual variable is included. As shown in Table 2a, the initial relationship shows program participants to be more likely to adopt practice Y than nonparticipants (60% vs. 20%, respectively). When education is used to stratify the association, the effect of the program decreases or disappears for clientele with low education and increases for clientele with high education (Table 2b). There is no difference in the percentage who adopt practice Y between program participants and nonparticipants in the low education group. At the same time, the difference increases from 40 percentage points between participants and nonparticipants in the overall table (Table 2a) to 54.2 percentage points for the high education group in the contingent table (Table 2b). Thus, program impact is limited to conditions of high education in this example. In a conditional relationship, at least one of the partial relationships (subtables) shows that the impact of the program on the outcome variable is maintained or strengthened.
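
A conditional relationship of this kind can also be examined in a regression framework by adding an interaction between participation and education. The sketch below, with the same hypothetical variable names, fits such a model; a sizable interaction term indicates that the program's effect differs by education level.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey.csv")  # hypothetical data file; adopted coded 0/1

    # Interaction term tests whether the program effect depends on education
    model = smf.logit("adopted ~ participated * education", data=df).fit()
    print(model.summary())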

Distorter Relationship

In a distorter relationship, the effect of the program reverses (from negative to positive, or from positive to negative) when a contextual variable is included in the analysis. The initial relationship in Table 3a shows program participants to be less likely to adopt practice Y than nonparticipants (60% vs. 70%, respectively). When education is added to the analysis, the relationship reverses for clientele with high education and for those with low education (Table 3b). For each education group in the expanded table, 5 percentage points more of program participants adopt practice Y than do nonparticipants. The data in Table 3b show that the number participating in the program differed between persons with high and low education. This difference led to the distortion of program impact seen in Table 3a.

Suppressor Relationship

When the impact of the program on the outcome variable initially appears to be absent, the inclusion of a contextual variable can reveal a positive impact. As shown in Table 4a, there is no difference between program participants and nonparticipants in the adoption of practice Y in the initial analysis. When the contextual variable, education, is included, a positive impact of program participation on the adoption of practice Y is found for both levels of education in this example (Table 4b). At each level of education, the adoption rate for program participants is 10 percentage points higher than for nonparticipants. Again, a difference between the number of persons with high education who participate and the number with low education hides the true impact of the program.
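
The arithmetic behind this suppression can be verified directly from the counts in Table 4b: within each education level participants adopt at a higher rate, but because most participants have low education and most nonparticipants have high education, the combined rates in Table 4a come out equal.

    # Adoption rates computed from the Table 4b counts
    high_part, high_nonpart = 16 / 20, 42 / 60   # 80% vs. 70%
    low_part, low_nonpart = 44 / 80, 18 / 40     # 55% vs. 45%

    # Combined (marginal) rates, as reported in Table 4a: 60% vs. 60%
    participants_overall = (16 + 44) / (20 + 80)
    nonparticipants_overall = (42 + 18) / (60 + 40)
    print(participants_overall, nonparticipants_overall)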

Summary

The relationships illustrated in the tables in this paper show how the use of contextual variables in the data analysis can guard against erroneous interpretations of program impact. A key step in elaborating relationships is determining which contextual variables need to be included in the analysis. Selecting the right ones will increase the credibility of conclusions about the presence or absence of program impacts.

The primary contribution of elaborating evaluation data with quantitative, multivariate techniques is to avoid incorrect interpretations and to increase confidence in the validity of the findings. Rossi, Lipsey, and Henry (2019) provide an overview of multivariate techniques that can be used in evaluations.

References

Israel, Glenn D. 2015. Phases of Data Analysis. PEOD-1. Gainesville: University of Florida Institute of Food and Agricultural Sciences.

Patton, Michael Quinn. 1982. Practical Evaluation. Beverly Hills, CA: Sage Publications.

Rosenberg, Morris J. 1968. The Logic of Survey Analysis. New York: Basic Books.

Rossi, Peter H., Mark W. Lipsey, and Gary T. Henry. 2019. Evaluation: A Systematic Approach. 8th ed. Newbury Park, CA: Sage Publications.

 

Table 1a. Bivariate relationship showing program impact.

                    | Adopt Practice Y          |
Program Participant | No (%)      | Yes (%)     | Grand Total
No                  | 50 (50.0%)  | 50 (50.0%)  | 100 (100.0%)
Yes                 | 30 (30.0%)  | 70 (70.0%)  | 100 (100.0%)

Table 1b. Spurious relationship between program participation and adoption, controlling for education.

                    | High Education (Adopt Practice Y)        | Low Education (Adopt Practice Y)         |
Program Participant | No (%)     | Yes (%)    | Total (%)      | No (%)     | Yes (%)    | Total (%)      | Grand Total
No                  | 5 (20.0%)  | 20 (80.0%) | 25 (100.0%)    | 45 (60.0%) | 30 (40.0%) | 75 (100.0%)    | 100 (100.0%)
Yes                 | 15 (20.0%) | 60 (80.0%) | 75 (100.0%)    | 15 (60.0%) | 10 (40.0%) | 25 (100.0%)    | 100 (100.0%)

Table 2a. Bivariate relationship indicates a program impact.

                    | Adopt Practice Y          |
Program Participant | No (%)      | Yes (%)     | Grand Total
No                  | 80 (80.0%)  | 20 (20.0%)  | 100 (100.0%)
Yes                 | 40 (40.0%)  | 60 (60.0%)  | 100 (100.0%)

Table 2b. Conditional relationship between program participation and adoption, controlling for education.

                    | High Education (Adopt Practice Y)        | Low Education (Adopt Practice Y)         |
Program Participant | No (%)     | Yes (%)    | Total (%)      | No (%)     | Yes (%)    | Total (%)      | Grand Total
No                  | 70 (87.5%) | 10 (12.5%) | 80 (100.0%)    | 10 (50.0%) | 10 (50.0%) | 20 (100.0%)    | 100 (100.0%)
Yes                 | 20 (33.3%) | 40 (66.7%) | 60 (100.0%)    | 20 (50.0%) | 20 (50.0%) | 40 (100.0%)    | 100 (100.0%)

Table 3a. Bivariate relationship indicates a negative program impact.

                    | Adopt Practice Y          |
Program Participant | No (%)      | Yes (%)     | Grand Total
No                  | 30 (30.0%)  | 70 (70.0%)  | 100 (100.0%)
Yes                 | 40 (40.0%)  | 60 (60.0%)  | 100 (100.0%)

Table 3b. Distorter relationship between program participation and adoption, controlling for education.

                    | High Education (Adopt Practice Y)        | Low Education (Adopt Practice Y)         |
Program Participant | No (%)     | Yes (%)    | Total (%)      | No (%)     | Yes (%)    | Total (%)      | Grand Total
No                  | 20 (25.0%) | 60 (75.0%) | 80 (100.0%)    | 10 (50.0%) | 10 (50.0%) | 20 (100.0%)    | 100 (100.0%)
Yes                 | 4 (20.0%)  | 16 (80.0%) | 20 (100.0%)    | 36 (45.0%) | 44 (55.0%) | 80 (100.0%)    | 100 (100.0%)

Table 4a. Bivariate relationship indicates no program impact.

                    | Adopt Practice Y          |
Program Participant | No (%)      | Yes (%)     | Grand Total
No                  | 40 (40.0%)  | 60 (60.0%)  | 100 (100.0%)
Yes                 | 40 (40.0%)  | 60 (60.0%)  | 100 (100.0%)

Table 4b. Suppressor relationship between program participation and adoption, controlling for education.

                    | High Education (Adopt Practice Y)        | Low Education (Adopt Practice Y)         |
Program Participant | No (%)     | Yes (%)    | Total (%)      | No (%)     | Yes (%)    | Total (%)      | Grand Total
No                  | 18 (30.0%) | 42 (70.0%) | 60 (100.0%)    | 22 (55.0%) | 18 (45.0%) | 40 (100.0%)    | 100 (100.0%)
Yes                 | 4 (20.0%)  | 16 (80.0%) | 20 (100.0%)    | 36 (45.0%) | 44 (55.0%) | 80 (100.0%)    | 100 (100.0%)

Publication #PEOD3

Release Date: September 13, 2021


About this Publication

This document is PEOD3, one of a series of the Agricultural Education and Communication Department, UF/IFAS Extension. Original publication date September 1992. Revised June 2015, June 2018, and June 2021. Visit the EDIS website at https://edis.ifas.ufl.edu for the currently supported version of this publication.

About the Authors

Glenn D. Israel, professor, Department of Agricultural Education and Communication, and Extension Specialist, Program Development and Evaluation Center; UF/IFAS Extension, Gainesville, FL 32611.
