Publication #AEC394

The Savvy Survey #4: Details in the Design¹

Glenn D. Israel and Jessica L. Gouldthorpe²

As part of the Savvy Survey Series, this publication provides an overview of important facets of the survey process. A survey's ability to gather accurate and useful information for assessing program needs or evaluating program outcomes is greatly influenced by its design, so careful attention to detail is essential. Topics covered include modes for collecting responses, strategies for contacting clients and personalizing contacts, and tips for using incentives.

Modes for Collecting Responses

With the expansion of web access and the development of new communication technologies, Extension faculty now have more options than ever before for conducting surveys. Surveys can be conducted in a variety of situations, but all fall into two basic types: interviewer-administered and self-administered.

Interviewer-Administered Surveys

Interviewer-administered surveys include:

  • Face-to-face interviews with clients to collect in-depth evaluation information or with key informants to identify local needs and assets

  • Telephone interviews with clients or a sample of a population

Interviewer-administered surveys give the interviewer the advantage of being able to clarify questions and check the respondents' understanding. On the other hand, interviewers can intentionally or unintentionally influence respondents and introduce bias into the collected data (see Loosveldt 2008 for more information).

Self-Administered Surveys

Self-administered surveys include:

  • Paper and pen questionnaires distributed to clients attending a program to obtain feedback and gauge customer satisfaction

  • Paper and pen questionnaires mailed to a sample of a population in a county, region, or state (see, for example, Gaul, Hochmuth, Israel, and Treadwell 2009; Israel, Easton, and Knox 1999)

  • Online surveys to capture information on knowledge, attitudes, or behaviors (see Lamm, Israel, and Diehl 2013)

  • Mixed-mode surveys using email and mail to contact clients about knowledge, attitudes, or behaviors (see Israel 2010; 2011; 2013)

Self-administered surveys cost less than interviewer-administered surveys because they require less time and labor. In addition, web and mixed-mode surveys can be less expensive than mail surveys because they avoid postage costs. On the other hand, some clients do not have access to the web or choose not to use it; therefore, web-only surveys should be avoided unless there is universal access (see Israel 2010). For follow-up surveys with clients, universal access would be indicated by 90–95% of clients having provided an email address. As discussed in The Savvy Survey #1: Introduction, coverage error becomes a bigger concern as a larger proportion of the target audience is excluded from the survey.
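To make that rule of thumb concrete, the following minimal sketch (not part of the original publication) computes the share of a client list with an email address on file before a web-only survey is chosen; the client records and field names are hypothetical.

```python
# A minimal sketch (not from this publication) of checking email coverage
# before committing to a web-only survey. Client records are hypothetical.

def email_coverage(clients):
    """Return the fraction of clients with an email address on file."""
    with_email = sum(1 for c in clients if c.get("email"))
    return with_email / len(clients)

clients = [
    {"name": "Joe Client", "email": "joe@example.com"},
    {"name": "Ann Client", "email": ""},
    {"name": "Lee Client", "email": "lee@example.com"},
]

coverage = email_coverage(clients)
if coverage >= 0.90:  # the 90-95% guideline discussed above
    print(f"Coverage is {coverage:.0%}: a web-only survey may be defensible.")
else:
    print(f"Coverage is {coverage:.0%}: use mail or a mixed-mode design.")
```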

Self-administered surveys rely on the ability of respondents to interpret questions correctly without the help of an interviewer, which makes it especially important to construct these surveys carefully. Detailed information about writing items for a questionnaire and formatting a questionnaire to aid in respondent completion is available in publications #6a–e and #7 of The Savvy Survey series. Furthermore, additional considerations for using the various survey modes can be found in the following Savvy Survey publications:

  • The Savvy Survey #10: In-person and Group-administered Surveys

  • The Savvy Survey #11: Mail-based Surveys

  • The Savvy Survey #12: Telephone Surveys

  • The Savvy Survey #13: Online Surveys

  • The Savvy Survey #14: Mixed-mode Surveys

Contact Procedures

One of the principles for conducting a high-quality survey and getting useful data is to make multiple contacts. For a telephone survey, it is common to make 6 to 8 calls to the same number, and some important surveys (such as those conducted by the Bureau of Labor Statistics) make 30 or more calls in attempting to complete a survey (Sangster and Meekins 2004). Mail and mixed-mode surveys typically use 4 or 5 contacts (see Dillman, Smyth, and Christian 2009). Regardless of which survey mode is selected, multiple contacts have been shown to increase the number of completed questionnaires.

For each survey mode, the contact procedures are tailored to the situation, as recommended by the Tailored Design Survey Method (Dillman et al. 2009). Table 1 shows a typical contact procedure by survey mode. For face-to-face interviews, contact to set up an interview is by telephone call (typically one or two calls). In comparison, a telephone survey will start with an initial call and then include 5 or more calls at different days/times to reach those who did not complete the survey.

For mail surveys, a pre-letter is often used to alert potential respondents to the survey, followed by the survey packet a few days later. Online surveys often include the link to the survey in the initial contact because of the ephemeral nature of email messages. Because the lifespan of email messages is short, reminders for online surveys should be sent more frequently than those for mail surveys (e.g., after 4 or 5 days as opposed to a couple of weeks). Finally, the most effective mixed-mode procedures emphasize responding via the web first and then by mail (Israel 2013; Messer and Dillman 2011; Millar and Dillman 2011).
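To illustrate that timing guidance, here is a minimal sketch (not part of the original publication) that generates reminder dates from a launch date, assuming fixed intervals of 5 days for email and two weeks for mail; the launch date and interval values are illustrative.

```python
# A minimal sketch (not from this publication) of scheduling reminders.
# Email reminders are spaced about 5 days apart; mail reminders about two
# weeks apart, per the guidance above. The launch date is hypothetical.
from datetime import date, timedelta

def reminder_schedule(launch, mode, total_contacts):
    """Return the dates of the reminder contacts after the initial one."""
    interval = timedelta(days=5) if mode == "email" else timedelta(weeks=2)
    return [launch + interval * i for i in range(1, total_contacts)]

launch = date(2013, 10, 1)
print("Email reminders:", reminder_schedule(launch, "email", 4))
print("Mail reminders: ", reminder_schedule(launch, "mail", 4))
```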

Personalizing the Survey Process

Research shows that personalization has a small but significant effect on increasing response rates (Dillman et al. 2009) because it helps connect the respondent to the survey. Personalize the process for each respondent by:

  • Using the client’s name in contact messages (e.g., Dear Joe Client), or using a group name with which clients identify (e.g., Dear Jackson County Cattlemen)

  • Individually signing letters in blue ink for postal contacts (which shows the importance of the survey to the respondent)

  • Using logos, pictures, or graphics on the questionnaire that are tailored to the targeted group to increase the salience of the survey

Adding individual names and other information to contact messages is fairly easy and fast. Spreadsheet files containing names and addresses can be used with word processing applications to perform a mail merge that produces a personalized letter or email message for each client (see The Savvy Survey #11: Mail-based Surveys and The Savvy Survey #13: Online Surveys for examples).
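For readers who prefer a scripted merge, here is a minimal sketch (not part of the original publication) that fills a salutation template from spreadsheet-style rows; the sample names, addresses, and survey URL are hypothetical.

```python
# A minimal sketch (not from this publication) of a mail-merge step:
# filling a message template from rows of names and email addresses.
# The sample data, column names, and survey URL are hypothetical.
import csv
import io
from string import Template

# Stand-in for a real spreadsheet exported as CSV (e.g., a client list).
data = io.StringIO("name,email\n"
                   "Joe Client,joe@example.com\n"
                   "Ann Client,ann@example.com\n")

message = Template("Dear $name,\n\n"
                   "You are invited to take our follow-up survey: $url\n")

for row in csv.DictReader(data):
    print(f"To: {row['email']}")
    print(message.substitute(name=row["name"],
                             url="https://example.com/survey"))
```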

Incentives

Besides making multiple contacts, offering an incentive does more than any other tactic to increase the number of completed surveys. Monetary incentives of $1 to $5 delivered with the request to complete the questionnaire are the most effective because they build trust and invoke a norm of reciprocity (Dillman et al. 2009). When they are feasible, monetary incentives can increase response rates by 9 to 20% for surveys of Extension audiences (Israel, Wilson, and Haller 2013; Wilcox, Giuliano, and Israel 2010). Unfortunately, monetary incentives are often unavailable or impractical for Extension faculty. Nonmonetary incentives, such as bookmarks, pamphlets, and fact sheets, have also been used, but these tokens of appreciation are less effective.
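To see what those figures imply, the following back-of-the-envelope sketch (not part of the original publication) applies the cited range to a hypothetical mailing, treating the 9–20% increase as an additive gain in the response rate (an assumption).

```python
# A minimal sketch (not from this publication): rough arithmetic for the
# effect of a cash incentive. The sample size and base response rate are
# hypothetical; the 9-20% figures come from the studies cited above and
# are treated here as additive percentage-point gains (an assumption).

sample_size = 300
base_rate = 0.40  # assumed response rate without an incentive

baseline = round(sample_size * base_rate)
for gain in (0.09, 0.20):  # low and high ends of the cited range
    completes = round(sample_size * (base_rate + gain))
    print(f"+{gain:.0%} gain: about {completes} completed surveys "
          f"(vs. {baseline} without an incentive)")
```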

In Summary

This publication in the Savvy Survey Series has introduced modes for collecting responses to a survey, described the importance of tailoring the contact process to the survey situation while making multiple contacts to maximize the number of completed surveys, and reviewed techniques for personalizing a survey. It also provided information about whether to use an incentive with the survey. Attending to these details of the survey process can significantly impact the amount and quality of data collected and, in turn, its usefulness in assessing needs or evaluating program outcomes.

References

Dillman, D. A., J. D. Smyth, and L. M. Christian. 2009. Internet, mail, and mixed-mode surveys: The tailored design method. (3rd ed.). Hoboken, NJ: John Wiley and Sons.

Gaul, S. A., R. C. Hochmuth, G. D. Israel, and D. Treadwell. 2009. Characteristics of small farm operators in Florida: Economics, demographics, and preferred information channels and sources. WC088, 7 pp. Gainesville: University of Florida Institute of Food and Agricultural Sciences. Available at: http://edis.ifas.ufl.edu/wc088.

Israel, G. D., J. O. Easton, and G. W. Knox. 1999. Adoption of landscape management practices by Florida citizens. HortTechnology, 9(2), 262–266.

Israel, G. D. 2010. Using Web-hosted surveys to obtain responses from Extension clients: A cautionary tale. Journal of Extension [on-line], 48(4), Article 4FEA8. Available at: http://www.joe.org/joe/2010august/a8.php.

Israel, G. D. 2011. Strategies for obtaining survey responses from Extension clients: Exploring the role of e-mail requests. Journal of Extension [on-line], 49(2), Article 3FEA7. Available at: http://www.joe.org/joe/2011june/a7.php.

Israel, G. D. 2013. Combining mail and e-mail contacts to facilitate participation in mixed-mode surveys. Social Science Computer Review, 31(3), 346–358. doi: 10.1177/0894439312464942. Available at: http://ssc.sagepub.com/content/early/2012/11/26/0894439312464942.

Israel, G. D., K. L. Wilson, and W. T. Haller. 2013. Amount and timing of cash incentives on response to a mail survey. Paper presented at the annual meeting of the Rural Sociological Society, New York, NY, August.

Lamm, A. J., G. D. Israel, and D. Diehl. 2013. A national perspective on the current evaluation activities in Extension. Journal of Extension, 51(1), Article 1FEA1. Available at: http://www.joe.org/joe/2013february/a1.php.

Loosveldt, G. 2008. Face-to-face interviews. In E. D. de Leeuw, J. J. Hox, and D. A. Dillman (Eds.), International handbook of survey methodology (pp. 201–220). New York, NY: Lawrence Erlbaum Associates.

Messer, B. L., and D. A. Dillman. 2011. Using address-based sampling to survey the general public by mail vs. Web plus mail. Public Opinion Quarterly, 75(3), 429–457. doi: 10.1093/poq/nfr021.

Millar, M. M., and D. A. Dillman. 2011. Improving response to Web and mixed-mode surveys. Public Opinion Quarterly, 75(2), 249–269. doi: 10.1093/poq/nfr003.

Sangster, R. L., and B. J. Meekins. 2004. Modeling the likelihood of interviews and refusals: Using call history data to improve efficiency of effort in a national RDD survey. Proceedings of the Joint Statistical Meeting. Retrieved June 13, 2013, from http://www.amstat.org/Sections/Srms/Proceedings/y2004/files/Jsm2004-000520.pdf.

Wilcox, A. S., W. M. Giuliano, and G. D. Israel. 2010. Response rate, nonresponse error, and item nonresponse effects when using financial incentives in wildlife questionnaire surveys. Human Dimensions of Wildlife, 15(4), 288–295.

Tables

Table 1. Typical contact procedure by mode.

  • Face-to-face interview: 1st contact, call to set up the interview; 2nd, follow-up call if necessary.

  • Telephone survey: 1st contact, initial call for the interview; 2nd through 6th (or more), calls to non-respondents.

  • Paper and pen survey for groups: 1st contact, verbal introduction of the survey and its administration.

  • Paper and pen survey by mail: 1st contact, pre-letter alerting the person; 2nd, survey packet (cover letter, questionnaire, and return envelope); 3rd, reminder postcard; 4th and 5th, survey packets to non-respondents.

  • Web-hosted survey: 1st contact, email message with a link to the survey URL; 2nd through 5th, email messages with the link sent to non-respondents.

  • Mixed-mode survey: 1st contact, either a mailed pre-letter or an email message with the link; 2nd and 3rd, email messages with the link sent to non-respondents; 4th and 5th, survey packets sent to non-respondents.
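For planning purposes, the schedule in Table 1 can also be encoded as data. The following minimal sketch (not part of the original publication) stores each mode's contact sequence in a Python dictionary so that a contact plan can be printed for any mode.

```python
# A minimal sketch (not from this publication): Table 1 encoded as a
# dictionary so a contact plan can be looked up by survey mode.
contact_plan = {
    "face-to-face interview": ["call to set up interview",
                               "follow-up call if necessary"],
    "telephone survey": ["initial call"] + ["call to non-respondents"] * 5,
    "paper and pen survey for groups": ["verbal introduction and "
                                        "administration"],
    "paper and pen survey by mail": ["pre-letter", "survey packet",
                                     "reminder postcard",
                                     "survey packet to non-respondents",
                                     "survey packet to non-respondents"],
    "web-hosted survey": ["email with survey link"]
                         + ["email with link to non-respondents"] * 4,
    "mixed-mode survey": ["pre-letter or email with link",
                          "email with link to non-respondents",
                          "email with link to non-respondents",
                          "survey packet to non-respondents",
                          "survey packet to non-respondents"],
}

for step, action in enumerate(contact_plan["paper and pen survey by mail"], 1):
    print(f"Contact {step}: {action}")
```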

Footnotes

1. This document is AEC394, one of a series of the Agricultural Education and Communication Department, UF/IFAS Extension. Original publication date October 2013. Visit the EDIS website at http://edis.ifas.ufl.edu.

2. Glenn D. Israel, professor, and Jessica L. Gouldthorpe, doctoral candidate; Agricultural Education and Communication Department; UF/IFAS Extension, Gainesville, FL 32611. The authors wish to thank Cheri Brodeur, Alexa Lamm, Marilyn Smith, and Robert Torres for their helpful suggestions on an earlier draft.


The Institute of Food and Agricultural Sciences (IFAS) is an Equal Opportunity Institution authorized to provide research, educational information and other services only to individuals and institutions that function with non-discrimination with respect to race, creed, color, religion, age, disability, sex, sexual orientation, marital status, national origin, political opinions or affiliations. For more information on obtaining other UF/IFAS Extension publications, contact your county's UF/IFAS Extension office.

U.S. Department of Agriculture, UF/IFAS Extension Service, University of Florida, IFAS, Florida A & M University Cooperative Extension Program, and Boards of County Commissioners Cooperating. Nick T. Place, dean for UF/IFAS Extension.