The Savvy Survey #4: Details in the Design
As part of the Savvy Survey Series, this publication provides an overview of important facets of the survey process. Topics covered include modes for collecting responses, strategies for contacting clients and personalizing contacts, and tips for using incentives. The ability of a survey to gather accurate and useful information for assessing program needs or evaluating program outcomes is greatly influenced by the survey's design. Careful attention to detail is essential.
Modes for Collecting Responses
With the expansion of web access and the development of new communication technologies, Extension faculty now have more options than ever before for conducting surveys. Although surveys can be conducted in a wide variety of situations, they fall into two basic types: interviewer-administered and self-administered.
Interviewer-Administered Surveys
Interviewer-administered surveys include:
- Face-to-face interviews with clients to collect in-depth evaluation information or with key informants to identify local needs and assets
- Telephone interviews with clients or a sample of a population
Interviewer-administered surveys give the interviewer the advantage of being able to clarify questions and check the respondents' understanding. On the other hand, interviewers can intentionally or unintentionally influence respondents and introduce bias in the collected data (see Loosveldt, 2008 for more information).
Self-Administered Surveys
Self-administered surveys include:
- Paper and pen questionnaires distributed to clients attending a program to obtain feedback and gauge customer satisfaction
- Paper and pen questionnaires mailed to a sample of a population in a county, region, or state (see, for example, Gaul, Hochmuth, Israel, & Treadwell, 2009; Israel, Easton, & Knox, 1999)
- Online surveys to capture information on knowledge, attitudes, or behaviors (see Lamm, Israel, & Diehl, 2013)
- Mixed-mode surveys using email and mail to contact clients about knowledge, attitudes, or behaviors (see Israel, 2010, 2011, 2013; Newberry & Israel, 2017)
Self-administered surveys cost less than interviewer-administered surveys because they require less staff time and labor. In addition, web and mixed-mode surveys can be less expensive than mail surveys because they avoid some or all postage costs. On the other hand, some clients do not have access to the web or choose not to use it; therefore, web-only surveys should be avoided unless access is essentially universal (see Israel, 2010). For follow-up surveys with clients, universal access would be indicated by 90%–95% of clients having provided an email address. As discussed in The Savvy Survey #1: Introduction, coverage error becomes a greater concern as a larger proportion of the target audience is excluded from the survey.
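As a quick feasibility check, the email-coverage rule of thumb above can be applied directly to a client list. The sketch below (in Python) is a minimal illustration only; the client records and field names are hypothetical.

```python
# Minimal sketch (hypothetical data): check whether enough clients have
# provided an email address to make a web-only follow-up survey feasible.
clients = [
    {"name": "Joe Client", "email": "joe@example.com"},
    {"name": "Ann Farmer", "email": ""},  # no email on file
    {"name": "Lee Grower", "email": "lee@example.com"},
]

with_email = sum(1 for c in clients if c["email"])
coverage = with_email / len(clients)

# The 0.90 cutoff reflects the 90%-95% rule of thumb cited above.
if coverage >= 0.90:
    print(f"Email coverage is {coverage:.0%}: a web-only survey may be feasible.")
else:
    print(f"Email coverage is {coverage:.0%}: plan a mail or mixed-mode survey.")
```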
Self-administered surveys rely on respondents' ability to interpret questions correctly without the help of an interviewer, which makes it especially important to construct these surveys carefully. Detailed information about writing items for a questionnaire and formatting a questionnaire to aid respondents in completing it is available in publications #6a–e and #7 of The Savvy Survey series. Furthermore, additional considerations for using the various survey modes can be found in the following Savvy Survey publications:
The Savvy Survey #10: In-Person-administered Surveys
The Savvy Survey #11: Mail-Based Surveys
The Savvy Survey #12: Telephone Surveys
The Savvy Survey #13: Online Surveys
The Savvy Survey #14: Mixed-mode Surveys
The Savvy Survey #18: Group-administered Surveys
Contact Procedures
One of the principles for conducting a high-quality survey and getting useful data is to make multiple contacts. For a telephone survey, it is common to make 6 to 8 calls to the same number, and some important surveys (such as those conducted by the Bureau of Labor Statistics) make 30 or more calls in attempting to complete a survey (Sangster & Meekins, 2004). Mail and mixed-mode surveys typically use 4 or 5 contacts (see Dillman, Smyth, & Christian, 2014). Regardless of which survey mode is selected, multiple contacts have been shown to increase the number of completed questionnaires.
For each survey mode, the contact procedures are tailored to the situation, as recommended by the Tailored Design Method (Dillman et al., 2014). Table 1 shows a typical contact procedure for each survey mode. For face-to-face interviews, contact to schedule the interview is typically made by telephone (one or two calls). In comparison, a telephone survey starts with an initial call and then includes 5 or more additional calls on different days and at different times to reach those who have not yet completed the survey.
For mail surveys, a pre-letter is often used to alert potential respondents to the survey, followed by the survey packet a few days later. Online surveys, in contrast, often include the link to the survey in the initial contact because email messages are ephemeral; for the same reason, reminders for online surveys should be sent more frequently than those for mail surveys (e.g., after 4 or 5 days rather than a couple of weeks). Finally, the most effective mixed-mode procedures emphasize responding via the web first and then by mail (Israel, 2013; Messer & Dillman, 2011; Millar & Dillman, 2011; Newberry & Israel, 2017).
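To make these timing guidelines concrete, the sketch below turns them into a simple contact calendar. It is a planning aid only, assuming illustrative contact sequences and intervals; the actual number and spacing of contacts should be tailored to each survey situation (Dillman et al., 2014).

```python
from datetime import date, timedelta

# Rough planning sketch: turn the mode-specific intervals discussed above
# into calendar dates. The contact sequences are illustrative, not a
# prescribed protocol.
SCHEDULES = {
    # (days after launch, contact description)
    "mail": [
        (0, "pre-letter announcing the survey"),
        (4, "survey packet with questionnaire"),
        (14, "reminder postcard"),
        (28, "replacement questionnaire to nonrespondents"),
    ],
    "online": [
        (0, "email invitation with survey link"),
        (4, "first email reminder"),   # email reminders come sooner...
        (9, "second email reminder"),  # ...because messages fade quickly
        (14, "final email reminder"),
    ],
}

def contact_schedule(mode: str, launch: date) -> list[tuple[date, str]]:
    """Return (date, contact) pairs for the chosen survey mode."""
    return [(launch + timedelta(days=d), what) for d, what in SCHEDULES[mode]]

for when, what in contact_schedule("online", date(2024, 3, 4)):
    print(when.isoformat(), "-", what)
```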
Personalizing the Survey Process
Research shows that personalization has a small but significant effect on increasing response rates (Dillman et al., 2014). Personalization helps to connect the respondent to the survey. Personalize for each respondent by:
- Using the client's name in contact messages (e.g., Dear Joe Client), or using a group name with which clients identify (e.g., Dear Jackson County Cattlemen)
- Individually signing letters in blue ink for postal contacts (which signals to respondents that the survey is important)
- Using logos, pictures, or graphics on the questionnaire that are tailored to the targeted group to increase the salience of the survey
Adding individual names and other information to contact messages is fairly quick and easy. Spreadsheet files containing names and addresses can be used with word processing applications to do a mail merge that produces a personalized letter or email message for each client (see The Savvy Survey #11: Mail-Based Surveys and The Savvy Survey #13: Online Surveys for examples).
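As a simple illustration of that mail-merge step, the sketch below fills a letter template from a spreadsheet exported as a CSV file. The file name, column headers, and letter wording are all hypothetical.

```python
import csv
from string import Template

# Hypothetical letter template; $name and $county are filled in per client.
LETTER = Template(
    "Dear $name,\n\n"
    "As a participant in our $county County program, you are invited to\n"
    "complete a brief survey about what you learned.\n"
)

# "clients.csv" is a hypothetical export with "name" and "county" columns.
with open("clients.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(LETTER.substitute(name=row["name"], county=row["county"]))
        print("-" * 40)  # separator between letters
```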
Incentives
After the number of contacts, incentives have been shown to increase the number of completed surveys more than any other tactic. Monetary incentives of $1 to $5 delivered with the request to complete the questionnaire are the most effective because they build trust and invoke a norm of reciprocity (Dillman et al., 2014). When feasible, monetary incentives can increase response rates by 9% to 20% for surveys of Extension audiences (Israel, Wilson, & Haller, 2013; Wilcox, Giuliano, & Israel, 2010); unfortunately, however, they are often unavailable or impractical for Extension faculty. Nonmonetary incentives, such as bookmarks, pamphlets, and fact sheets, have also been used, but these tokens of appreciation are less effective.
In Summary
This publication in the Savvy Survey Series has introduced modes for collecting responses to a survey, described the importance of tailoring the contact process to the survey situation while making multiple contacts to maximize the number of completed surveys, and reviewed techniques for personalizing a survey. It also provided information about whether to use an incentive with the survey. Attending to these details of the survey process can significantly affect the amount and quality of the data collected and, in turn, how useful those data are for assessing needs or evaluating program outcomes.
References
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method (4th ed.). Hoboken, NJ: John Wiley and Sons.
Gaul, S. A., Hochmuth, R. C., Israel, G. D., & Treadwell, D. (2009). Characteristics of small farm operators in Florida: Economics, demographics, and preferred information channels and sources. WC088. Gainesville: University of Florida Institute of Food and Agricultural Sciences. Available at: https://edis.ifas.ufl.edu/publication/wc088
Israel, G. D., J. O. Easton, & G. W. Knox. (1999). Adoption of landscape management practices by Florida citizens. HortTechnology, 9(2), 262–266. https://doi.org/10.21273/HORTTECH.9.2.262
Israel, G. D. (2010). Using Web-hosted surveys to obtain responses from Extension clients: A cautionary tale. Journal of Extension, 48(4), Article 4FEA8. Available at: https://archives.joe.org/joe/2010august/a8.php
Israel, G. D. (2011). Strategies for obtaining survey responses from Extension clients: Exploring the role of e-mail requests. Journal of Extension, 49(3), Article 3FEA7. Available at: https://archives.joe.org/joe/2011june/a7.php
Israel, G. D. (2013). Combining mail and e-mail contacts to facilitate participation in mixed-mode surveys. Social Science Computer Review, 31(3), 346–358. https://doi.org/10.1177/0894439312464942
Israel, G. D., Wilson, K. L., & Haller, W. T. (2013). Amount and timing of cash incentives on response to a mail survey. Paper presented at the annual meeting of the Rural Sociological Society, New York, NY, August.
Lamm, A. J., Israel, G. D., & Diehl, D. (2013). A national perspective on the current evaluation activities in Extension. Journal of Extension, 51(1), Article 1FEA1. Available at: https://archives.joe.org/joe/2013february/a1.php
Loosveldt, G. (2008). Face-to-face interviews. In E. D. de Leeuw, J. J. Hox, & D. A. Dillman (Eds.), International handbook of survey methodology (pp. 201–220). New York, NY: Lawrence Erlbaum Associates.
Messer, B. L., & Dillman, D. A. (2011). Using address-based sampling to survey the general public by mail vs. Web plus mail. Public Opinion Quarterly, 75(3), 429–457. https://doi.org/10.1093/poq/nfr021
Millar, M. M., & Dillman, D. A. (2011). Improving response to Web and mixed-mode surveys. Public Opinion Quarterly, 75(2), 249–269. https://doi.org/10.1093/poq/nfr003
Newberry, M. G., III, & Israel, G. D. (2017). Comparing two web/mail mixed-mode contact protocols to a unimode mail survey. Field Methods, 29(3), 281–298. https://doi.org/10.1177/1525822X17693804
Sangster, R. L., & Meekins, B. J. (2004). Modeling the likelihood of interviews and refusals: Using call history data to improve efficiency of effort in a national RDD survey. Proceedings of the Joint Statistical Meeting. Retrieved March 30, 2022, from https://www.bls.gov/osmr/research-papers/2004/pdf/st040090.pdf
Wilcox, A. S., Giuliano, W. M., & Israel, G. D. (2010). Response rate, nonresponse error, and item nonresponse effects when using financial incentives in wildlife questionnaire surveys. Human Dimensions of Wildlife, 15(4), 288–295. https://doi.org/10.1080/10871201003736047