Five Tips from the UIC Survey Research Laboratory

The Survey News Bulletin
Post date: May 11, 2015

1. Avoid using some/other questions

In a recent article examining the use of "some/other" questions (i.e., those introduced by saying "Some people think that…, but other people think…."), researchers concluded that these questions increase question complexity and length without improving the validity of responses. Respondents took longer to answer these questions, and there was little evidence that they improved respondents' answers. The authors instead recommend short, direct questions that avoid unconventional response option orders.

Yeager, D. S., & Krosnick, J. A. (2012). Does mentioning "some people" and "other people" in an opinion question improve measurement quality? Public Opinion Quarterly, 76, 131-141.

2. Give clarifying information before the question

Researchers often want to provide instructions that clarify inclusion/exclusion criteria for a survey question (e.g., respondents must know what to count or not count as a "shoe" in a question asking about the number of pairs of shoes they own). Recent evidence suggests that it is better to give such instructions before the question than after it. Decomposing the question into its component parts (e.g., asking separate questions about different types of shoes) may yield the most accurate responses, although asking multiple questions takes longer than providing instructions.

Redline, C. (2013). Clarifying categorical concepts in a Web survey. Public Opinion Quarterly, 77, 81-105.

3. Know the standard response rate for your type of study

The 7th edition of the Standard Definitions document, published in 2011 by the American Association for Public Opinion Research (AAPOR), contains standardized formulas for calculating response rates, cooperation rates, and refusal rates for telephone, in-person, mail, and internet surveys. Use of these formulas facilitates meaningful comparisons across surveys, and many professional journals now require them when reporting findings from primary survey data collection efforts. We strongly encourage their use. For more information, see

The American Association for Public Opinion Research. (2011). Standard definitions: Final dispositions of case codes and outcome rates for surveys (7th ed.). AAPOR.
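As one example, AAPOR's Response Rate 1 (RR1) divides completed interviews by all eligible and potentially eligible cases. A minimal sketch in Python (the case counts in the usage example are hypothetical, not from any real study):

```python
def aapor_rr1(I, P, R, NC, O, UH, UO):
    """AAPOR Response Rate 1 (RR1).

    I  = complete interviews        P  = partial interviews
    R  = refusals and break-offs    NC = non-contacts
    O  = other eligible non-interviews
    UH = unknown if household/occupied
    UO = unknown eligibility, other
    """
    # Completes over all eligible plus unknown-eligibility cases.
    return I / ((I + P) + (R + NC + O) + (UH + UO))

# Hypothetical disposition counts for illustration only:
rate = aapor_rr1(I=550, P=50, R=200, NC=100, O=25, UH=50, UO=25)
print(f"RR1 = {rate:.2f}")  # 550 / 1000 cases
```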

4. Budget accordingly

Developing a survey requires trade-offs between data quality and the cost of obtaining the data. Beyond the labor costs associated with professional time for survey/sample design, questionnaire development, and data analysis, there are myriad expenses that apply depending on your mode of data collection and study design. Whether you are fielding your own data collection effort or hiring a professional survey research firm to collect data, here is a sample list of expenses and cost drivers you should anticipate:

  • Questionnaire length (for mail surveys, affects printing, postage, and data entry costs; for in-person and telephone interviews, affects total interview time and interviewer costs)
  • Geographic dispersion of the sample
  • Whether screening is required to find target population
  • Printing costs
  • Postage
  • Telephone charges
  • Materials (envelopes, paper)
  • Translation of questionnaire and recruitment materials into other languages
  • Pretesting
  • Respondent incentives and disbursement costs (if incentives are mailed to subjects later, labor, materials, and postage in addition to the incentive itself)
  • Number of contact attempts
  • Interviewer travel time and mileage
  • Validation
  • Data entry and processing
  • Software license fees for Web or computer-assisted interview questionnaires
  • Equipment purchases (computers or other electronic data collection tools such as tablets for in-person surveys in particular)

Blair, J. E., Czaja, R. F., & Blair, E. A. (2013). Designing surveys (3rd ed., pp. 337-343). Thousand Oaks, CA: Sage.
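To make these trade-offs concrete, the line items above can be folded into a rough cost estimate. The sketch below is purely illustrative for a mail survey; the function name and every figure in the usage example are hypothetical placeholders, not recommended rates:

```python
def mail_survey_cost(n_mailed, printing_per_packet, postage_per_packet,
                     incentive_per_packet, expected_return_rate,
                     data_entry_per_return, fixed_costs):
    """Rough mail-survey budget: per-packet costs scale with the mailing,
    data entry scales with expected returns, fixed costs do not scale."""
    per_packet = printing_per_packet + postage_per_packet + incentive_per_packet
    expected_returns = n_mailed * expected_return_rate
    return (n_mailed * per_packet
            + expected_returns * data_entry_per_return
            + fixed_costs)

# All figures below are invented for illustration:
total = mail_survey_cost(n_mailed=1000,
                         printing_per_packet=1.50,
                         postage_per_packet=2.00,
                         incentive_per_packet=2.00,
                         expected_return_rate=0.40,
                         data_entry_per_return=3.00,
                         fixed_costs=5000.00)
print(f"Estimated budget: ${total:,.2f}")
```

A sketch like this also makes sensitivity obvious: doubling the incentive raises the per-packet cost on every piece mailed, while data-entry costs rise only with completed returns.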

5. Be realistic about your data collection timeline

Everyone wants their data fast, but it takes time to design and conduct a high-quality survey. A typical Web survey can take 3-5 months and a mail survey 4-6 months, while the time needed for a telephone or face-to-face survey depends on many factors that don't lend themselves to predictable time frames (such as the geographic dispersion of the population and the sample size). Steps that affect data collection time include the following:

  • Questionnaire development and testing (especially when new questions or measures are being developed, or when the questionnaire must be programmed for Web self-administration or for interviewer administration in computer-assisted interviewing (CAI) software). This can be particularly time consuming if many stakeholders are involved in the questionnaire development process.
  • Sample frame development.
  • IRB review and approval (always leave time to respond to requested modifications; do not expect approval upon initial submission).
  • Cognitive pretesting.
  • A thorough pilot study.
  • Time to amend the IRB protocol based on the pilot study.
  • Adequate time to collect data; plan for more than you think you need (for example, you may decide to do another mailing if returns have been slow to come in).
  • Time for data processing and cleaning before the final data set is ready.

The UIC Survey Research Laboratory routinely distributes, free of charge, brief email bulletins regarding best practices in the conduct of survey research. These news bulletins are distributed once a week during the academic year to faculty, staff and students on each of the University of Illinois campuses.
