Around survey methodology and questionnaire design

2008-03-24

Recently, I read some books on survey design and on how a questionnaire should be constructed so as to maximize the information we gain from it.

Among others, I found Willis’s book, Cognitive Interviewing: A Tool for Improving Questionnaire Design (2005), well structured, though some topics are developed in more depth than what I would call a concise overview. Other readings were more oriented toward statistical techniques for survey design and implementation, including a concise (but very dense!) book by E. S. Lee and R. N. Forthofer, Analyzing Complex Survey Data (2006, Sage Publications).

Also, I’ve reviewed various articles related to survey methodology for large-scale educational assessments (see references (5) and following). Apart from the precision of the related estimators, an interesting issue arises when one wants to design an effective sampling scheme on a computer. Indeed, when conducting a survey, we have to sample the units without replacement, which is quite different from the usual asymptotic approach in behavioral studies, where we mainly use random sampling with replacement (hence assuming an infinite population).
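As a minimal illustration of the difference (a sketch in plain Python on a made-up population, not using any survey package), sampling without replacement from a finite population deflates the variance of the sample mean by the finite population correction, 1 − n/N, compared with the with-replacement (infinite population) formula:

```python
import random
import statistics

# Hypothetical finite population of N = 100 unit values.
random.seed(42)
population = [random.gauss(50, 10) for _ in range(100)]
N, n = len(population), 25

# Sampling WITHOUT replacement (the survey setting): random.sample
srs_wor = random.sample(population, n)

# Sampling WITH replacement (the usual infinite-population assumption):
srs_wr = random.choices(population, k=n)

# Under simple random sampling without replacement, the variance of the
# sample mean is deflated by the finite population correction (FPC):
#   Var(ybar) = (1 - n/N) * S^2 / n
S2 = statistics.variance(population)
fpc = 1 - n / N
var_wr = S2 / n         # with-replacement formula
var_wor = fpc * S2 / n  # without-replacement formula
print(f"FPC = {fpc:.2f}; Var(WR) = {var_wr:.3f}; Var(WOR) = {var_wor:.3f}")
```

With a quarter of the population sampled, the correction is substantial (FPC = 0.75), which is exactly why survey estimators cannot simply reuse the iid formulas.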

Before presenting some of the main issues discussed in Willis’s book, the interested reader may refer to the National Cancer Institute, where some tutorials on Item Response Theory and Questionnaire Design are available. In fact, there is a short PDF article by Gordon Willis, “Cognitive Interviewing: A ‘How to’ Guide”:http://appliedresearch.cancer.gov/areas/cognitive/interview.pdf. I also found PDF guidelines on the Cooperative State Research, Education, and Extension Service. This is not too surprising, since cognitive interviewing has mainly been developed in the context of medical or psychological assessment. Other references are given at the end of this post (1,2). For a general overview of survey methodology, please refer to (13).

Willis’s book is really intended for all questionnaire designers who are interested in how cognitive strategies might affect verbal or web responses.

The following book might also be of interest to the questionnaire/survey designer: Designing and Conducting Health Surveys: A Comprehensive Guide, by Aday and Cornelius (2006). The TOC can be viewed as a PDF, and there is an online chapter excerpt (Chapter 1, in PDF) on the Jossey-Bass website: Thinking about topics for health surveys.

Further, here is a general annotated bibliography on survey methodology (aims, methods, caveats, etc.), available as BibTeX file or HTML page.

There are several ways to implement an on-line questionnaire. Indeed, various technologies are now deployed over the internet, including the usual HTML/PHP web forms (static or dynamic), Perl CGI, Macromedia Flash, or even Java for mobile devices. I plan to put together demonstrations of each of these tools one day…

References

  1. Hughes, K.A. (2004). Comparing pretesting methods: Cognitive interviews, respondent debriefing, and behavior coding. Report of Statistical Research Division, Survey Methodology #2004-02.
  2. Waddington, P.A.J. and Bull, R. (2007). Cognitive interview as a research technique. Social Research Update, vol. 50.
  3. Ackermann, A.C. and Blair, J. (2006). Efficient respondent selection for cognitive interviewing. American Association for Public Opinion Research Conference.
  4. Conrad, F., Blair, J., and Tracy, E. (1999). Verbal reports are data! A theoretical approach to cognitive interviews. Federal Committee on Statistical Methodology Conference. (All proceedings are available to download)
  5. Wilson, M. and Adams, R. (1995). Issues in complex sampling latent variables. Proceedings of the Survey Research Methods Section of the American Statistical Association. (All proceedings from 1978 to 2007 are available to download)
  6. Heuer, R., Doherty, J., and Zwieg, E. (2007). Interview timing data: Simple yet powerful survey instrument development tools. Proceedings of the 62nd Annual Conference of the American Association for Public Opinion Research (AAPOR).
  7. Blair, J. and Presser, S. (1993). Survey procedures for conducting cognitive interviews to pretest questionnaires: A review of theory and practice. Proceedings of the Survey Research Methods Section of the American Statistical Association. (See Note above)
  8. Petroni, R., Sigman, R., Willimack, D., Cohen, S., and Tucker, C. (2004). Response rates and nonresponse in Establishment surveys — BLS and Census Bureau. Presentation to the Federal Economic Statistics Advisory Committee.
  9. Morrisson, R.L., Anderson, A.E., and Brady, C.F. (2005). The effect of data collection software on the cognitive survey response process. Proceedings of the Survey Research Methods Section of the American Statistical Association. (See Note above)
  10. Nichols, E. and Childs, J.H. (2007). Respondent debriefings conducted by experts: A new qualitative methodology for questionnaire evaluation. Report of Statistical Research Division, Survey Methodology #2007-39.
  11. Clark, H.H. and Shaeffer, E.F. (1989). Contributing to discourse. Cognitive Science, 13, 259-294.
  12. Dillman, D.A. and Bowker, D.K. (2001). The Web questionnaire challenge to survey methodologists. In Dimensions of Internet Science, U.-D. Reips & M. Bosnjak (Eds.). Pabst Science Publishers.
  13. Groves, R.M., Fowler, F.J., Couper, M.P., Lepkowski, J.M., Singer, E., and Tourangeau, R. (2004). Survey Methodology. John Wiley & Sons.
  14. Schaeffer, N.C. and Presser, S. (2003). The science of asking questions. Annual Review of Sociology, 29, 65-88.