Demographic Question Placement: Effect on Item Response Rates and Means of a Veterans Health Administration Survey

Abstract

Purpose

This study used a web-based survey of Veterans Health Administration employees to examine whether demographic question placement affects completion rates for demographic and non-demographic questions, non-demographic item means, and the rate of blank questionnaires.

Methodology

Data came from the 2010 Voice of the Veterans Administration Survey (VoVA), a voluntary, confidential, web-based survey offered to all VA employees. Each participant received one of two versions of the questionnaire: one version placed the demographic questions at the beginning, and the other placed them at the end.
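The article does not publish its assignment procedure as code; the Python sketch below is only an illustration of the two-version design, assuming simple random assignment (all item names and counts here are hypothetical, not from the VoVA system):

import random

# Hypothetical item identifiers; the actual VoVA instrument contains ten
# demographic items (listed in the Appendix) plus many attitude items.
DEMOGRAPHIC_ITEMS = ["occupation", "gender", "age", "ethnicity", "race",
                     "va_tenure", "supervisory_level", "setting",
                     "service_type", "prior_va_training"]
NON_DEMOGRAPHIC_ITEMS = ["item_%02d" % i for i in range(1, 31)]

def build_version(demographics_first):
    """Assemble one of the two questionnaire orderings compared in the study."""
    if demographics_first:
        return DEMOGRAPHIC_ITEMS + NON_DEMOGRAPHIC_ITEMS
    return NON_DEMOGRAPHIC_ITEMS + DEMOGRAPHIC_ITEMS

def assign_version(rng):
    """Give each respondent one of the two orderings with equal probability."""
    return build_version(demographics_first=rng.random() < 0.5)

rng = random.Random(42)  # fixed seed so the example is reproducible
questionnaire = assign_version(rng)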

Findings

Results indicated that placing demographic questions at the beginning of the questionnaire increased the item response rate for demographic items without affecting the item response rate for non-demographic items or the average of item mean scores.
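The abstract does not state which statistical tests the authors used. As a rough sketch of how an item response rate comparison between the two versions could be run, the snippet below applies a standard two-proportion z-test to invented completion counts:

from math import erf, sqrt

def two_proportion_z_test(answered_a, n_a, answered_b, n_b):
    """Compare item response rates between the demographics-first (a)
    and demographics-last (b) versions with a two-sided z-test."""
    p_a, p_b = answered_a / n_a, answered_b / n_b
    pooled = (answered_a + answered_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF via erf
    return p_a, p_b, z, p_value

# Invented counts: respondents answering one demographic item per version.
print(two_proportion_z_test(9500, 10000, 9100, 10000))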

Implications

In addition to validity concerns, surveyors aim to maximize response rates and to minimize missing responses. It is therefore important to determine which questionnaire characteristics affect these outcomes. The results of this study suggest that demographic question placement is one such factor.

Originality/Value

There are various opinions about the most advantageous location of demographic questions in questionnaires; however, the issue has rarely been examined empirically. This study uses an experimental design and a large sample size to examine the effects of demographic placement on survey response characteristics.



Author information


Correspondence to Robert Teclaw.

Additional information

The contents do not necessarily represent the views of the Department of Veterans Affairs or the US Government.

Appendix

Demographic Questions (Common to All Questionnaires)

  1.

    Please choose your occupation from the following list. (Choose only one.)

    • (List of 47 occupation codes.)

  2.

    What is your gender?

    • Male

    • Female

  3.

    What is your age?

    • Less than 20 years

    • 20–29

    • 30–39

    • 40–49

    • 50–59

    • 60 years or older

  4.

    Are you Spanish, Hispanic or Latino?

    • Yes

    • No

  5.

    What is your race? (Mark one or more)

    • White

    • Black or African American

    • American Indian or Alaskan Native

    • Asian

    • Native Hawaiian or other Pacific Islander

  6.

    How long have you been with VA?

    • Less than six months

    • Between six months and one year

    • Between one and two years

    • Between two and five years

    • Between five and ten years

    • Between ten and fifteen years

    • Between fifteen and twenty years

    • More than twenty years

  7.

    What is your level of supervisory responsibility?

    • None

    • Team Leader (informal; not responsible for performance ratings)

    • First Line Supervisor (formal; rates performance, e.g.: Foremen, Section Chief)

    • Manager (formal; rates performance, e.g.: Division/Department/Service/Care Line)

    • Executive (formal; rates performance, e.g.: Associate Director, Chief of Staff)

    • Senior Executive (formal; rates performance, e.g.: Network Director, Facility Director)

  8.

    What type of setting do you spend at least 20% of your time in? (You may select up to 5 options)

    • Administrative (non-clinical)

    • Inpatient Care

    • Outpatient Care

    • Extended Care (e.g., Nursing Homes)

    • Research

    • Education

    • Affiliate

  9.

    What is the main type of service you provide? (Please select only one option)

    • Administrative (Non-Clinical)

    • Dental

    • Emergency Medicine (Urgent Care, Emergency Department)

    • Home or Community Care

    • Imaging (Radiology, Nuclear Medicine)

    • Inpatient Medical/Surgical

    • Intensive Care Unit-Critical Care

    • Laboratory and Pathology

    • Medical Specialty

    • Mental Health

    • Nursing Home

    • Pharmacy

    • Primary Care

    • Prosthetics or Sensory Aids

    • Rehabilitation Services

    • Research

    • Spinal Cord Injury

    • Surgery, Anesthesiology or Surgical Specialty Care

    • Other Clinical Service

  10.

    Before becoming a VA employee did you take part in a training or educational program based partly or entirely in VA (such as paid or unpaid internships, residencies, fellowships, or clinical or administrative rotations)?

    • Yes

    • No
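To make the structure of this common demographic block concrete, here is a minimal Python sketch (not from the paper) encoding a few of the items above, so that "choose one" versus "mark one or more" rules can be validated while skipped items are still recorded as item nonresponse:

# Hypothetical encoding of the shared demographic block; only three of the
# ten items are shown, with wording taken from the Appendix above.
DEMOGRAPHIC_BLOCK = [
    {"id": "gender", "text": "What is your gender?",
     "options": ["Male", "Female"], "select": "one"},
    {"id": "age", "text": "What is your age?",
     "options": ["Less than 20 years", "20-29", "30-39",
                 "40-49", "50-59", "60 years or older"],
     "select": "one"},
    {"id": "race", "text": "What is your race?",  # "Mark one or more"
     "options": ["White", "Black or African American",
                 "American Indian or Alaskan Native", "Asian",
                 "Native Hawaiian or other Pacific Islander"],
     "select": "many"},
]

def validate_answer(item, answer):
    """Reject choices outside the closed option set; an empty answer list
    counts as item nonresponse, which a voluntary survey must allow."""
    if not answer:
        return True
    if item["select"] == "one" and len(answer) != 1:
        return False
    return all(choice in item["options"] for choice in answer)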

Cite this article

Teclaw, R., Price, M.C. & Osatuke, K. Demographic Question Placement: Effect on Item Response Rates and Means of a Veterans Health Administration Survey. J Bus Psychol 27, 281–290 (2012). https://doi.org/10.1007/s10869-011-9249-y


Keywords

  • Demographic placement
  • Web-based surveys
  • Item response rate
  • Methodology
  • Veterans Health Administration