Demographic Question Placement: Effect on Item Response Rates and Means of a Veterans Health Administration Survey
This study examined whether demographic question placement affects demographic and non-demographic question completion rates, non-demographic item means, and blank questionnaire rates using a web-based survey of Veterans Health Administration employees.
Data were taken from the 2010 Voice of the Veterans Administration Survey (VoVA), a voluntary, confidential, web-based survey offered to all VA employees. Participants received one of two versions of the questionnaire: one with the demographic questions placed at the beginning, and one with the demographic questions placed at the end.
Results indicated that placing demographic questions at the beginning of a questionnaire increased item response rate for demographic items without affecting the item response rate for non-demographic items or the average of item mean scores.
Beyond validity concerns, surveyors aim to maximize response rates and to minimize the number of missing responses, so it is important to determine which questionnaire characteristics affect these outcomes. The results of this study suggest that demographic question placement is one such factor.
There are various opinions about the most advantageous location of demographic questions in questionnaires; however, the issue has rarely been examined empirically. This study uses an experimental design and a large sample to examine the effects of demographic question placement on survey response characteristics.
Journal of Business and Psychology
Volume 27, Issue 3, pp. 281–290
- Springer US
- Demographic placement
- Web-based surveys
- Item response rate
- Veterans Health Administration
- Author Affiliations
- 1. Veterans Health Administration-National Center for Organization Development, 11500 Northlake Drive, Suite 230, Cincinnati, OH, 45249, USA
- 2. Xavier University, 3800 Victory Parkway, Cincinnati, OH, 45207-6511, USA