Research in Higher Education, Volume 56, Issue 2, pp 166–177

Living with Smartphones: Does Completion Device Affect Survey Responses?



With the growing reliance on tablets and smartphones for internet access, understanding the effects of completion device on online survey responses becomes increasingly important. This study uses data from the Strategic National Arts Alumni Project, a multi-institution online alumni survey designed to obtain knowledge of arts education, to explore how the type of device a respondent uses (PC, Mac, tablet, or smartphone) affects his or her responses. Differences by device type in the characteristics of survey respondents, survey completion, time spent responding, willingness to answer complex and open-ended questions, and lengths of open-ended responses are discussed.


Keywords: Smartphones · Completion device · Survey response



Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Indiana University, Bloomington, USA
