Gamification of Survey Research: Empirical Results from Gamifying a Conjoint Experiment

  • Briana Brownell
  • Jared Cechanowicz
  • Carl Gutwin


One of the most important tools used by the marketing research industry is the consumer survey. These self-reported data form the foundation of many widely applied methodologies for measuring the success of marketing campaigns and strategies. As such, suppliers in the marketing research industry rely on the engagement and attentiveness of the individuals who participate in their research and respond to their surveys. Keeping these respondents engaged is important for reducing the drop-off rate (the rate at which respondents quit before completing a survey), increasing time spent on surveys (which is linked to the quality and quantity of responses), and improving respondents’ subjective enjoyment (since a happy respondent is more likely to complete future surveys). There is also evidence that engagement influences data quality, since bored or inattentive respondents produce lower-quality data (Cape, 2009). Keeping respondents engaged and willing to participate in research is therefore critical both to industry providers and to the clients who use the results of that research in their decision making.


Keywords: Gamification · Games · Survey · Motivation · Market research


  1. Adamou, B. (n.d.a). Using gamification in youth surveys. Retrieved from
  2. Adamou, B. (n.d.b). Giving research the NOS effect. Proceedings of 2012 Net Gain 6.0 MRIA Conference.
  3. Brown, J. (2003). Survey metrics ward off problems. Marketing News, 17, 17–20.
  4. Cape, P. (2009). Questionnaire length, fatigue effects and response quality revisited. SSI white paper.
  5. Carson, R., Louviere, J., Anderson, D., Arabie, P., Bunch, D., Hensher, D., et al. (1994). Experimental analysis of choice. Marketing Letters, 5(4), 351–368.
  6. Center for Game Science at University of Washington: Foldit. Retrieved from
  7. Chao, D. (2001). Doom as an interface for process management. Proceedings of CHI 2001 (pp. 152–157).
  8. Chrons, O., & Sundell, S. (2011). Digitalkoot: Making old archives accessible using crowdsourcing. Proceedings of HCOMP.
  9. Chrzan, K., & Terry, E. (1995). Partial profile choice experiments: A choice based approach for handling large numbers of attributes. 1995 Advanced Research Techniques Conference Proceedings. Chicago, IL: American Marketing Association.
  10. Deterding, S., Dixon, D., Nacke, L., O’Hara, K., & Sicart, M. (2011). Gamification: Using game design elements in non-gaming contexts. CHI 2011 Ext. Abstracts.
  11. Deutskens, E., de Ruyter, K., Wetzels, M., & Oosterveld, P. (2004). Response rate and response quality of internet-based surveys: An experimental study. Marketing Letters, 15(1), 21–36.
  12. Dignan, A. (2011). Game frame: Using games as a strategy for success. New York: Free Press.
  13. Evans, J. R., & Mathur, A. (2005). The value of online surveys. Internet Research, 15(2), 195–219.
  14. Flatla, D. R., Gutwin, C., Nacke, L. E., Bateman, S., & Mandryk, R. L. (2011). Calibration games: Making calibration tasks enjoyable by adding motivating game elements. Proceedings of UIST 2011 (pp. 403–412).
  15. Greenbook Industry Trends Report (2013, Winter).
  16. Guin, T. D., Baker, R., Mechling, J., & Ruylea, E. (2012). Myths and realities of respondent engagement in online surveys. International Journal of Market Research, 54(5), 613–633.
  17. Herzog, A. R., & Bachman, J. G. (1981). Effects of questionnaire length on response quality. Public Opinion Quarterly, 45, 549–559.
  18. Hunicke, R., LeBlanc, M., & Zubek, R. (2004). MDA: A formal approach to game design and game research. Proceedings of AAAI04 WS on Challenges in Game AI (pp. 1–5).
  19. Johnson, R., & Orme, B. (1996). How many questions should you ask in choice-based conjoint studies? Sawtooth Software research paper series.
  20. Jung, J., Schneider, C., & Valacich, J. (2010). Enhancing the motivational affordance of information systems: The effects of real-time performance feedback and goal setting in group collaboration environments. Management Science, 56(4), 724–742.
  21. Korn, O. (2012). Industrial playgrounds: How gamification helps to enrich work for elderly or impaired persons in production. Proceedings of EICS 2012 (pp. 313–316).
  22. Lewis, C., Wardrip-Fruin, N., & Whitehead, J. (2012). Motivational game design patterns of ‘ville games. Proceedings of FDG 2012 (pp. 172–179).
  23. Luce, R. D., & Tukey, J. W. (1964). Simultaneous conjoint measurement: A new scale type of fundamental measurement. Journal of Mathematical Psychology, 1(1), 1–27. doi: 10.1016/0022-2496(64)90015-X.
  24. Malinoff, B. (2010). Sexy questions, dangerous answers. Proceedings of CASRO 2010 Technology Conference.
  25. Mason, A., Michalakidis, G., & Krause, P. (2012). Tiger nation: Empowering citizen scientists. Proceedings of IEEE DEST.
  26. Mekler, D., Brühlmann, F., Opwis, K., & Tuch, N. (2013). Disassembling gamification: The effects of points and meaning on user motivation and performance. CHI 2013.
  27. Nikkila, S., Byrne, D., Sundaram, H., Kelliher, A., & Linn, S. (2013). Taskville: Visualizing tasks and raising awareness in the workplace. CHI 2013 Ext. Abstracts.
  28. O’Brien, H. L., & Toms, E. G. (2010). The development and evaluation of a survey to measure user engagement. Journal of the American Society for Information Science and Technology, 61(1), 50–69.
  29. Orme, B. (2010). Getting started with conjoint analysis: Strategies for product design and pricing research (2nd ed.). Glendale, CA: Research Publishers LLC.
  30. Porter, S., & Whitcomb, M. (2003). The impact of contact type on web survey response rates. Public Opinion Quarterly, 67(4), 579–588.
  31. Puleston, J., & Sleep, D. (2011). The game experiments: Researching how gaming techniques can be used to improve the quality of feedback from online research. Proceedings of ESOMAR Congress.
  32. Ray, N. M., & Tabor, S. W. (2003). Cyber surveys come of age. Marketing Research, 15, 32–37.
  33. Reeves, B., & Read, J. (2009). Total engagement: Using games and virtual worlds to change the way people work and businesses compete. Boston: Harvard Business Press.
  34. Research through gaming: Pimple crisis. Retrieved from
  35. Research through gaming: The Playspondent House. Retrieved from
  36. Ryan, R. M., Rigby, C. S., & Przybylski, A. (2006). The motivational pull of video games: A self-determination theory approach. Motivation and Emotion, 30(4), 347–363.
  37. Sawtooth Software Inc. (1993). The CBC system for choice-based conjoint analysis—Version 8. Sawtooth Software technical paper series.
  38. Shneiderman, B. (2004). Designing for fun: How can we design user interfaces to be more fun? Interactions, 11(5), 48–50.
  39. Sweetser, P., & Wyeth, P. (2005). GameFlow: A model for evaluating player enjoyment in games. Computers in Entertainment, 3(3), 1–24.
  40. Thomas, R., Bremer, J., Terhanian, G., & Couper, M. P. (2007). Truth in measurement: Comparing web-based interview techniques. Proceedings of ESOMAR Congress.
  41. Von Ahn, L., & Dabbish, L. (2008). Designing games with a purpose. Communications of the ACM, 51(8), 58–67.

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Briana Brownell (1, 2)
  • Jared Cechanowicz (2)
  • Carl Gutwin (2)
  1. Insightrix Research, Saskatoon, Canada
  2. University of Saskatchewan, Saskatoon, Canada