Political Behavior, Volume 29, Issue 4, pp 415–440

Beyond the “Narrow Data Base”: Another Convenience Sample for Experimental Research

  • Cindy D. Kam
  • Jennifer R. Wilking
  • Elizabeth J. Zechmeister
Original Paper


The experimental approach has begun to permeate political science research, increasingly so in the last decade. Laboratory researchers face at least two challenges: determining whom to study and how to lure them into the lab. Most experimental studies rely on student samples, yet skeptics often dismiss student samples for lack of external validity. In this article, we propose another convenience sample for laboratory research: campus staff. We report on a randomized experiment that investigates the characteristics of samples drawn from a general local population and from campus staff. Campus staff show significantly higher response rates, and we find few discernible differences between the two samples. We also investigate the second challenge facing researchers: how to lure subjects into the lab. Using evidence from three focus groups, we identify ways of attracting this alternative convenience sample into the lab. We analyze the impact of self-interest, social-utility, and neutral appeals on study participation, and we find that campus staff respond better to a no-nonsense approach than to a hard sell that promises potential policy benefits to the community or, especially, to the subjects themselves. We conclude that researchers should craft appeals with caution as they capitalize on this heretofore largely untapped reservoir for experimental research: campus employees.


Keywords: Experiments · External validity · Turnout · Subject pools · Student samples



We thank Andrea Morrison, Emerald Nguyen, Carl Palmer, Jeremy Poryes, Jennifer Ramos, Nathalie Trepo, Derek Tripp, and Whitney Wilking for research assistance. We gratefully acknowledge financial support from the UC Davis Department of Political Science Faculty-Student Collaborative Fellowship, the UC Davis Institute for Governmental Affairs, the UC Davis Senate Committee on Research, and the Claremont Graduate University Fletcher Jones Small Grant. Authors’ names are listed alphabetically.

Supplementary material

References

  1. Achen, Christopher H. (1992). Social psychology, demographic variables, and linear regression: Breaking the iron triangle in voting research. Political Behavior, 14, 195–211.
  2. Aronson, Elliot, Wilson, Timothy D., & Brewer, Marilynn B. (1998). Experimentation in social psychology. In Daniel T. Gilbert, Susan T. Fiske, & Gardner Lindzey (Eds.), The handbook of social psychology (pp. 99–142). Boston: McGraw-Hill.
  3. Brooks, Deborah Jordan, & Geer, John G. (2007). Beyond negativity: The effects of incivility on the electorate. American Journal of Political Science, 51, 1–16.
  4. Campbell, Donald, & Stanley, Julian C. (1963). Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin.
  5. Childers, Terry L., Pride, William M., & Ferrell, O. C. (1980). A reassessment of the effects of appeals on response to mail surveys. Journal of Marketing Research, 17, 365–370.
  6. Cook, Thomas D., & Campbell, Donald T. (1979). Quasi-experimentation: Design and analysis for field settings. Boston: Houghton Mifflin.
  7. Delli Carpini, Michael X., & Keeter, Scott (1993). Measuring political knowledge: Putting first things first. American Journal of Political Science, 37, 1179–1206.
  8. Dillman, Don A. (1991). The design and administration of mail surveys. Annual Review of Sociology, 17, 225–249.
  9. Dillman, Don A. (2007). Mail and internet surveys: The tailored design method (2nd ed.). Hoboken, NJ: John Wiley & Sons.
  10. Dillman, Don A., Singer, Eleanor, Clark, Jon R., & Treat, James B. (1996). Effects of benefits appeals, mandatory appeals, and variations in statements of confidentiality on completion rates for Census questionnaires. Public Opinion Quarterly, 60, 376–389.
  11. Dobbins, Gregory H., Lane, Irving M., & Steiner, Dirk D. (1988). A note on the role of laboratory methodologies in applied behavioural research: Don’t throw out the baby with the bath water. Journal of Organizational Behavior, 9, 281–286.
  12. Druckman, James N. (2004). Political preference formation: Competition, deliberation, and the (ir)relevance of framing effects. American Political Science Review, 98, 671–686.
  13. Druckman, James N., Green, Donald P., Kuklinski, James H., & Lupia, Arthur (2006). The growth and development of experimental research in political science. American Political Science Review, 100, 627–635.
  14. Druckman, James N., & Nelson, Kjersten R. (2003). Framing and deliberation: How citizens’ conversations limit elite influence. American Journal of Political Science, 47, 729–745.
  15. Fowler, James H., & Kam, Cindy D. (2006). Patience as a political virtue: Delayed gratification and turnout. Political Behavior, 28, 113–128.
  16. Funk, Carolyn L. (1997). Implications of political expertise in candidate trait evaluations. Political Research Quarterly, 50, 675–697.
  17. Gerber, Alan S., & Green, Donald P. (2000a). The effect of a nonpartisan get-out-the-vote drive: An experimental study of leafletting. Journal of Politics, 62, 846–857.
  18. Gerber, Alan S., & Green, Donald P. (2000b). The effects of canvassing, telephone calls, and direct mail on voter turnout: A field experiment. American Political Science Review, 94, 653–663.
  19. Gordon, Michael E., Slade, L. A., & Schmitt, Neal (1986). The ‘science’ of the sophomore revisited: From conjecture to empiricism. The Academy of Management Review, 11, 191–207.
  20. Gordon, Michael E., Slade, L. A., & Schmitt, Neal (1987). Student guinea pigs: Porcine predictors and particularistic phenomena. The Academy of Management Review, 12, 160–163.
  21. Green, Donald, & Gerber, Alan (2002). Reclaiming the experimental tradition in political science. In Ira Katznelson & Helen V. Milner (Eds.), Political science: State of the discipline (pp. 803–832). New York: Norton.
  22. Greenberg, Jerald (1987). The college sophomore as guinea pig: Setting the record straight. The Academy of Management Review, 12, 157–159.
  23. Groves, Robert M., Cialdini, Robert B., & Couper, Mick P. (1992). Understanding the decision to participate in a survey. Public Opinion Quarterly, 56, 475–495.
  24. Henrich, Joseph (2000). Does culture matter in economic behavior? Ultimatum game bargaining among the Machiguenga of the Peruvian Amazon. American Economic Review, 90, 973–979.
  25. Henrich, Joseph, Boyd, Robert, Bowles, Samuel, & Camerer, Colin et al. (2001). In search of homo economicus: Behavioral experiments in 15 small-scale societies. American Economic Review, 91, 73–78.
  26. Houston, Michael J., & Nevin, John R. (1977). The effects of source and appeal on mail survey response patterns. Journal of Marketing Research, 14, 374–378.
  27. Ishiyama, John T., & Hartlaub, Stephen (2002). Does the wording of syllabi affect student course assessment in introductory political science classes? PS: Political Science and Politics, 35, 567–570.
  28. Kahn, Kim Fridkin, & Geer, John G. (1994). Creating impressions: An experimental investigation of political advertising on television. Political Behavior, 16, 93–116.
  29. Kam, Cindy D. (2005). Who toes the party line? Cues, values, and individual differences. Political Behavior, 27, 163–182.
  30. King, Gary, Keohane, Robert O., & Verba, Sidney (1994). Designing social inquiry. Princeton, NJ: Princeton University Press.
  31. Kropf, Martha E., & Blair, Johnny (2005). Eliciting survey cooperation: Incentives, self-interest, and norms of cooperation. Evaluation Review, 29, 559–575.
  32. Lupia, Arthur (2002). New ideas in experimental political science. Political Analysis, 10, 319–324.
  33. McDermott, Rose (2002a). Experimental methodology in political science. Political Analysis, 10, 325–342.
  34. McDermott, Rose (2002b). Experimental methods in political science. Annual Review of Political Science, 5, 31–61.
  35. McGraw, Kathleen M., & Hoekstra, Valerie (1994). Experimentation in political science: Historical trends and future directions. In Michael X. Delli Carpini, Leonie Huddy, & Robert Y. Shapiro (Eds.), Research in micropolitics, Vol. 4: New directions in political psychology. Greenwich, CT: JAI Press.
  36. Merolla, Jennifer, Stephenson, Laura, & Zechmeister, Elizabeth J. (2007). Applying experimental methods to the study of information shortcuts in Mexico. Política y Gobierno, 14, 117–142.
  37. Miller, Joanne A., & Krosnick, Jon A. (2000). News media impact on the ingredients of presidential evaluations: Politically knowledgeable citizens are guided by a trusted source. American Journal of Political Science, 44, 295–309.
  38. Mintz, Alex, Redd, Steven B., & Vedlitz, Arnold (2006). Can we generalize from student experiments to the real world in political science, military affairs, and international relations? Journal of Conflict Resolution, 50, 757–776.
  39. Mintz, Alex, & Geva, Nehemia (1993). Why don’t democracies fight each other? An experimental study. Journal of Conflict Resolution, 37, 484–503.
  40. Morton, Rebecca B., & Williams, Kenneth C. (Forthcoming). Experimentation in political science. In Janet Box-Steffensmeier, David Collier, & Henry Brady (Eds.), Oxford handbook of political methodology.
  41. Peterson, Robert A. (2001). On the use of college students in social science research: Insights from a second-order meta-analysis. Journal of Consumer Research, 28, 450–461.
  42. Porter, Stephen R. (2004). Raising responses: What works? New Directions for Institutional Research, Spring, 5–21.
  43. Scott, John T., Matland, Richard E., Michelbach, Philip A., & Bornstein, Brian H. (2001). Just deserts: An experimental study of distributive justice norms. American Journal of Political Science, 45, 749–767.
  44. Sears, David O. (1986). College sophomores in the laboratory: Influences of a narrow data base on social psychology’s view of human nature. Journal of Personality and Social Psychology, 51, 515–530.
  45. Shafir, Eldar, Simonson, Itamar, & Tversky, Amos (1993). Reason-based choice. Cognition, 49, 11–36.
  46. Sigelman, Lee, Sigelman, Carol K., & Bullock, David (1991). Reconsidering pocketbook voting: An experimental approach. Political Behavior, 13, 129–149.
  47. Slade, L. Allen, & Gordon, Michael E. (1988). On the virtues of laboratory babies and student bath water: A reply to Dobbins, Lane, and Steiner. Journal of Organizational Behavior, 9, 373–376.
  48. Stenner, Karen (2005). The authoritarian dynamic. Cambridge: Cambridge University Press.
  49. Transue, John E. (2007). Identity salience, identity acceptance, and racial policy attitudes: American national identity as a uniting force. American Journal of Political Science, 51, 78–91.
  50. Webster, Cynthia (1997). Effects of researcher presence and appeal on response quality in hand-delivered, self-administered surveys. Journal of Business Research, 38, 105–114.

Copyright information

© Springer Science+Business Media, LLC 2007

Authors and Affiliations

  • Cindy D. Kam (corresponding author)
  • Jennifer R. Wilking
  • Elizabeth J. Zechmeister

  All authors: Department of Political Science, University of California, Davis, Davis, USA
