
Environmental and Resource Economics, Volume 73, Issue 3, pp 743–758

Subject Pools and Deception in Agricultural and Resource Economics Experiments

  • Timothy N. Cason
  • Steven Y. Wu

Abstract

The use of student subjects and deception in experiments are two controversial issues that often raise concerns among editors and reviewers, which might prevent quality research from being published in agricultural and resource economics (ARE) journals. We provide a self-contained methodological discussion of these issues. We argue that field professionals are the most appropriate subjects for questions related to policy or measurement, and students are the most appropriate subjects for scientific research questions closely tied to economic theory. Active deception, where subjects are provided with explicitly misleading information, has been avoided in the mainstream economics discipline because it can lead to a loss of experimental control, lead to subject selection bias, and impose negative externalities on other researchers. Disciplinary ARE journals may want to abide by these norms against deception to maintain credibility. Interdisciplinary ARE journals may have more flexibility, although it is important to provide guidelines to avoid too much reviewer-specific variation in standards. For ARE researchers, we suggest employing a deception-free experimental design whenever possible because we know of no field in which deception is encouraged.

Keywords

Laboratory experiments · Field experiments · Methodology

JEL Classification

C90 · Q10 · Q30 · Q50

Notes

Acknowledgements

For helpful comments, we would like to thank (without implicating) participants at the CBEAR-MAAP and WCERE conferences, and Simanti Banerjee, Carola Grebitus, David Cooper, Guillaume Fréchette, Nick Hanley, Leah Palm-Forster, Marco Palma, Sharon Raszap, Stephanie Rosch, Matt Rousu, Christian Vossler, and two anonymous referees. Wu gratefully acknowledges financial support from USDA-NIFA HATCH project IND010580.

Copyright information

© Springer Nature B.V. 2018

Authors and Affiliations

  1. Department of Economics, Krannert School of Management, Purdue University, West Lafayette, USA
  2. Agricultural Economics, Purdue University, West Lafayette, USA
