Behavior Research Methods, Volume 46, Issue 1, pp 112–130

Nonnaïveté among Amazon Mechanical Turk workers: Consequences and solutions for behavioral researchers


DOI: 10.3758/s13428-013-0365-7

Cite this article as:
Chandler, J., Mueller, P. & Paolacci, G. Behav Res (2014) 46: 112. doi:10.3758/s13428-013-0365-7


Crowdsourcing services—particularly Amazon Mechanical Turk—have made it easy for behavioral scientists to recruit research participants. However, researchers have overlooked crucial differences between crowdsourcing and traditional recruitment methods that provide unique opportunities and challenges. We show that crowdsourced workers are likely to participate across multiple related experiments and that researchers are overzealous in the exclusion of research participants. We describe how both of these problems can be avoided using advanced interface features that also allow prescreening and longitudinal data collection. Using these techniques can minimize the effects of previously ignored drawbacks and expand the scope of crowdsourcing as a tool for psychological research.


Keywords: Crowdsourcing · Internet research · Data quality · Longitudinal research · Mechanical Turk · MTurk

Supplementary material

ESM 1: 13428_2013_365_MOESM1_ESM.xlsx (42 kb)

Copyright information

© Psychonomic Society, Inc. 2013

Authors and Affiliations

  • Jesse Chandler (1)
  • Pam Mueller (2)
  • Gabriele Paolacci (3)
  1. Woodrow Wilson School of Public Affairs, Princeton University, Princeton, USA
  2. Department of Psychology, Princeton University, Princeton, USA
  3. Rotterdam School of Management, Erasmus University, Rotterdam, Netherlands
