Investigating the Amazon Mechanical Turk Market Through Tool Design

Abstract

We developed TurkBench to better understand the work of crowdworkers on the Amazon Mechanical Turk (AMT) marketplace. While we aimed to reduce the amount of invisible, unpaid work that these crowdworkers perform, we also probed their day-to-day practices. Through this probe we encountered a number of previously unreported difficulties that are representative of those crowdworkers face both in building their own tools and in working on AMT. In this article, our contributions are insights into 1) a number of breakdowns that are occurring on AMT and 2) how the AMT platform is being appropriated in ways that mitigate some breakdowns while exacerbating others. The breakdowns that we specifically discuss in this paper are the increasing velocity of the market (good HITs are grabbed within seconds), the high degree of flexibility that requesters can and do exercise in specifying their HITs, and the difficulty crowdworkers had in navigating the market due to the large variation in how requesters construct HITs. When the velocity of the market is combined with a poor search interface, this variation, and little infrastructural support for workers, the resulting work environment can be frustrating and difficult to thrive in.


Fig. 1
Fig. 2
Fig. 3
Fig. 4

Notes

  1. Of course, these are not fully distinct categories, as information and organizational levers often have their instantiation in functionality.

  2. http://www.turkwork.differenceengines.com

  3. http://www.mturk-tracker.com/

  4. http://turkernation.com/

  5. Examples of qualifications include user-based statistics, such as the percentage of HITs accepted, which is managed by Amazon, as well as the numerous qualifications assigned by requesters that range from the rather self-explanatory, e.g. Real Estate Link Qualification (NEW), to the incomprehensible, e.g. Global_test_NOTAUGR_CVs.

  6. With the exception of aspects that the Turkers could not change, e.g. location.

  7. http://www.mturklist.com/

  8. http://www.turkalert.com/

  9. https://greasyfork.org/en/scripts/4771-turkmaster-mturk

  10. https://greasyfork.org/en/scripts/2002-hit-scraper-with-export

  11. http://mturkscripts.com/

  12. http://www.crowdflower.com/blog/2014/01/crowdflower-drops-mechanical-turk-to-ensure-the-best-results-for-its-customers

  13. http://uberlawsuit.com/OrderDenying.pdf


Author information


Corresponding author

Correspondence to Benjamin V. Hanrahan.


About this article

Cite this article

Hanrahan, B.V., Martin, D., Willamowski, J. et al. Investigating the Amazon Mechanical Turk Market Through Tool Design. Comput Supported Coop Work 28, 795–814 (2019). https://doi.org/10.1007/s10606-018-9312-6

Keywords

  • Crowdsourcing
  • Amazon Mechanical Turk