Review Article

Journal of General Internal Medicine, Volume 29, Issue 1, pp 187-203

Crowdsourcing—Harnessing the Masses to Advance Health and Medicine, a Systematic Review

  • Benjamin L. Ranard, Perelman School of Medicine, University of Pennsylvania
  • Yoonhee P. Ha, Perelman School of Medicine, University of Pennsylvania
  • Zachary F. Meisel, The Leonard Davis Institute of Health Economics, University of Pennsylvania; Department of Emergency Medicine, University of Pennsylvania
  • David A. Asch, Penn Medicine Center for Innovation, University of Pennsylvania; The Leonard Davis Institute of Health Economics, University of Pennsylvania; Philadelphia Veterans Affairs Medical Center; The Wharton School, University of Pennsylvania
  • Shawndra S. Hill, The Wharton School, University of Pennsylvania
  • Lance B. Becker, Department of Emergency Medicine, University of Pennsylvania
  • Anne K. Seymour, University of Pennsylvania Libraries, University of Pennsylvania
  • Raina M. Merchant (corresponding author), Penn Medicine Center for Innovation, University of Pennsylvania; The Leonard Davis Institute of Health Economics, University of Pennsylvania; Department of Emergency Medicine, University of Pennsylvania


Abstract

Objective

Crowdsourcing research allows investigators to engage thousands of people to provide either data or data analysis. However, prior work has not documented the use of crowdsourcing in health and medical research. We sought to systematically review the literature to describe the scope of crowdsourcing in health research and to create a taxonomy to characterize past uses of this methodology for health and medical research.

Data sources

PubMed, Embase, and CINAHL through March 2013.

Study eligibility criteria

Primary peer-reviewed literature that used crowdsourcing for health research.

Study appraisal and synthesis methods

Two authors independently screened studies and abstracted data, including demographics of the crowd engaged and approaches to crowdsourcing.

Results

Twenty-one health-related studies utilizing crowdsourcing met eligibility criteria. Four distinct types of crowdsourcing tasks were identified: problem solving, data processing, surveillance/monitoring, and surveying. These studies collectively engaged a crowd of >136,395 people, yet few studies reported demographics of the crowd. Only one (5 %) reported age, sex, and race statistics, and seven (33 %) reported at least one of these descriptors. Most reports included data on crowdsourcing logistics, such as the duration of crowdsourcing (n = 18, 86 %) and the time to complete the crowdsourcing task (n = 15, 71 %). All articles (n = 21, 100 %) reported employing some method for validating or improving the quality of data reported from the crowd.

Limitations

Gray literature was not searched, and only a sample of online survey articles was included.

Conclusions and implications of key findings

Utilizing crowdsourcing can improve the quality, cost, and speed of a research project while engaging large segments of the public and creating novel science. Standardized guidelines are needed on crowdsourcing metrics that should be collected and reported to provide clarity and comparability in methods.

KEY WORDS

crowdsourcing; crowd sourcing; citizen scientist; citizen science; human computing