CyLog/Crowd4U: A Case Study of a Computing Platform for Cybernetic Dataspaces

Chapter

Abstract

This chapter presents a case study of a computing platform for cybernetic dataspaces. The core component of the platform is a language named CyLog, which models humans as rational data sources in order to provide an integrated abstraction of human/machine computation. Crowd4U is a non-commercial microtask-based crowdsourcing platform being developed by universities. It has an engine for executing CyLog code, maintains a pool of microtasks for crowdsourcing, and supports a variety of incentive and task-assignment structures. This chapter gives an overview of CyLog and Crowd4U and discusses the lessons learned from our experience of running crowdsourcing projects with them.
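
The central idea sketched in the abstract, declarative rules whose evaluation can draw missing facts from people as well as from stored data, can be illustrated with a short, self-contained example. The Python sketch below is only an illustration under assumed names (Relation, simulated_microtask, evaluate_rule, and the book/is_fiction relations are all hypothetical); it does not show CyLog syntax or the Crowd4U API.

    # A minimal, self-contained sketch of "humans as data sources":
    # a rule is evaluated by the machine, and any facts it cannot derive
    # are requested from people as microtasks. All names are illustrative.

    from dataclasses import dataclass, field


    @dataclass
    class Relation:
        """A named set of tuples; an 'open' relation may be extended by human answers."""
        name: str
        open: bool = False
        tuples: set = field(default_factory=set)


    def simulated_microtask(prompt: str) -> str:
        """Stand-in for posting a microtask to a crowd and collecting the answer.
        Returns a canned answer so that this sketch runs offline."""
        return "yes" if "novel" in prompt.lower() else "no"


    def evaluate_rule(src: Relation, dst: Relation, question: str) -> None:
        """For each tuple in `src`, ensure a corresponding fact exists in the
        open relation `dst`; missing facts are obtained from human workers."""
        answered = {key for (key, _answer) in dst.tuples}
        for (key,) in src.tuples:
            if key not in answered:
                answer = simulated_microtask(question.format(item=key))
                dst.tuples.add((key, answer))


    # Machine-stored facts plus a human-populated ("open") relation.
    book = Relation("book", tuples={("A Novel of Two Cities",), ("Database Systems",)})
    is_fiction = Relation("is_fiction", open=True)

    evaluate_rule(book, is_fiction, "Is '{item}' a work of fiction? (yes/no)")
    print(sorted(is_fiction.tuples))

The point of the sketch is the design choice that the abstract attributes to the platform: human answers populate relations in the same way that machine-derived facts do, so a single declarative program can mix human and machine computation.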

Acknowledgements

The author is grateful to the members and collaborators of the FusionCOMP project, and to the contributors who perform microtasks on Crowd4U. Their names are listed at http://crowd4u.org. Note that the list contains only the names of contributors who have accounts on Crowd4U; there are many more anonymous contributors who perform microtasks on the platform. The FusionCOMP project is partially supported by PRESTO from the Japan Science and Technology Agency and by a Grant-in-Aid for Scientific Research (#25240012) from MEXT, Japan.

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  1. University of Tsukuba, Tsukuba, Japan
