Design of CQA Systems for Flexible and Scalable Deployment and Evaluation

  • Ivan Srba
  • Maria Bielikova
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9671)

Abstract

The success of Community Question Answering (CQA) systems on the open web (e.g. Yahoo! Answers) has motivated their adoption in new contexts (e.g. education or enterprise) and environments (e.g. inside organizations). Despite initial research on how the specifics of these settings influence the design of CQA systems, many problems remain unaddressed, in particular poor flexibility and scalability, which hamper: (1) the deployment of essential CQA features in various settings (e.g. in different educational organizations); and (2) the effective evaluation of collaboration support methods (e.g. in offline as well as in live experiments). In this paper, we provide design recommendations for achieving flexible and scalable deployment and evaluation by means of a case study on Askalot, an educational and organizational CQA system. Its universal and configurable features allowed us to deploy it at two universities as well as in the MOOC system edX. In addition, its experimental infrastructure lets us integrate various collaboration support methods that are loosely coupled and can easily be evaluated online as well as offline, with datasets from Askalot itself or even from all CQA systems built on top of the Stack Exchange platform.
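The loose coupling described above can be illustrated with a minimal sketch: a collaboration support method (here a toy question router) that consumes Q&A events through one narrow interface, so the same method can be driven by a live event stream or by replaying an archived dataset such as a Stack Exchange dump. All class and function names below (`SupportMethod`, `ActivityBasedRouter`, `replay`) are hypothetical illustrations, not part of Askalot itself.

```python
from abc import ABC, abstractmethod


class SupportMethod(ABC):
    """Hypothetical interface for a loosely coupled collaboration support method."""

    @abstractmethod
    def observe(self, event):
        """Consume one Q&A event (from a live system or a replayed dataset)."""

    @abstractmethod
    def recommend(self, question, candidate_users):
        """Return candidate users ordered by suitability for the question."""


class ActivityBasedRouter(SupportMethod):
    """Toy question router: rank users by their past answer count (illustrative only)."""

    def __init__(self):
        self.answer_counts = {}

    def observe(self, event):
        # The same code path serves online evaluation (live events)
        # and offline evaluation (events replayed from a dataset).
        if event["type"] == "answer":
            user = event["user"]
            self.answer_counts[user] = self.answer_counts.get(user, 0) + 1

    def recommend(self, question, candidate_users):
        return sorted(candidate_users,
                      key=lambda u: self.answer_counts.get(u, 0),
                      reverse=True)


def replay(method, events):
    """Offline evaluation: feed archived events (e.g. a CQA dump) to a method."""
    for event in events:
        method.observe(event)
    return method


# Offline replay on a tiny synthetic dataset.
router = replay(ActivityBasedRouter(), [
    {"type": "answer", "user": "alice"},
    {"type": "answer", "user": "alice"},
    {"type": "answer", "user": "bob"},
])
print(router.recommend({"id": 1}, ["bob", "alice", "carol"]))
# ['alice', 'bob', 'carol']
```

Because the method only depends on the event interface, swapping the evaluation mode requires no change to the method itself, which is the property the paper's experimental infrastructure aims for.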

Keywords

CQA · System design · Flexibility · Scalability · Askalot

Notes

Acknowledgement

This work was partially supported by grants No. VG 1/0646/15 and KEGA 009STU-4/2014, and it is a partial result of collaboration within the SCOPES JRP/IP No. 160480/2015. The authors wish to thank the students of the AskEd team, who contributed to the design and implementation of Askalot and made its deployment in three different settings possible.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Faculty of Informatics and Information Technologies, Slovak University of Technology in Bratislava, Bratislava, Slovakia