Social and Community Related Themes in Ontology Evaluation: Findings from an Interview Study

  • Marzieh Talebpour
  • Martin Sykora
  • Tom Jackson
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 976)

Abstract

A deep exploration of what the term “quality” implies in ontology selection and reuse takes us much further than the internal characteristics of ontologies on which the literature has mostly focused. A qualitative study based on interviews with ontologists and knowledge engineers from different domains, ranging from the biomedical field to the manufacturing industry, reveals novel social and community-related themes that have long been neglected. These themes include the responsiveness of the developer team or organization, knowing and trusting the developer team, regular updates and maintenance, and many others. This paper explores these themes, arguing that the community and social aspects of ontologies are generally linked to their quality. We believe this work represents a significant contribution to the field of ontology evaluation, and we hope that the research community can draw on these initial findings to develop relevant social quality metrics for ontology evaluation and selection.
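
The abstract lists social and community-related themes such as developer responsiveness, trust in the developer team, and regular maintenance. Purely as an illustration of how such themes might eventually be operationalized into social quality metrics, the following Python sketch combines a few hypothetical signals into a single score; every field name, threshold, and weight below is an assumption of this sketch and not a metric defined in the paper.

from dataclasses import dataclass
from datetime import date


@dataclass
class SocialSignals:
    """Hypothetical community/social signals for a candidate ontology (illustrative only)."""
    developer_known: bool          # is the developer team or organization known and trusted?
    responded_to_last_issue: bool  # did maintainers respond to the most recent query/issue?
    last_update: date              # date of the most recent release or maintenance update
    community_size: int            # rough count of known users or contributors


def social_quality_score(s: SocialSignals, today: date) -> float:
    """Combine the signals into a single 0..1 score using illustrative weights."""
    months_since_update = (today - s.last_update).days / 30.0
    freshness = max(0.0, 1.0 - months_since_update / 24.0)  # decays to zero over ~2 years
    community = min(1.0, s.community_size / 50.0)            # saturates at 50 contributors
    return round(
        0.3 * s.developer_known
        + 0.3 * s.responded_to_last_issue
        + 0.2 * freshness
        + 0.2 * community,
        3,
    )


if __name__ == "__main__":
    candidate = SocialSignals(
        developer_known=True,
        responded_to_last_issue=True,
        last_update=date(2018, 6, 1),
        community_size=25,
    )
    print(social_quality_score(candidate, date(2019, 1, 1)))  # prints a score between 0 and 1

Running the example prints a single score for the hypothetical candidate ontology; in practice, any such weighting scheme would need to be grounded in the empirical findings of the interview study.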

Keywords

Ontology quality · Social quality metrics · Ontology reuse

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Business and Economics, Loughborough University, Loughborough, UK