Ontology Evaluation through Usability Measures

An Experiment with the SUS Scale in the Legal Domain
  • Núria Casellas
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5872)


Current ontology engineering methodologies offer guidance on knowledge acquisition, ontology development (design and conceptualization), formalization, evaluation, evolution and maintenance. Nevertheless, these methodologies describe the involvement of domain experts in ontology validation only vaguely. Usability evaluation methods tailored to ontology evaluation could support the establishment of quality measurements and aid the assessment of modelling decisions prior to ontology implementation. This paper describes the experimental evaluation of a legal ontology, the Ontology of Professional Judicial Knowledge (OPJK), with the SUS (System Usability Scale) questionnaire, a usability evaluation questionnaire adapted to ontology evaluation.
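For readers unfamiliar with SUS, the standard scoring procedure turns ten 1–5 Likert responses into a single 0–100 score: odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus their response, and the summed contributions are scaled by 2.5. A minimal sketch of this rule (the function name is illustrative, not from the paper):

```python
def sus_score(responses):
    """Compute the standard SUS score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute response - 1;
    even-numbered items (negatively worded) contribute 5 - response.
    The summed contributions (0-40) are then scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
                for i, r in enumerate(responses))
    return total * 2.5
```

For example, a respondent who strongly agrees (5) with every positive item and strongly disagrees (1) with every negative item yields the maximum score of 100.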




Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Núria Casellas
  1. Institute of Law and Technology (IDT-UAB), Universitat Autònoma de Barcelona, Bellaterra, Spain
