Web Projects Evaluation Using the Method of Significant Website Assessment Criteria Detection

  • Paweł Ziemba
  • Jarosław Jankowski
  • Jarosław Wątróbski
  • Mateusz Piwowarski
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9655)

Abstract

The research presented in this article examines the applicability of feature selection methods to the task of selecting website assessment criteria and assigning weights to them. The chosen methods were evaluated against the approach in which the weights of website assessment criteria are defined directly by users. The article presents a procedure for selecting significant assessment criteria and for revealing undisclosed user preferences on the basis of website quality assessment models. The revealed preferences were then verified by comparing them with the preferences declared by website users.
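To make the general idea concrete, the sketch below illustrates one way such a procedure could look: per-criterion assessment scores are ranked with a filter-type feature selection measure, the relevance scores are normalized into criteria weights ("revealed" preferences), and these weights are compared with weights declared directly by users. This is a minimal sketch under stated assumptions, not the authors' implementation: the criterion names and data are hypothetical, and mutual information is used here merely as a stand-in for the feature selection measures cited in the references (e.g., RELIEF or symmetrical uncertainty).

```python
# Minimal sketch (hypothetical data, illustrative method choice) of revealing
# criteria weights from assessment data and comparing them with declared weights.
import numpy as np
from scipy.stats import spearmanr
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)

# Hypothetical website assessment criteria.
criteria = ["usability", "functionality", "reliability", "efficiency", "content"]

# Hypothetical survey data: per-criterion scores (1-7 scale) for 200 website
# assessments, plus an overall quality class (1-5) assigned to each assessment.
X = rng.integers(1, 8, size=(200, len(criteria)))
y = rng.integers(1, 6, size=200)

# Filter-type relevance of each criterion with respect to the overall quality class
# (mutual information used as a stand-in for other filter measures).
relevance = mutual_info_classif(X, y, discrete_features=True, random_state=0)

# Normalize relevance scores into criteria weights ("revealed" preferences).
if relevance.sum() > 0:
    revealed_weights = relevance / relevance.sum()
else:
    revealed_weights = np.full(len(criteria), 1.0 / len(criteria))

# Hypothetical weights declared directly by users, for comparison.
declared_weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])

# Compare the revealed and declared weightings by rank correlation.
rho, _ = spearmanr(revealed_weights, declared_weights)
for name, w_rev, w_dec in zip(criteria, revealed_weights, declared_weights):
    print(f"{name:13s} revealed={w_rev:.3f} declared={w_dec:.3f}")
print(f"Spearman rank correlation between revealed and declared weights: {rho:.3f}")
```

On real data, the random inputs would be replaced by users' per-criterion ratings and overall quality judgments, and the rank correlation (or a similar agreement measure) would indicate how closely the preferences revealed by the feature selection procedure match those declared by the users.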

Keywords

Website evaluation quality · User experience · Feature selection

References

  1. Kim, S., Stoel, L.: Dimensional hierarchy of retail website quality. Inf. Manag. 41, 619–633 (2004)
  2. Jankowski, J.: Analysis of multiplayer platform users activity based on the virtual and real time dimension. In: Datta, A., Shulman, S., Zheng, B., Lin, S.-D., Sun, A., Lim, E.-P. (eds.) SocInfo 2011. LNCS, vol. 6984, pp. 312–315. Springer, Heidelberg (2011)
  3. Chiou, W.C., Lin, C.C., Perng, C.: A strategic framework for website evaluation based on a review of the literature from 1995–2006. Inf. Manag. 47, 282–290 (2010)
  4. Grigoroudis, E., Litos, C., Moustakis, V.A., Politis, Y., Tsironis, L.: The assessment of user-perceived web quality: application of a satisfaction benchmarking approach. Eur. J. Oper. Res. 187, 1346–1357 (2008)
  5. Barnes, S.J., Vidgen, R.: The eQual approach to the assessment of e-commerce quality: a longitudinal study of internet bookstores. In: Suh, W. (ed.) Web Engineering: Principles and Techniques, pp. 161–181. Idea Group Publishing, Hershey (2005)
  6. Ahn, T., Ryu, S., Han, I.: The impact of Web quality and playfulness on user acceptance of online retailing. Inf. Manag. 44, 263–275 (2007)
  7. Webb, H.W., Webb, L.A.: SiteQual: an integrated measure of Web site quality. J. Enterp. Inf. Manag. 17, 430–440 (2004)
  8. Yang, Z., Cai, S., Zhou, Z., Zhou, N.: Development and validation of an instrument to measure user perceived service quality of information presenting Web portals. Inf. Manag. 42, 575–589 (2005)
  9. Elling, S., Lentz, L., de Jong, M., van den Bergh, H.: Measuring the quality of governmental websites in a controlled versus an online setting with the ‘Website Evaluation Questionnaire’. Gov. Inf. Quart. 29, 383–393 (2012)
  10. Holzinger, A.: Usability engineering methods for software developers. Commun. ACM 48, 71–74 (2005)
  11. Jankowski, J.: Integration of collective knowledge in fuzzy models supporting Web design process. In: Jędrzejowicz, P., Nguyen, N.T., Hoang, K. (eds.) ICCCI 2011, Part II. LNCS, vol. 6923, pp. 395–404. Springer, Heidelberg (2011)
  12. Ziemba, P., Piwowarski, M., Jankowski, J., Wątróbski, J.: Method of criteria selection and weights calculation in the process of Web projects evaluation. In: Hwang, D., Jung, J.J., Nguyen, N.-T. (eds.) ICCCI 2014. LNCS, vol. 8733, pp. 684–693. Springer, Heidelberg (2014)
  13. Chou, W.C., Cheng, Y.: A hybrid fuzzy MCDM approach for evaluating website quality of professional accounting firms. Expert Syst. Appl. 39, 2783–2793 (2012)
  14. ISO/IEC 25010:2010(E): Systems and software engineering – Systems and software Quality Requirements and Evaluation (SQuaRE) – System and software quality models
  15. Sørum, H., Andersen, K.N., Clemmensen, T.: Website quality in government: exploring the webmaster’s perception and explanation of website quality. Transforming Gov. People Process Policy 7, 322–341 (2013)
  16. Kaya, T.: Multi-attribute evaluation of website quality in e-business using an integrated fuzzy AHP-TOPSIS methodology. Int. J. Comput. Intell. Syst. 3, 301–314 (2010)
  17. Albert, B., Tullis, T., Tedesco, D.: Beyond the Usability Lab: Conducting Large-Scale Online User Experience Studies. Morgan Kaufmann, Burlington (2010)
  18. Rubin, J., Chisnell, D.: Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, 2nd edn. Wiley, Indianapolis (2008)
  19. Nielsen, J.: Usability Engineering. Morgan Kaufmann, San Francisco (1993)
  20. Nielsen, J.: Usability 101: Introduction to Usability. Jakob Nielsen’s Alertbox, 4 January 2012. http://www.nngroup.com/articles/usability-101-introduction-to-usability/
  21. ISO 9126-1:2001(E): Software engineering – Product quality – Part 1: Quality model
  22. Hasan, L., Abuelrub, E.: Assessing the quality of web sites. Appl. Comput. Inform. 9, 11–29 (2011)
  23. Yang, Z., Cai, S., Zhou, Z., Zhou, N.: Development and validation of an instrument to measure user perceived service quality of information presenting Web portals. Inf. Manag. 42, 575–589 (2005)
  24. Chmielarz, W.: Quality assessment of selected bookselling websites. Pol. J. Manag. Stud. 1, 127–146 (2010)
  25. Lin, H.F.: An application of fuzzy AHP for evaluating course website quality. Comput. Educ. 54, 877–888 (2010)
  26. Ho, C., Lee, Y.: The development of an e-travel service quality scale. Tour. Manag. 28, 1434–1449 (2007)
  27. Ou, C.X., Sia, C.L.: Consumer trust and distrust: an issue of website design. Int. J. Hum. Comput. Stud. 68, 913–934 (2010)
  28. Hwang, J., Yoon, Y.S., Park, N.H.: Structural effects of cognitive and affective responses to web advertisements, website and brand attitudes, and purchase intentions: the case of casual-dining restaurants. Int. J. Hospitality Manag. 30, 897–907 (2011)
  29. Yang, Q., Shao, J., Scholz, M., Plant, C.: Feature selection methods for characterizing and classifying adaptive Sustainable Flood Retention Basins. Water Res. 45, 993–1004 (2011)
  30. Zenebe, A., Zhou, L., Norcio, A.F.: User preferences discovery using fuzzy models. Fuzzy Sets Syst. 161, 3044–3063 (2010)
  31. Ziemba, P., Piwowarski, M.: Procedure for selecting significant website quality evaluation criteria based on feature selection methods. Stud. Proc. Pol. Assoc. Knowl. Manag. 67, 119–133 (2013)
  32. Ziemba, P., Piwowarski, M.: Procedure of reducing website assessment criteria and user preference analyses. Found. Comput. Decis. Sci. 36(3–4), 315–325 (2011)
  33. Chizi, B., Maimon, O.: Dimension reduction and feature selection. In: Maimon, O., Rokach, L. (eds.) Data Mining and Knowledge Discovery Handbook, pp. 83–100. Springer, New York (2010)
  34. Guyon, I.: Practical feature selection: from correlation to causality. In: Fogelman-Soulié, F., Perrotta, D., Piskorski, J., Steinberger, R. (eds.) Mining Massive Data Sets for Security: Advances in Data Mining, Search, Social Networks and Text Mining, and Their Applications to Security, pp. 27–43. IOS Press, Amsterdam (2008)
  35. Hand, D., Mannila, H., Smyth, P.: Eksploracja danych (Polish edition of Principles of Data Mining), pp. 414–416. WNT, Warszawa (2005)
  36. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques, pp. 288–295. Morgan Kaufmann, San Francisco (2005)
  37. Hall, M.A., Holmes, G.: Benchmarking attribute selection techniques for discrete class data mining. IEEE Trans. Knowl. Data Eng. 15, 1437–1447 (2003)
  38. Fu, H., Xiao, Z., Dellandréa, E., Dou, W., Chen, L.: Image categorization using ESFS: a new embedded feature selection method based on SFS. In: Blanc-Talon, J., Philips, W., Popescu, D., Scheunders, P. (eds.) ACIVS 2009. LNCS, vol. 5807, pp. 288–299. Springer, Heidelberg (2009)
  39. Hsu, H.H., Hsieh, C.W., Lu, M.D.: Hybrid feature selection by combining filters and wrappers. Expert Syst. Appl. 38, 8144–8150 (2011)
  40. Chang, C.C.: Generalized iterative RELIEF for supervised distance metric learning. Pattern Recogn. 43, 2971–2981 (2010)
  41. Kononenko, I., Hong, S.J.: Attribute selection for modelling. Future Gener. Comput. Syst. 13, 181–195 (1997)
  42. Liu, H., Yu, L., Motoda, H.: Feature extraction, selection, and construction. In: Ye, N. (ed.) The Handbook of Data Mining, pp. 409–424. Lawrence Erlbaum Associates, Mahwah (2003)
  43. Ahmad, A., Dey, L.: A feature selection technique for classificatory analysis. Pattern Recogn. Lett. 26, 43–56 (2005)
  44. Yu, L., Liu, H.: Feature selection for high-dimensional data: a fast correlation-based filter solution. In: Proceedings of the 20th International Conference on Machine Learning (ICML 2003), pp. 856–863 (2003)
  45. Hall, M.A.: Correlation-based feature selection for discrete and numeric class machine learning. In: Proceedings of the 17th International Conference on Machine Learning (ICML 2000), pp. 359–366 (2000)
  46. Hellwig, Z.: On the optimal choice of predictors. In: Gostkowski, Z. (ed.) Toward a System of Quantitative Indicators of Components of Human Resources Development, Study VI. UNESCO, Paris (1968)
  47. Senthamarai Kannan, S., Ramaraj, N.: A novel hybrid feature selection via Symmetrical Uncertainty ranking based local memetic search algorithm. Knowl.-Based Syst. 23, 580–585 (2010)
  48. Rokach, L., Maimon, O.: Classification trees. In: Maimon, O., Rokach, L. (eds.) Data Mining and Knowledge Discovery Handbook, 2nd edn., pp. 149–174. Springer, New York (2010)
  49. Webb, G.I.: Association rules. In: Ye, N. (ed.) The Handbook of Data Mining, pp. 25–40. Lawrence Erlbaum Associates, Mahwah (2003)
  50. Rokach, L., Maimon, O.: Supervised learning. In: Maimon, O., Rokach, L. (eds.) Data Mining and Knowledge Discovery Handbook, 2nd edn., pp. 133–148. Springer, New York (2010)
  51. Ben-David, A.: Comparison of classification accuracy using Cohen’s Weighted Kappa. Expert Syst. Appl. 34, 825–832 (2008)
  52. Küchenhoff, H., Augustin, T., Kunz, A.: Partially identified prevalence estimation under misclassification using the kappa coefficient. Int. J. Approximate Reasoning 53, 1168–1182 (2012)
  53. Pham-Gia, T., Hung, T.L.: The mean and median absolute deviations. Math. Comput. Model. 34, 921–936 (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • Paweł Ziemba (1)
  • Jarosław Jankowski (2)
  • Jarosław Wątróbski (2)
  • Mateusz Piwowarski (2)

  1. The Jacob of Paradyż University of Applied Sciences in Gorzów Wielkopolski, Gorzów Wielkopolski, Poland
  2. Faculty of Computer Science and Information Technology, West Pomeranian University of Technology, Szczecin, Szczecin, Poland