Knowledge and Data Processing in a Process of Website Quality Evaluation

  • Chapter
New Challenges in Computational Collective Intelligence

Part of the book series: Studies in Computational Intelligence (SCI, volume 244)

Abstract

Diverse data appear in every branch of computer science, statistics, and mathematics, as well as in fields often treated as soft techniques, such as human-computer interaction, usability evaluation, and ergonomics. With standard quality evaluation methods, the results obtained empirically or analytically are not systematized, which increases the cost of producing output. Website quality tests are usually characterized by repeatable operations, similar to the tasks defined for offline tests with participants. However, every person being evaluated is different, and his or her answers, actions, and behavior during user experience tests are hard to predict. The analyst's role is to select and classify properly transcribed answers of such users and enter them into a knowledge base system. We believe that it is possible to standardize and translate all data given as input and received as output during such tests into one consistent model.
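
As an illustration of what such a consistent model could look like, the minimal Python sketch below normalises both empirical and analytical test results into a single record type; the class and field names are illustrative assumptions, not taken from the chapter itself.

    # Minimal sketch of one consistent record type for usability-test data.
    # All names (TestObservation and its fields) are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class TestObservation:
        participant_id: str                  # anonymised participant identifier
        task_id: str                         # repeatable task defined for the test
        method: str                          # e.g. "eye-tracking", "click-map", "interview"
        answer: Optional[str] = None         # transcribed answer, if the method collects one
        metrics: dict = field(default_factory=dict)  # e.g. {"time_on_task_s": 37.5}

    # Results obtained empirically (with participants) or analytically can be
    # normalised into the same structure before entering a knowledge base.
    obs = TestObservation(
        participant_id="P01",
        task_id="find-contact-page",
        method="eye-tracking",
        metrics={"time_on_task_s": 37.5, "fixations": 112},
    )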

Some computational intelligence methods may effectively increase the productivity of knowledge processing based on previously collected data, e.g. by feeding results back into subsequent website analyses. To this end, we take a closer look at uncertainty management and frames, and finally propose the idea of an expert system enhanced by data mining and an inference mechanism that makes it possible to decide about website quality.
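
The frames and the inference step are only outlined in the abstract; the sketch below, under the assumption that frames can be modelled as simple slot-value structures, shows one hypothetical rule filling a quality slot from aggregated test data. The slot names, thresholds, and rule are illustrative only, not the authors' actual mechanism.

    # Sketch of a frame as a slot-value structure with one hypothetical
    # inference rule; slot names and thresholds are illustrative assumptions.
    website_frame = {
        "name": "example-site",
        "task_success_rate": 0.82,   # aggregated from test observations
        "wcag_violations": 3,        # e.g. from a WCAG 2.0 check
        "quality": None,             # slot to be filled by inference
    }

    def infer_quality(frame: dict) -> dict:
        """Fill the 'quality' slot from the other slots (hypothetical rule)."""
        if frame["task_success_rate"] >= 0.8 and frame["wcag_violations"] <= 5:
            frame["quality"] = "acceptable"
        else:
            frame["quality"] = "needs improvement"
        return frame

    print(infer_quality(website_frame)["quality"])  # -> "acceptable"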

In this paper we describe the quality of a website interface, selected methods of website usability evaluation, the data obtained during the evaluation process, and prototype frames for organizing these data for further use in an expert system.

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Sobecki, J., Żatuchin, D. (2009). Knowledge and Data Processing in a Process of Website Quality Evaluation. In: Nguyen, N.T., Katarzyniak, R.P., Janiak, A. (eds) New Challenges in Computational Collective Intelligence. Studies in Computational Intelligence, vol 244. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03958-4_5

  • DOI: https://doi.org/10.1007/978-3-642-03958-4_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-03957-7

  • Online ISBN: 978-3-642-03958-4

  • eBook Packages: Engineering, Engineering (R0)
