To Trust or Not to Trust

Six Recommendations for System Feedback in a Dynamic Environment
  • Alexander G. Mirnig
  • Sandra Troesterer
  • Elke Beck
  • Manfred Tscheligi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8742)


In today’s rapidly developing Internet, the web sites and services end users see are increasingly composed of multiple services from many different providers, assembled in a dynamic way. This makes it difficult for users to single out individual web services or service providers and, consequently, to judge how much to trust them. The question, then, is how to communicate indicators of trustworthiness and provide adequate security feedback to the user in such a situation. Contemporary literature on trust design and security feedback focuses mostly on static web services and is therefore only partially applicable to dynamic composite web services. We conducted two consecutive studies (one qualitative, one quantitative) to answer the questions of how and when security feedback should be provided in dynamic web service environments and how it influences the user’s trust in the system. We then analyzed the findings with regard to Riegelsberger and Sasse’s ten principles for trust design [24]. The outcome we present in this paper is an adapted list of trust principles for dynamic systems.


Keywords: trust · automation · dynamic web services · feedback design


References

  1. Ackerman, M.S., Cranor, L.F., Reagle Jr., J.: Privacy in E-Commerce: Examining User Scenarios and Privacy Preferences. In: Proc. 1st ACM Conf. on Electronic Commerce, pp. 1–8. ACM (1999)
  2. Belanche, D., Casaló, L.V., Guinalíu, M.: How to make online public services trustworthy. Electronic Government, an International Journal 9(3), 291–308 (2012)
  3. Berendt, B., Günther, O., Spiekermann, S.: Privacy in e-commerce: stated preferences vs. actual behavior. Commun. ACM 48(4), 101–106 (2005)
  4. Bravo-Lillo, C., Cranor, L.F., Downs, J., Komanduri, S., Sleeper, M.: Improving computer security dialogs. In: Campos, P., Graham, N., Jorge, J., Nunes, N., Palanque, P., Winckler, M. (eds.) INTERACT 2011, Part IV. LNCS, vol. 6949, pp. 18–35. Springer, Heidelberg (2011)
  5. Cooper, A., Reimann, R., Cronin, D.: About Face 3: The Essentials of Interaction Design, pp. 75–108. John Wiley & Sons, Inc., New York (2007)
  6. Corritore, C.L., Kracher, B., Wiedenbeck, S.: On-line trust: concepts, evolving themes, a model. Int. Journal of Human-Computer Studies 58(6), 737–758 (2003)
  7. Diller, S., Lin, L., Tashjian, V.: The human-computer interaction handbook, pp. 1213–1225. L. Erlbaum Associates Inc., Hillsdale (2003)
  8. Dzindolet, M., Peterson, S., Pomranky, R., Pierce, L., Beck, H.: The role of trust in automation reliance. Int. J. Hum.-Comput. Stud. 58(6), 697–718 (2003)
  9. Egelman, S., Cranor, L.F., Hong, J.: You’ve been warned: an empirical study of the effectiveness of web browser phishing warnings. In: Proc. SIGCHI Conf. on Human Factors in Computing Systems, CHI 2008, pp. 1065–1074. ACM, New York (2008)
  10. Elahi, G., Yu, E.: A Goal Oriented Approach for Modeling and Analyzing Security Trade-Offs. In: Parent, C., Schewe, K.-D., Storey, V.C., Thalheim, B. (eds.) ER 2007. LNCS, vol. 4801, pp. 375–390. Springer, Heidelberg (2007)
  11. Flavián, C., Guinalíu, M., Gurrea, R.: The role played by perceived usability, satisfaction and consumer trust on website loyalty. Information & Management 43(1), 1–14 (2006)
  12. Friedman, B., Khan Jr., P.H., Howe, D.C.: Trust online. Commun. ACM 43(12), 34–40 (2000)
  13. Glass, A., McGuinness, D.L., Wolverton, M.: Toward establishing trust in adaptive agents. In: Proc. 13th Int. Conf. on Intelligent User Interfaces, IUI 2008, pp. 227–236. ACM, New York (2008)
  14. Hoff, K., Bashir, M.: A theoretical model for trust in automated systems. In: CHI 2013 Extended Abstracts on Human Factors in Computing Systems, CHI EA 2013, pp. 115–120. ACM, New York (2013)
  15. Johnston, J., Eloff, J., Labuschagne, L.: Security and human computer interfaces. Computers & Security 22(8), 675–684 (2003)
  16. Kumaraguru, P., Cranor, L.F.: Privacy indexes: A survey of Westin’s studies. ISRI Technical Report (2005)
  17. Maurer, M.E., De Luca, A., Kempe, S.: Using data type based security alert dialogs to raise online security awareness. In: Proc. SOUPS 2011, pp. 2:1–2:13. ACM, New York (2011)
  18. Master, R., Jiang, X., Khasawneh, M.T., Bowling, S.R., Grimes, L., Gramopadhye, A.K., Melloy, B.J.: Measurement of trust over time in hybrid inspection systems. Hum. Factors Ergon. Manuf. 15(2), 177–196 (2005)
  19. McKnight, D.H., Carter, M., Thatcher, J.B., Clay, P.F.: Trust in a specific technology: An investigation of its components and measures. ACM Trans. Manage. Inf. Syst. 2(2), 12:1–12:25 (2011)
  20. Pak, R., Fink, N., Price, M., Bass, B., Sturre, L.: Decision support aids with anthropomorphic characteristics influence trust and performance in younger and older adults. Ergonomics 55(9), 1059–1072 (2012)
  21. Patrick, A.S., Briggs, P., Marsh, S.: Designing systems that people will trust. In: Security and Usability. O’Reilly Media, Inc. (2005)
  22. Pavlou, P.A.: Consumer acceptance of electronic commerce: Integrating trust and risk with the technology acceptance model. Int. J. Electron. Commerce 7(3), 101–134 (2003)
  23. Raja, F., Hawkey, K., Hsu, S., Wang, K.L.C., Beznosov, K.: A brick wall, a locked door, and a bandit: a physical security metaphor for firewall warnings. In: Proc. SOUPS 2011, pp. 1:1–1:20. ACM, New York (2011)
  24. Riegelsberger, J., Sasse, M.A.: Ignore these at your peril: Ten principles for trust design. In: 3rd International Conference on Trust and Trustworthy Computing, TRUST 2010 (2010)
  25. Riegelsberger, J., Sasse, M.A., McCarthy, J.: The mechanics of trust: A framework for research and design. International Journal of Human-Computer Studies 62(3), 381–422 (2005)
  26. Riegelsberger, J., Sasse, M.A., McCarthy, J.: The researcher’s dilemma: evaluating trust in computer-mediated communication. International Journal of Human-Computer Studies 58(6), 759–781 (2003)
  27. Sheehan, K.B.: Toward a typology of internet users and online privacy concerns. The Information Society, 21–32 (2002)
  28. Sinha, R., Swearingen, K.: The role of transparency in recommender systems. In: CHI 2002 Extended Abstracts on Human Factors in Computing Systems, CHI EA 2002, pp. 830–831. ACM, New York (2002)
  29. Stoll, J., Tashman, C.S., Edwards, W.K., Spafford, K.: Sesame: informing user security decisions with system visualization. In: Proc. SIGCHI Conf. on Human Factors in Computing Systems, CHI 2008, pp. 1045–1054. ACM, New York (2008)
  30. Raja, F., Hawkey, K., Hsu, S., Wang, K.L.C., Beznosov, K.: No choice, no trust? In: A Turn for the Worse: Trustbusters for User Interfaces, Workshop at SOUPS 2013 (2013)
  31. Wang, L., Jamieson, G.A., Hollands, J.G.: Trust and reliance on an automated combat identification system. Human Factors: The Journal of the Human Factors and Ergonomics Society 51(3), 281–291 (2009)

Copyright information

© IFIP International Federation for Information Processing 2014

Authors and Affiliations

  • Alexander G. Mirnig (1)
  • Sandra Troesterer (1)
  • Elke Beck (1)
  • Manfred Tscheligi (1)

  1. University of Salzburg, Salzburg, Austria