Business & Information Systems Engineering, Volume 58, Issue 6, pp 433–436

Digital Nudging

  • Markus Weinmann
  • Christoph Schneider
  • Jan vom Brocke
Open Access
Catchword

Keywords

Nudging · Information systems design · Human–computer interaction · Online choice architecture

1 Digital Nudging – Guiding Judgment and Decision-Making in Digital Choice Environments

Digital nudging is the use of user-interface design elements to guide people’s behavior in digital choice environments. Digital choice environments are user interfaces – such as web-based forms and ERP screens – that require people to make judgments or decisions. Humans face choices every day, but the outcome of any choice is shaped not only by rational deliberation over the available options but also by the design of the choice environment in which the information is presented, which can exert a subconscious influence on the outcome. In other words, “what is chosen often depends upon how the choice is presented” (Johnson et al. 2012, p. 488) such that the “choice architecture alters people’s behavior in a predictable way” (Thaler and Sunstein 2008, p. 6). Even simple modifications of the choice environment in which options are presented can influence people’s choices and “nudge” them into behaving in particular ways. In fact, there is no neutral way to present choices. For example, Johnson and Goldstein (2003) showed that simply changing default options (from opt-in to opt-out) in the context of organ donation nearly doubled the percentage of people who consented to being organ donors.

Many choices are made in online environments. Because the design of digital choice environments always influences people’s choices – whether deliberately or accidentally – understanding the effects of digital nudges in these environments can help designers lead users toward the most desirable choice. For example, the mobile payment app Square nudges people into giving tips by setting “tipping” as the default, so that customers must actively select a “no tipping” option if they choose not to tip. This simple nudge has raised tip amounts, especially in settings where little or no tipping has been common (Carr 2013). Both the organ-donation and the tipping example show that simply changing the default option affects the outcome.
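To make the role of defaults concrete, the following TypeScript sketch renders a simplified tipping dialog in which one tip option is preselected, so that not tipping requires an active choice. The function, option labels, and amounts are illustrative assumptions made for this article, not Square’s actual interface code.

```typescript
// Minimal sketch of a default nudge in a tipping dialog (illustrative only).
interface TipOption {
  label: string;
  amount: number; // tip amount in the purchase currency
}

// Renders radio buttons for the tip options and preselects one of them;
// the preselected option is the "default" that nudges the customer.
function renderTipDialog(options: TipOption[], defaultIndex: number): string {
  const buttons = options
    .map((opt, i) => {
      const checked = i === defaultIndex ? " checked" : "";
      return `<label><input type="radio" name="tip" value="${opt.amount}"${checked}> ${opt.label}</label>`;
    })
    .join("\n");
  return `<form id="tip-form">\n${buttons}\n</form>`;
}

// "No tip" is available, but a 20% tip is the preselected default.
console.log(
  renderTipDialog(
    [
      { label: "No tip", amount: 0 },
      { label: "15%", amount: 1.5 },
      { label: "20%", amount: 2.0 },
    ],
    2
  )
);
```

Reversing the default (preselecting “No tip”) would use exactly the same interface elements but, according to the findings cited above, would likely produce very different tipping behavior.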

2 Relevance

The increasing use of digital technologies in large areas of our private and professional lives means that people frequently make important decisions within digital choice environments. Most, if not all, online interactions – ranging from e-government to e-commerce interactions – require people to make choices.

User interfaces such as Web sites and mobile apps frequently include digital choice environments; likewise, interfaces of organizational information systems such as ERP and CRM systems are digital choice environments that predefine or influence decisions through the way the system organizes and presents workflows. Since there is no neutral way to present choices, all user-interface design decisions influence users’ behavior (Mandel and Johnson 2002; Sunstein 2015), often regardless of the designers’ intent. A design that accidentally influences people’s choices may lead to unintended consequences; designers therefore need to understand the effects of their designs on users’ choices so they can decide whether to nudge users deliberately or to minimize the design’s influence and thereby preserve users’ freedom of choice.
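As a sketch of the second option – reducing the design’s influence rather than nudging deliberately – a designer might, for instance, randomize the order in which otherwise equivalent options are listed so that no option benefits from its screen position by accident. The following TypeScript fragment is a minimal illustration of this idea; the shuffle helper and the plan names are assumptions made for this example.

```typescript
// Sketch: randomizing option order to avoid an unintended ordering nudge.
// Fisher–Yates shuffle; returns a new array and leaves the input untouched.
function shuffle<T>(items: T[]): T[] {
  const copy = [...items];
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy;
}

// Present equivalent plans in random order, so the first position does not
// systematically favor one of them.
const plans = ["Basic plan", "Standard plan", "Premium plan"];
console.log(shuffle(plans));
```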

A key consideration when making such design decisions is the ethical implications of using nudges. While nudges should be used to help people make better choices (Thaler and Sunstein 2008), this is not always the case in practice. For example, some European low-cost air carriers present choices of non-essential options in a way that nudges customers toward purchasing these options. While these unethical nudges may lead to short-term gains for the company, they may have long-term repercussions in terms of loss of goodwill, negative publicity, or even legal action. Therefore, designers must be aware of the ethical implications of nudges (see Sunstein 2015 for a discussion of nudging ethics).

3 Current Status

Research on nudging has been conducted primarily in offline contexts. Whereas traditional economic theory assumes that human behavior is rational, nudging works because people do not always behave rationally. In particular, research in psychology has demonstrated that, because of their cognitive limitations, people act in boundedly rational ways (Simon 1955), and various heuristics and biases influence their decision-making (Tversky and Kahneman 1974). Heuristics, commonly defined as simple “rules of thumb” (Hutchinson and Gigerenzer 2005, p. 98) that people use to ease their cognitive load when making judgments or decisions, can influence decision-making positively or negatively: they can help with simple, recurrent decisions by reducing the amount of information to be processed so that people can focus on the differentiating factors (Evans 2006), thereby reducing mental effort (Evans 2008). On the other hand, heuristic thinking can result in cognitive biases and introduce systematic errors into complex judgments or decisions (Tversky and Kahneman 1974) that require effortful thinking (Evans 2006). In such situations, common heuristics – such as the anchoring and adjustment heuristic (e.g., relying on default values), the availability heuristic (e.g., being influenced by the vividness of events), and the representativeness heuristic (i.e., relying on stereotypes) (Tversky and Kahneman 1974) – affect the evaluation of alternatives, often leading to suboptimal decisions.

Nudges attempt either to counter or to encourage the use of heuristics by altering the choice environment to change people’s behavior. Commonly used nudges include giving incentives, providing feedback or anchors, and setting defaults (Dolan et al. 2012; Johnson et al. 2012; Michie et al. 2013; Thaler and Sunstein 2008; see Table 1).
Table 1 Selection of nudge principles, descriptions, and examples (based on Thaler et al. 2010)

Nudge principle | Description | Example
Incentive | Making incentives more salient to increase their effectiveness | Telephones that are programmed to display the running cost of phone calls
Understanding mapping | Mapping information that is difficult to evaluate onto familiar evaluation schemes | Advertising a digital camera by mapping megapixels to maximum printable size rather than stating megapixels alone
Defaults | Preselecting options by setting default options | Changing defaults (from opt-in to opt-out) to increase the percentage of people who consent to being organ donors
Giving feedback | Providing users with feedback when they are doing well or making mistakes | Electronic road signs that display smiling or sad faces depending on the vehicle’s speed
Expecting error | Expecting users to make errors and being as forgiving as possible | ATMs that require people to retrieve their card before they receive their money, so they do not forget the card
Structure complex choices | Listing all the attributes of all the alternatives and letting people make trade-offs when necessary | Online product configuration systems that make choices simpler by guiding users through the purchase process

In various situations, the designers of the choice environments (sometimes referred to as “choice architects”; Thaler et al. 2010, p. 1) attempt to influence people’s choices. For example, many organizations encourage people to engage in socially responsible behaviors, such as leading a healthy life (e-health; e.g., the Fitbit provides feedback on physical activity), reducing waste or energy consumption (Green IS; e.g., Nest thermostats provide feedback on energy consumption), and planning for retirement (e-finance; e.g., governments set defaults on retirement options). Likewise, many non-governmental organizations attempt to encourage people to donate funds, participate in charitable activities, or vote for particular outcomes. In an e-commerce context, Web sites often use opt-in or opt-out mechanisms to nudge users into signing up for newsletters.
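To illustrate how such a feedback nudge might be expressed in code, the sketch below maps a measured value to an evaluative message, in the spirit of the road-sign example in Table 1 and the activity feedback mentioned above; the function name, thresholds, and messages are hypothetical.

```typescript
// Sketch of a feedback nudge: translate a measured speed into evaluative
// feedback, similar to road signs showing smiling or sad faces.
// Thresholds and messages are hypothetical.
function speedFeedback(speedKmh: number, limitKmh: number): string {
  return speedKmh <= limitKmh
    ? "🙂 Thank you for keeping to the limit"
    : "🙁 Please slow down";
}

console.log(speedFeedback(48, 50)); // within the limit -> positive feedback
console.log(speedFeedback(62, 50)); // above the limit  -> corrective feedback
```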

4 Applications of Digital Nudging and Future Trends

4.1 Applications

By definition, digital nudging focuses on guiding the behavior of individuals, but its effects can extend to the organizational or societal level (see Table 2).
Table 2 Example applications of digital nudging and their effects

Use case/IS field | Nudging example/behavior-change intervention | Effect on organizational or societal level
Business process management | Structuring complex input screens | Organizational
E-business and e-commerce | Displaying limited room inventory during a hotel-booking process | Organizational
E-finance and insurance | Setting defaults for frequently selected insurance plan options | Societal
E-government | Setting defaults to opt in for organ donation | Societal
E-health | Step-counter app that provides feedback on activity levels | Societal
E-learning | Reminders to learners to engage with course content | Organizational and/or societal
Green IS | Smart meters to encourage energy savings | Societal
Security and privacy | Displaying the strength of selected passwords | Organizational and/or societal
Social media | Giving incentives, such as badges, for sharing or other activities | Societal

While digital nudging, as described in this article, focuses on people’s choices in digital choice environments, the concept can be applied beyond this context, as nudges in digital environments are increasingly used to influence real-world behavior. One example is the Fitbit activity tracker, which uses digital nudges (e.g., reminders to exercise, feedback on activity, friends’ statistics) to encourage people to increase their activity levels.

4.2 Future Trends

Digital nudging will have a significant impact on future information systems research and practice, particularly on design-oriented information systems research. As user interfaces will always steer people in certain directions (depending on how information is presented), information systems designers must understand the behavioral effects of interface design elements so that digital nudging does not happen at random and unintended effects do not occur. Research on digital nudging is likely to evolve into an important area of design science research, as knowledge about the behavioral effects of interface-design decisions will provide valuable guidance for improved interface design. New design theories may evolve that extend knowledge from psychology and behavioral economics to digital choice environments. As research on digital nudging is still in its early stages, clarification of the theoretical mechanisms that underlie digital nudging is needed, as is the development of theoretically based design recommendations to inform research on persuasive technology (Fogg 2003), particularly the design of persuasive systems (Oinas-Kukkonen and Harjumaa 2009) such as behavior-change support systems (Oinas-Kukkonen 2010).

Because of the ubiquitous digitalization of our private and professional lives, digital nudging will soon extend to other application areas as people will use digital devices to make decisions in more situations and sectors, and the devices themselves will diversify in form and function. New devices will emerge with new interaction and interface design elements, such as kinetics, virtual reality, and holograms, and designers will need to understand the potential behavioral effects of these new technologies on people’s judgment and decision-making.

We encourage our fellow scholars to engage in research on digital nudging, a fascinating area of information systems research that bears considerable potential for both research and society.

5 Further Reading

We suggest the following books for further reading on offline nudging and the underlying mechanisms: Kahneman 2011; Thaler and Sunstein 2008.

References

  1. Carr A (2013) How Square Register’s UI guilts you into leaving tips. http://www.fastcodesign.com/3022182/innovation-by-design/how-square-registers-ui-guilts-you-into-leaving-tips. Accessed 08 Aug 2016
  2. Dolan P, Hallsworth M, Halpern D, King D, Metcalfe R, Vlaev I (2012) Influencing behaviour: the mindspace way. J Econ Psychol 33(1):264–277
  3. Evans JSBT (2006) The heuristic-analytic theory of reasoning: extension and evaluation. Psychon Bull Rev 13(3):378–395
  4. Evans JSBT (2008) Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol 59(1):255–278
  5. Fogg BJ (2003) Persuasive technology: using computers to change what we think and do. Elsevier, Oxford
  6. Hutchinson JMC, Gigerenzer G (2005) Simple heuristics and rules of thumb: where psychologists and behavioural biologists might meet. Behav Process 69(2):97–124
  7. Johnson EJ, Goldstein D (2003) Do defaults save lives? Science 302(5649):1338–1339
  8. Johnson EJ, Shu SB, Dellaert BGC, Fox C, Goldstein DG, Häubl G, Larrick RP, Payne JW, Peters E, Schkade D, Wansink B, Weber EU (2012) Beyond nudges: tools of a choice architecture. Marketing Lett 23(2):487–504
  9. Kahneman D (2011) Thinking, fast and slow. Farrar, Straus and Giroux, New York
  10. Mandel N, Johnson EJ (2002) When web pages influence choice: effects of visual primes on experts and novices. J Consum Res 29(2):235–245
  11. Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, Eccles MP, Cane J, Wood CE (2013) The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med 46(1):81–95
  12. Oinas-Kukkonen H (2010) Behavior change support systems: a research model and agenda. In: Ploug T, Hasle P, Oinas-Kukkonen H (eds) Persuasive technology. Lecture notes in computer science, vol 6137. Springer, Heidelberg, pp 4–14
  13. Oinas-Kukkonen H, Harjumaa M (2009) Persuasive systems design: key issues, process model, and system features. Commun Assoc Inf Syst 24:485–500
  14. Simon HA (1955) A behavioral model of rational choice. Q J Econ 69(1):99–118
  15. Sunstein CR (2015) Nudging and choice architecture: ethical considerations. SSRN Electron J. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2551264. Accessed 15 Mar 2016
  16. Thaler RH, Sunstein CR (2008) Nudge: improving decisions about health, wealth, and happiness. Yale University Press, New Haven
  17. Thaler RH, Sunstein CR, Balz JP (2010) Choice architecture. SSRN Electron J. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583509. Accessed 08 Aug 2016
  18. Tversky A, Kahneman D (1974) Judgment under uncertainty: heuristics and biases. Science 185(4157):1124–1131

Copyright information

© The Author(s) 2016

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • Markus Weinmann (1)
  • Christoph Schneider (2)
  • Jan vom Brocke (1)

  1. Institute of Information Systems, University of Liechtenstein, Vaduz, Liechtenstein
  2. Department of Information Systems, City University of Hong Kong, Kowloon, Hong Kong
