Introduction

The fall of Troy is symbolic of catastrophe in the ancient world: this war became the apocalyptic scene of a struggle over power and economics, over moral values, and not only over Helena, the most beautiful woman of antiquity. It is likely that, in the years and decades to come, the ongoing development of AI technology will be of significance to older adults’ quality of life. (AI) technology in daily living spaces is one of the “big stories” for a better future for older persons and for society. Assistive technologies such as walking and visual aids have been in use for decades to improve individuals’ functioning and independence (World Health Organization 2018). Various discursive voices have noted a potentially double-edged impact of AI technologies, underlining their potential either to enable predictive, personalized, preventive, and participatory healthcare for older individuals or to act as disruptive technologies, depersonalizing care via standardization and contributing to discrimination, dehumanization, and disciplining (Rubeis 2020). The opportunities associated with these technologies seem to come with equal risks. Digital technologies constitute a potential way of addressing the rising number of older adults needing support and assistance in the wake of societal demographic change (Ho 2020; European Commission et al. 2021). In recent years, intelligent and other data-processing assistive technologies have become increasingly important in health-related settings. Going forward, these assistance systems will prospectively find use not only in inpatient healthcare, but also in the home environment (“daily living spaces”) (Haque et al. 2020; Martinez-Martin et al. 2021).

Data-processing assistive technologies have specific characteristics that conventional, non-data-processing assistive technologies lack. These include the continuous processing of “potential symptoms” (such as vital signs and movement) using tools such as sensors or wearables, and the subsequent algorithm-based evaluation and interpretation of this data, partly by means of AI. The development and implementation of data-processing assistive technologies in this area may support older adults in living independently in their familiar home environments for a longer period of time and create better treatment options for acute and chronic health conditions (Al-Shaqi et al. 2016; Baig et al. 2019; Haque et al. 2020), potentially enabling comprehensive “round-the-clock” health monitoring. Alongside these prospective benefits, however, stand concerns about privacy, data-processing bias, accountability, and other issues (Martinez-Martin et al. 2021; Murphy et al. 2021). Among these ethical concerns, the impact on older adults’ relationships, including social participation and loneliness, is also of relevance when considering the use of technology (Remmers 2019). Hine et al. (2022) make a distinction between “AI ethics” and “care ethics” when considering ethical issues related to technology, especially AI, in older adults’ living spaces. In so doing, they point to the likely impact of these systems on interpersonal relationships in healthcare. In reference to existing guidelines (Floridi and Cowls 2019; High-Level Expert Group on Artificial Intelligence 2019) that might support the ethically conscious design of these technologies, the authors group ethical issues such as “privacy and security as design priorities” and the “requirement for representative training data” as pertaining to “AI ethics” (Hine et al. 2022, Table 1). Less evidently amenable to practical resolution, but nonetheless of great significance, are ethical issues such as the “preservation of human contact and [a] sense of home” and “tensions between reassurance and concern”, among others, which the authors categorize as relating to “care ethics” (Hine et al. 2022, Table 1).

Currently, the use of assistive technologies in older adults’ living spaces is at a lower level than might have been expected (Weber 2021). This notwithstanding, AI is expected to evolve into a widespread, “quasi-natural” component of home technologies in the future. We conducted an empirical study on this topic among older adults, some of whom indicated feelings of fear and skepticism toward the ongoing technological revolution. Might the voices of these persons be unheeded prophecies like those of Cassandra? Could (AI) technology in our living rooms—or those of our parents and grandparents—act as a “Trojan horse” for commercialization and a loss of self-determination rather than giving older adults greater freedom and autonomy and thus improving their quality of life?

In our theoretical approach to locating the tipping point, we commence by exploring coercion, one way in which another person may override an individual’s self-determination, as an example that points to the existence of a threshold at which autonomy may pass into its opposite. We then turn to the concept of autonomy, specifically relational autonomy. In doing so, we highlight, as one characteristic of the tipping point, that the transition from self-determination to external determination occurs within a person. Debates around autonomy in terms of conflicting principles, and issues engaging autonomy in interpersonal terms such as paternalism, do not necessarily require us to incorporate an individual’s perspective; the “tipping point”, by contrast, is what we define as the overriding of self-determination within an individual. Having set out these considerations, we present a psychological theory as one of several conceivable approaches to exploring the tipping point between self-determination and determination by an external entity. We chose this theory because it seems most appropriate for describing a continuum and the tipping point at the individual level. In our view, the theory can also help to develop an awareness that the tipping point is variable—not only from person to person, but also within an individual—and that it is possible to promote individuals’ self-determination via specific interventions. In conclusion, we provide illustrative examples to argue that an external point of view alone does not suffice to identify the tipping point, and we discuss the normative implications of our findings.

Ethics of the “tipping point”

All people—and older adults are obviously no exception—differ in their autonomous health-related choices. Their right to respect for these choices is absolute, even where they may not be ideally conducive to health. This represents a normative conflict with potential for infringements on people’s autonomy. In an important opinion addressing this normative tension, the German Ethics Council has set out a concept it calls “benevolent coercion”: “‘Coercion’ denotes the overriding of another person’s will. Coercion is called ‘benevolent’ if it is performed with the intention of preventing the recipient from causing harm to herself [or himself]” (Deutscher Ethikrat 2020, p. 8). How can we determine when a person’s will is overridden? Would it be conceivable, in some future scenario, that an individual may decide to override their own will? An individual might, for instance, set out in an advance health directive whether, and which, data-processing assistive technologies they wish to use in a future scenario of care needs. A person could thus override their future will by their own choice if this decision is to hold even when their capacity to make decisions for themselves has become limited due to, for example, severe pain or neurodegenerative disease. An unintentional overriding of one’s own will could be more problematic, however, if technological developments are more rapid than anticipated and the technologies actually available in the future no longer coincide with one’s own earlier ideas about the technologies one would wish to use.

The idea of a will being “overridden” indicates that, rather than there being a binary opposition between self-determination and control by an external entity, there is a point of transition, within a person, from exercising self-determination to having lost or surrendered it. To the best of our knowledge, this tipping point within an individual remains underdescribed and lacks thorough conceptualization, in contrast to a relative abundance of concepts that describe the overriding of another person’s will within interpersonal relationships.

Autonomy: a relational view

Describing the complexity of older adults’ decision-making processes around technology use, Peek et al. (2019) attempt to identify diverse factors influencing these decisions, including life circumstances, characteristics of the technology, and the individual’s health status; interpersonal influences are among these factors. Where one individual in an interpersonal relationship is vulnerable, there is a risk of paternalism, a well-known phenomenon that Beauchamp and Childress (2013) define as “the intentional overriding of one person’s preferences or actions by another person, where the person who overrides justifies this action by appeal to the goal of benefiting or of preventing or mitigating harm to the person whose preferences or actions are overridden” (Beauchamp and Childress 2013, p. 208). Paternalistic interventions can conflict with respect for the autonomy of the individual, a principle of bioethics alongside nonmaleficence, beneficence, and justice (Beauchamp and Childress 2013). The same authors state that the influence of others may restrict the voluntary character that is key to informed decision-making, which in turn is an integral part of autonomy (Beauchamp and Childress 2013, p. 122ff.). There is some evidence that older adults may be subject to paternalistic influence by others in their daily lives. In a survey conducted in 2017 among 3416 respondents aged 65–85, with a sample representative of the German population at large, 6.8% said that they often felt patronized by others; the corresponding figure in a similar survey run in 2014, among 4421 respondents aged 65–85, was 6.3%. An older individual’s relationships have a substantial influence on their decision-making about and during the use of technologies, including AI. An interview study on older adults’ hypothetical use of technology in their living spaces identified a risk that older adults may be “talked into” using these technologies by relatives (Berridge and Wetle 2020). Adult children of older persons living at home, interviewed for this study, underestimated their older relative’s ability to understand the technologies or did not consider it important for their relative to fully comprehend them (Berridge and Wetle 2020). Curnow et al. (2021) used cluster analysis of a sample of 451 persons with dementia to examine which characteristics influence the installation of assistive technologies. They found a greater influence of living situation (alone or with others) and level of caregiver support than of the severity of cognitive impairment or the risk of incidents. The fact that assistive technologies for locating the user found more frequent use with individuals who did not live alone led the authors to surmise that the technologies’ use may serve to reduce caregiver anxiety or may indicate a difference in view between the individual and their caregiver regarding the former’s needs. Other reasons suggested by the authors are that caregivers respond to technology alerts and that caregivers might assist with the technology assessment and selection process (Curnow et al. 2021). It may therefore be the case either that the caregiver fails to respect the will of the person concerned or that they understand that person better than the person does themselves and are acting in their best interests.
Following the reasoning of Sullivan and Niker (2018), some interpersonal influences can be thought of as “acting for the benefit of another person in a way that takes that person’s autonomous agency into account, despite no explicit expression of consent or assent being given by the person on whose behalf the decision is made”. The authors distinguish this concept, which they call “maternalism”, with its consideration of “autonomous agency”, from paternalism (Sullivan and Niker 2018).

In a related vein, researchers have provided a corrective to the potential for an excessively individualistic interpretation of autonomy by accentuating the relationality of the individual. A particular interest in this area appears in work by researchers in the fields of feminist ethics (Mackenzie 2008; Stoljar 2011) and the ethics of care (Gilligan 1982; Conradi and Vosman 2016). The relational understanding of autonomy—and of its curtailment—that emerges here highlights the fact that other people always have an influence on the values and preferences espoused by each of us (Dove et al. 2017). Some positions taken by proponents of relational approaches to autonomy seek to establish a counter-position to traditional bioethical conceptions (Donchin 2001; Osuji 2018). Osuji (2018), a proponent of the concept of relational autonomy, affirms that people develop and express their autonomy in relation to others, and therefore that a relational understanding of autonomy may enable us to perceive decision-making as dynamic and processual in character. Others, by contrast, largely regard relationality simply as an emphasis on factors such as interdependence or interconnectedness in addition to principles conventionally stressed in bioethics (Gómez-Vírseda et al. 2019).

When assistive technologies, including “smart” ones, are used in the living spaces of older adults, interpersonal relationships may expand or restrict a person’s autonomy. Paternalistic influences may limit a person’s autonomy; this restriction may, however, be balanced by the benefit of people close to the individual providing support for decision-making and technology use. The specificity of relational autonomy can help us to describe the various influencing factors that flow into relationships and to understand that interpersonal influences may result in the individual acquiring an altered perception of their own self-determination. But even if we fully understand the interpersonal influences, it seems that we need to take a closer look at what is going on inside a person in order to identify and describe the “tipping point” on the continuum between self-determination and external determination. In our view, additional theoretical considerations, which we outline below, can further enhance our vantage point in this regard.

Self-determination theory: a potential route to exploring the “tipping point”

Various approaches to exploring this transition are conceivable. We have chosen to use self-determination theory (SDT) as proposed by Ryan and Deci (2000, 2017). There are two key reasons for our choice. First, as an empirical theory, SDT describes a continuum between self-determination (“autonomous motivation”) and non-self-determination (“controlled motivation”) (Ryan and Deci 2000, 2017). Second, as a motivational theory, SDT has the potential to show us routes toward promoting health and self-determination in the use of assistive technologies at home. In addition, SDT points to potential ways of postponing an assumed tipping point and thus of promoting an individual’s self-determination.

SDT describes a motivational continuum spanning amotivation, extrinsic motivation, and intrinsic motivation. While intrinsic motivation (“out of pleasure”) is always self-determined, extrinsic motivation can be regulated in either a self-determined manner (“integrated regulation = congruence, awareness, synthesis with self”; “identified regulation = personal importance, conscious valuing”) or a non-self-determined manner (“introjected regulation = self-control, ego-involvement, internal rewards and punishments”; “external regulation = compliance, external rewards and punishment”) (Ryan and Deci 2000). The continuum between non-self-determined (“controlled”) and self-determined (“autonomous”) motivation is modifiable insofar as internalization of extrinsically motivated behavior can take place (Ryan and Deci 2000). SDT sees this internalization as occurring with increasing degrees of fulfillment of three basic psychological needs: relatedness, competence, and autonomy (Ryan and Deci 2000).

We know of only isolated empirical studies that address the motivational continuum of self-determined and non-self-determined regulation in the 65-plus population. Philippe and Vallerand (2008), studying a sample of nursing home residents aged over 65, found that environments supportive of autonomy, such as those with a flexible approach to visiting hours and mealtimes, were positively correlated with self-determined motivation and with psychological adaptability. In our view, it would be possible to draw on findings such as these to formulate assumptions as to how assistive technology may help promote older adults’ self-determination. Examples might be the use of such technologies to create greater flexibility in daily routines, to provide people with an experience of competence in managing their lives, or to make it easier for them to access the world of others. The authors of the initial work on combining SDT with technology design (Peters et al. 2018) target and quantify key psychological needs (relatedness, competence, autonomy) at different stages of technology use (adoption, tasks, and behavior, among others). In line with SDT, Peters et al. (2018) suggest that the fulfillment of these psychological needs, via, for instance, personalization of technologies to promote autonomy and competence, may have desirable outcomes, including engagement and satisfaction, in the individuals concerned. Further empirical research is needed to validate the approach proposed by Peters et al. (2018) and to study its application to technology development for older adults.

Like any other health-related behavior, people’s choice to use data-processing assistive technologies for health-related support in their daily living spaces always involves motivational factors (for a heuristic of motivation and healthy aging in general, see Freund et al. 2021). Recent meta-analyses examining the associations between SDT interventions and health-related behaviors have focused predominantly on younger study populations (Gillison et al. 2019; Sheeran et al. 2020). When tested for their effectiveness within a meta-analysis, SDT-based interventions were found to have a positive impact on health behaviors only when the individuals concerned had self-determined motivation or perceived themselves as competent, but not when their motivation was non-self-determined (Sheeran et al. 2020).

An empirical evidence base exists for the association between self-determined motivation and health-related outcomes (Ryan and Deci 2017; Gillison et al. 2019; Sheeran et al. 2020), but the literature on the older adult population, particularly in the context of technology, is scarce. The principal purpose of SDT in the context of our considerations is to serve as a theoretical lens.

Ethical considerations on the “tipping point”

Within the theoretical context set out above, we will now illustrate the point of transition from self-determination to external determination using two examples from a qualitative interview study we conducted in Germany with research ethics committee approval from Friedrich-Alexander-Universität Erlangen-Nürnberg. The study focused on the perspectives of older adults on the use of data-processing assistive technologies to support home-based care. The core of the interview was a hypothetical scenario consisting of three assistive technologies (an electronic medication box, a sensor for measuring vital functions, and a platform/device for connecting caregivers). Methodologically, the research followed an explorative qualitative interview study design. We performed content analysis on the qualitative data collected, following the method proposed by Kuckartz (2018, chapter 5). We drew up a system of categories in accordance with the principles of biomedical ethics (Beauchamp and Childress 2013), with subcategories representing the subjective views of the sample of older adults (N = 16, aged between 65 and 85). Compared to representative surveys of the general German population aged 65–85, the descriptive analysis showed higher internet use and technology affinity in the study population (study population compared to the representative sample of the survey by Generali Deutschland AG (2017), figures 4.17 and 4.21), and interviewees more often rated their standard of living as very good. The details of the qualitative interview study are published elsewhere (Sonnauer 2021).

What follows uses excerpts from two interviews to explore the moment of transition—the “tipping point”—between self-determination and external determination. The theoretical considerations of the tipping point go beyond the empirical significance of the interview study, but the interview excerpts may help to illustrate them. Excerpts 1 and 2 seem particularly appropriate for describing the tipping point. In addition to these two selected excerpts, Table 1 contains paraphrased excerpts from other interviews. The interview excerpts (N = 15; one interviewee did not give consent for transcription) were taken from a selected part of the interview in which we asked people to assess the extent to which they felt the scenario we presented to them could influence their self-determination or individual freedom. For the sake of compactness, we have chosen to paraphrase the excerpts and to restrict the presentation in Table 1 to this specific part of the interview. The second column of Table 1 contains a keyword list of potential determinants of perceived self-determination. Thus, for interviewees 1–5, whether decision-making autonomy is ensured might be of particular importance for perceived self-determination in the use of assistive technologies. For interviewees 6–8, by contrast, how the technology is designed or whether sufficient technological competence is available might be of particular importance, and so forth. The third column shows our suggestions as to how it might be possible to move the tipping point for this determinant to a later point, thus promoting an individual’s experience of self-determination or ensuring they do not experience themselves as controlled by an external entity. Two interview excerpts could not be matched to the table. Table 1 illustrates the theoretical considerations of the present work; further empirical research is needed for validation.

Table 1 Interview excerpts and tipping point considerations

The excerpts from one of these interviews (no. 1) illustrate the conflict referenced above between self-determination and beneficence.

From interview 1:

“So my inner attitude rejects this [scenario involving the use of assistive technologies]. Whether it is beneficial or not is another issue. But my inner attitude says no. I don’t want this.” […]

“When it comes to it, I would rather die than be constantly monitored. That’s what I’m saying now. It doesn’t mean I’m tired of living, but one could end up [feeling like this]. In a situation like that.” […]

“I would like to make my own decisions, no matter how good or bad they are.”

This example shows that some older individuals may perceive data-processing assistive technologies in their living space as a significant violation of their personal values and thus as harmful rather than beneficial. This may be the case even where “objective” welfare-related motives, such as the ability to access help more quickly in an emergency, may justify the technology’s use. In an act of self-determination, the individual thus rejects the use of these technologies in accordance with their values, seeking to avoid an anticipated condition of dependency or of being subject to surveillance. We may assume here that the person in question would perceive the use of these technologies in their living spaces as overriding their will.

A further interview (no. 2) provides a noteworthy instance of the influence of relationships with others on perceptions of self-determination or external determination in this context:

From interview 2:

“My son lives some distance away and he can’t come visit me all the time. […] It would be very reassuring for him if he knew it [my health situation] was under control [via the use of these assistive technologies as in the scenario]. So, if he asked me if I wanted it, I would say no at first. Until maybe I do see the point of it, or perhaps he, let’s say, forces me for my own good [to accept] that it’s better after all.”

We are struck by the ambiguity inherent in the statement “forces me for my own good”, which seems to us to encapsulate the conflict between beneficence and self-determination that arises when assistive technologies enter our living spaces. We use SDT here to describe the transition between self-determination (“integrated regulation” or “identified regulation”) and non-self-determination (“external regulation” or “introjected regulation”). In relation to this example from interview 2, we could speak of “integrated regulation” if we read the son’s presumed preferences (a desire for control) as a projection or reflection of the interviewee’s own values. We might speak of “identified regulation”, by contrast, if the individual’s prioritization of interpersonal compromise is the guiding motive. Although engagement in relationships is often conducive to self-determination and the development of identity, it may also result in the externally determined use of data-processing assistive technologies (“external regulation”). This is the case when the other person intentionally uses the interpersonal relationship as a means of overriding the individual’s will. They may do so by deploying rewards, such as attention, or punishments, such as exhortations to lead a healthier, more careful lifestyle. “External regulation” could occur here if the individual seriously fears that their refusal to use data-processing assistive technology may have considerable consequences, such as a withdrawal of interpersonal interaction or support (in this case by the son). “Introjected regulation”, in this context, would describe an externally determined regulation arising from internal compulsions. An example might be increasing one’s movement levels when using a movement sensor out of embarrassment at what the data would otherwise show.

Integration experienced in relation to others is somewhat akin to relational autonomy, as it describes an individual incorporating relationships with others into their self-determined regulation. External regulation experienced in relation to others is somewhat akin to the overriding of a person’s will, as it describes an individual’s sense of being controlled by an external entity. The difference, however, is that, as in the excerpts from interview 2, we would need access to the point of view of the older person in order to have certainty as to whether the choice in this instance is an expression of self-determination (integration) or not (external regulation). It is in this context that we have sought to identify the tipping point as lying within a person and as variable. With technology, including AI, likely to be deployed in the living spaces of older adults, we would make three observations regarding the individual, relational, and societal implications of the tipping point.

First, we will be unable to recognize the tipping point from self-determination to external determination if we do not take account of the views of older adults themselves. While evident internal or external constraints may be easier for observers to understand, the actual point of an individual’s transition from self-determination to external determination is not amenable to “objective judgment”. The ambiguity emerging from the excerpt from interview 2 cited above indicates that it is not necessarily possible for third parties to recognize from the spoken word whether a decision is self-determined or not. Instead, it is an intrapersonal, subjective perception that determines the experience of self-determination or external determination. The assumption that there is a tipping point that cannot be determined solely by objective judgment raises an ongoing research question as to the extent to which individuals possess the introspective ability needed to identify their tipping point themselves, or what kind of support they would require to do so.

Second, an older adult’s relationships may have an impact on their individual tipping point in relation to the use of technologies, including AI technologies. We would note the importance of awareness of the phenomenon of “benevolent coercion” in personal living spaces, in which interpersonal relationships might act as “gates of Troy” that allow the “gift” horse in. Conversely, a caring relationship could also allow older individuals to experience the technology’s use as self-determined and therefore beneficial. Expanding the Trojan analogy, we might regard this as an instance of social relationships acting as Aeneas, the mythological founder of Rome, who carried an older man—his father Anchises—on his shoulders as he left burning Troy and moved toward the establishment of Rome as a new and powerful civilization.

Third, societal implications arise from our conceptualization of a tipping point. We will address two different aspects of these implications: first, how an understanding of this tipping point could lead to appropriate societal interventions; second, how moving this tipping point could help older adults retain a higher degree of self-determination.

It is evident, given the nature of demographic change, that “smart” technologies are highly likely to find their way into older adults’ living spaces. We assume that it is societally desirable and normatively appropriate to strive for a situation in which older adults use and live with technology in a self-determined manner. This demand for the creation of spaces of opportunity for the highest possible degree of self-determination is limited by the fact that there may be other values, both societal and individual, that are more significant. In addition, knowledge of a tipping point may give rise to an imperative of self-determination, although some people may experience the relinquishment of responsibility through external determination as a relief. Future research should therefore further explore these possible adverse effects of concrete knowledge of the tipping point. If we were aware of a specific individual’s tipping point, it would be possible to work with that individual toward moving this tipping point in the direction of a gain in self-determination. The findings detailed in Table 1 suggest that it may be possible to identify certain factors within the “group” of older adults that are of particular relevance to the tipping point in this context. Persons like those in Table 1 to whom decision-making autonomy is important would need instruments that support them in retaining this autonomy under all circumstances. For individuals for whom understanding these devices and systems is important, opportunities to acquire skills in using the technologies would need to be available. For the needs of others, reliable data would have to be available on the extent to which the technologies actually delay moving into a care home. Overall, the theoretical and empirical examination of the tipping point between self-determination and heteronomy could contribute to the development of societally appropriate interventions that help people move toward a gain in self-determination.

We chose to use self-determination theory as one among a number of conceivable approaches. According to theoretical considerations from SDT, a possible shift in the tipping point occurs through internalization (Ryan and Deci 2000, 2017). The authors state that internalization leads to experiencing more self-determination and less external determination; according to SDT, this occurs through fulfillment of the three basic psychological needs (autonomy, competence, relatedness) (Ryan and Deci 2000, 2017). Such considerations could support the fulfillment of the basic psychological needs set out in Ryan and Deci’s work by, for example, providing alternatives to various types of assistance in old age and enabling the adaptation of technology to the individual (autonomy), promoting health technology literacy among older adults (competence), or affording the prevention of isolation and the promotion of social connectedness among older adults key places on the agenda of technological development (relatedness). These considerations depend largely on the approach chosen to conceptualize the tipping point, with SDT being one conceivable approach among others. The focus of this article has been to shed light on the existence of the tipping point theoretically and to suggest initial ways of operationalizing it. Future work should also involve applied research on the tipping point and the factors affecting its variability. Findings from this area might also be relevant to contexts other than the use of assistive technologies (for example, the question of whether an older person who requests assisted suicide is self-determined in their decision or whether there are factors that have shifted the tipping point toward an externally determined decision). Moreover, the tipping point may vary both along the individual lifespan and when health deteriorates (as in, for example, neurodegenerative disease or emergency situations). Decision-making autonomy and valid regulatory instruments, such as living wills and advance care planning that consider the use of smart technologies in planning one’s own future, can potentially contribute to a greater experience of self-determination. This is all the more relevant given the potential for incongruent decisions between the individual and their family members regarding technology use (Berridge and Wetle 2020). The round-the-clock monitoring made possible by “smart” assistive technologies carries the danger that incautious use of these devices may cross the line into a coercive practice that overrides consent. In our excerpts from interview 1 presented above, the person concerned would perceive such surveillance as massively interfering with their self-determination (“I would rather die than be constantly monitored.”). Other persons, on the other hand, would perhaps be willing to put up with round-the-clock surveillance and isolation within their own living spaces in order to be able to stay at home for longer. Perhaps we should consider concluding “Ulysses contracts and stratagems” for our own future to regulate encroachments on personal freedom and privacy through smart technologies. In any case, a better understanding of the tipping point between self-determination and external determination, and of its variability, could help us to know the will of affected individuals and the point at which that will is overridden.

Conclusion

Before we enter a new “digital Rome”, with its potential benefits to older adults, we should pause to reflect on the technology, particularly instruments incorporating AI, in our living spaces. This article has sought to contribute to this reflection by conceptualizing a transition—a “tipping point”—between self-determination and external determination. In doing so, we have examined in more detail what the overriding of a person’s will means and have differentiated between interpersonal or “objective” perceptions and intrapersonal or “subjective” perceptions. Through the theoretical lens of self-determination theory, this article has outlined some characteristics suggesting that the tipping point lies within a person and is variable. In this light, it is all the more crucial to listen to the voices of older adults and to take their views on the use of these technologies seriously. If one follows the argument that the tipping point is variable, and that promotion of self-determination is thus possible, our exploration of the tipping point seems to meet an ethical imperative. In the interest of respecting older persons’ self-determination on a broader scale, awareness of this tipping point may make possible targeted interventions that help shift this moment of transition to heteronomy to as late a stage as possible.