Abstract
We humans adopt almost every new technology with hardly any hesitation, and the military use usually comes first. Examples from recent history are Germany's use of chemical weapons in the First World War and the US use of atomic bombs in the Second World War. Now, with the rapid advances in microelectronics over the past few decades, a wave of their application, called digitization, is spreading around the world with barely any control mechanisms. In many areas this has simplified and enriched our lives, but it has also encouraged abuse. The adaptation of legislation to contain the obvious excesses of "digitization," such as hate mail and anonymous threats, is lagging far behind. We hear almost nothing about technology assessment through systematic research; it is demanded at most by a few, usually small, civil society groups, which draw attention to the threats to present and future humankind and to the Earth's ecosystem. One such group, the Federation of German Scientists (VDW) e.V., acting in the spirit of the responsibility of science for the peaceful and considered application of the possibilities it creates, asked three of its study groups to jointly organize its 2019 Annual Conference. The study groups "Health in Social Change," "Education and Digitization," and "Technology Assessment of Digitization" formulated the following position paper for the 2019 VDW Annual Conference, entitled "Ambivalences of the Digital."
The call for a major public debate on the ambivalences of the digital comes almost too late, because the misuse of large amounts of data has already begun.
Hartmut Graßl
Introduction
The text corresponds to the position paper of the same name, which was prepared for the VDW Annual Conference 2019. The VDW study groups "Health in Social Change," "Education and Digitization," and "Technology Assessment of Digitization" participated in preparing this statement. The ten key questions presented at the end formed the basis for discussion at the annual conference.
The Federation of German Scientists (VDW) e.V. regards the process of accelerated digitization and interconnectedness, together with the development of so-called "artificial intelligence" (AI, also: machine learning), as carrying the potential for epochal social change. As with the development of tool use and the spread of language, writing, and printing, a fundamental change in the organization of social life can already be observed today as a result of digitization. These changes hold both opportunities and threats for present and future humankind and for the Earth's ecosystem. The VDW therefore recognizes the necessity and urgency of critically examining digitization as a series of technological processes, together with their social prerequisites, applications, limits, and consequences.
It is part of the VDW's identity to examine new technologies critically with regard to their hazard potential, to initiate risk analyses, and to propose measures for averting hazards at an early stage.
In light of the almost religious excitement regarding digitization, interconnectedness, and AI in practically all areas of life, the VDW considers it its task to point out scientifically and socially highly relevant existential problems associated with this development that are underestimated or ignored, and to make well-founded proposals for ethically acceptable ways of handling them.
For example, the VDW considers one of the immediate dangers to be the creation of completely new, long-lasting, unpredictable, and profound dependencies of individuals, institutions, and states on digitization, interconnectedness, and AI, dependencies that only a few can escape. This development has the potential to deepen existing social inequalities within societies and to intensify discrimination worldwide, particularly against low-income population groups. As such, digitization, interconnectedness, and AI pose a sustainability risk. Whether they prove to be socially and ecologically harmful depends primarily on the results of social, political, and economic negotiation processes, struggles, and decisions.
In addition to these societal challenges, there are also technology-related risks that need to be addressed comprehensively and rapidly, regardless of the societal embeddedness of technology use. These include, among others, objective limits on the scale of use and security issues in AI development, but also, in particular, the hidden manipulation of users through the processing and exploitation of unmanageable amounts of data. Such data are used by corporations to steer consumption, by political groups for disinformation, and by authoritarian states for oppression and social control.
Correlations discovered within mass data with the help of AI are, moreover, confused with causal relationships, and such correlations are used to judge people in various ways and as the basis for subsequent decisions. The available experience shows that this repeatedly leads to social disadvantages and discrimination, including structural discrimination, and that these misjudgments (e.g., when looking for a partner, a job, housing, or a loan) cannot be corrected, or can be corrected only with great effort. Such processes can accelerate downward spirals of social exclusion.
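The following minimal Python sketch, using entirely synthetic data and hypothetical names (it is not part of the position paper and describes no real system), illustrates the mechanism at issue: a scoring procedure that relies only on historical correlations assigns a markedly worse score to one group even though the ability it is supposed to predict is identical in both groups.

```python
# Purely illustrative sketch with synthetic data and hypothetical names;
# it is not part of the position paper and describes no real system.
import random

random.seed(0)

def make_applicant(district):
    # Assumption of the sketch: the ability to repay is identical in both
    # districts, but district "B" was historically given worse loan terms,
    # so its *recorded* default rate is higher for reasons that have nothing
    # to do with the individual applicants.
    able_to_repay = random.random() < 0.8
    defaulted = (not able_to_repay) or (district == "B" and random.random() < 0.3)
    return {"district": district, "able": able_to_repay, "defaulted": defaulted}

history = [make_applicant(random.choice("AB")) for _ in range(10_000)]

def district_risk(data, district):
    # A naive, purely correlational "score": the historical default rate of
    # the applicant's district, used to accept or reject new applicants.
    group = [a for a in data if a["district"] == district]
    return sum(a["defaulted"] for a in group) / len(group)

for d in "AB":
    group = [a for a in history if a["district"] == d]
    able_share = sum(a["able"] for a in group) / len(group)
    print(f"district {d}: correlational risk score {district_risk(history, d):.2f}, "
          f"share actually able to repay {able_share:.2f}")

# The scores differ sharply between the districts although the share of
# applicants actually able to repay is the same: the procedure has learned
# a correlation (district vs. past defaults), not a cause, and would
# systematically disadvantage applicants from district "B".
```

Real systems are of course far more complex, but the underlying confusion of correlation with causation is the same.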
The VDW observes with great concern that the digitization of ever more areas of human life calls the self-image of individuals and societies into question to such an extent that fundamental threats to the health, dignity, and freedom of a large part of humanity are looming and democratic societies are endangered.
Underlying Human Image
The current development of digital technologies and machine learning is based on a reductionist world view, which culminates in the metaphor of the human being as an information-processing system. This image has its origins in cybernetics and behaviorism, which understand the human being merely as an organic feedback system. On this view, any behavior is merely the more or less appropriate result of a neuronal evaluation of sensory data. Characteristics such as consciousness, freedom, or the self are at best regarded as phenomena emerging from information or from the activity of the neural substrate. The question of the meaning of an individual's being as a constitutive moment of human existence is completely disregarded in this perspective.
Human beings thus reduced to their mechanical characteristics become deficient beings whose cognitive abilities, although considered exceptional in the animal kingdom, are ultimately inefficient due to their organic limitations and should therefore be optimized by adequate technological supplementation. For many experts involved in the development of digital technologies, AI is not just a simple extension of human cognitive competence, but rather a new branch of the evolution of intelligence itself, in which humans are finally overtaken and overcome by AI, which is becoming independent of them.
Taken to its ultimate conclusion, human intelligence is reduced to propositional thinking, which representatives of this view also postulate as the only reliable basis for all decisions. Whether in road traffic or in the choice of a life partner, decisions are nothing more than neurally mediated environmental analyses that ultimately lead to true or false results. Since such algorithmically tractable calculations can in principle be carried out much more efficiently and accurately by "intelligent" machines, it is then not a contradiction but a compelling consequence to leave decisions with far-reaching social consequences to AI in the future.
If this ultimately inhuman and misanthropic worldview, which guides the development and application of digital technologies and the closely associated image of AI, remains dominant in the future, then digitization will become an immanent threat to humankind and possibly to organic life on Earth as such. From the VDW's perspective, such a reduced view of humankind corresponds neither to the actual nature of humankind nor to its dignity and intrinsic value. Through their existence, humans are already the answer to the question of meaning that they pose as conscious beings. It is not the human who must adapt to technology; rather, technology must always remain a tool of humans, to be used for their benefit and the benefit of all creatures.
Health and Social Participation
In accordance with the definition of the World Health Organization (WHO) (see Footnote 1), the VDW understands health as a bio-psycho-social construct, including self-determination and participation in the life of society. The interconnectedness of machines capable of learning can contribute to human health, but it can also harm it. On the one hand, as tools, interconnected machines can be used to coordinate care, for instance at the interface between outpatient and inpatient care, for faster diagnostics, for better prosthetics, or, by reducing the workload, to create time for more humane care, therapy, and medicine. On the other hand, initial experience shows that interconnectedness through learning machines exacerbates existing social problems. We observe monopolistic appropriation, manipulation and advertising paternalism, division and discrimination, and altogether a departure from the humane, as already indicated above.
Although the research and development underlying such technologies are usually publicly funded, the intellectual property in the form of the equipment, algorithms, and source code is declared private property and a trade secret. Likewise, users' private data are used and misused by private companies and government agencies in ways that are at least immoral and often illegal.
By combining (depth-)psychological findings with digitalized health applications, people are seduced into unreflective behavior controlled by automated systems, which can cause them harm. In particular, the radically utilitarian approach of human self-optimization, including the optimization of one's children (already prenatally), presented as a prerequisite for a successful and happy life, can cause illness and social damage.
Disease and death are not only biological–medical processes; they are always embedded in social processes. Humans must therefore always be viewed holistically and not in a quantitative–reductionist way.
The VDW sees a danger in the fact that this tendency to turn away from the humane also has serious consequences for human health. If humans are increasingly adapted to the technical requirements of machines, as assembly-line production has exemplified for about 100 years, and if this pressure to adapt becomes increasingly all-encompassing, then well-known tendencies either to perceive "lower-functioning" people as reparable (e.g., through rehabilitation measures) or to segregate them as avoidable defects (e.g., in the case of genetic defects) will increase.
Education and Digitization
The educational-policy and educational-science discourse on teaching, learning, and digitization is largely dominated by media education, IT (learning media) development, and quantitatively oriented empirical educational research. The focus lies one-sidedly on the opportunities offered by the digitization of teaching media and the personalization that this makes possible, including small-scale learning assessment for learning control. The opportunities offered by this form of mediatized schooling are often asserted but only partially proven or verifiable. On the other hand, by largely excluding the perspectives of other fields of research, the current discourse fails to consider risks sufficiently. Examples include the important and critical contributions from historical and philosophical educational research, media addiction research, pediatric and developmental-psychological research into the effects of media, public health and prevention science, neurobiology, attachment research, the criticism of algorithm-based control systems and the data exploitation economy, as well as research on the effects of non-ionizing electromagnetic radiation.
A no less problematic but less visible narrowing of the discourse occurs when the proposed solutions to "digital risks" are not pursued at a broad level oriented toward the design of a humane environment. It is highly problematic that until recently the politically discussed approaches were limited to purely technical improvements (e.g., better encryption for greater protection of student data) or to the self-optimization of individuals through teaching "digital risk avoidance skills." This clearly contradicts the findings of prevention research, according to which relational prevention, i.e., the creation of healthy living spaces, can contribute more to the prevention of risk behavior than behavioral prevention, which starts with the behavior of individuals. This is even more relevant where younger target groups are concerned.
The VDW therefore sees an urgent need to finance robust, transdisciplinary, and independent technology assessments (TA) to compare different technology paths in educational institutions and to ask whether children should be introduced to technology as early as possible (“Early High Tech”) or whether, in accordance with their physical and cognitive development, real (sensorimotor) life experiences should be the initial focus of attention (“Early High Touch; High Tech later”).
The overarching control of educational processes (educational governance) at the level of countries and institutions must remain in human hands and be legitimized by democratic decision-making processes; it must not be delegated to Big Data–based systems, as is already largely the case in the US and some other countries.
Based on the present state of knowledge, the VDW sees an urgent need to finance the development and implementation of modern teaching concepts that are oriented toward children's developmental phases and that lay the foundations for media literacy without any digital screen media until the end of kindergarten, and largely without digital screen media until the end of primary school (e.g., through an "Analogue Pact#D"). Furthermore, it advocates the financing and implementation of modern, non-commercial concepts for the use of digital media for teaching and learning in secondary schools and universities, as well as the creation of a digital infrastructure for these institutions that cannot become a technology for controlling and managing its users and that does not generate long-term learning or personality profiles.
Economy, Labor, Society
The developments referred to as the “fourth” industrial revolution, which are characterized by accelerated digitization and interconnectedness of production, logistics, trade, and services, as well as the increasing use of AI, are once again changing the way we do business, across the entire global value chain.
This results in serious challenges for the economy, society, and the state: to make inclusive use of the opportunities for social and ecological transformation potentially contained in these developments, and to effectively ward off the dangers of deepening social inequalities and spreading unsustainable consumption patterns. The disruptive nature of the changes increases the pressure to comprehensively identify and swiftly implement the necessary political and legal decisions and measures. From the perspective of the VDW, the focus should be on improving the enforcement of economic, social, and cultural human rights (e.g., effective protection of competition, safeguarding and further development of core labor standards, functioning solidarity-based systems of social security). How can the potential opportunities for an improved relationship between gainful employment, social and communal involvement, and time for oneself, family, and friends be used?
Even today, the state of digitization could allow enormous freedom in the organization of economic and social systems, and this will probably increase in the coming years. At the same time, however, it has also become apparent for some time now how the ongoing digitization is being used in some countries to create new private or state monopoly structures and to curtail civil liberties in favor of centralized state power, the suppression of foreigners and dissenters, and the promotion of economic–behavioral uniformity.
In contrast, the VDW sees enormous welfare potential in the further advancement of mixed economic systems in which public, cooperative, and private actors (above all small and medium-sized enterprises) make different contributions to solving the challenges of rapid technological developments and of changing framework conditions (e.g., climate change, demographic development). In the VDW's view, a broad diversity of actors and objectives (e.g., the provision of public goods, orientation towards the common good, the private-sector profit motive) has in the past proved to be comparatively crisis-resistant.
This welfare potential can unfold its effect in favor of social cohesion to the extent that it effectively strengthens lower and middle incomes. This results from the different savings and consumption rates of the various income groups. To the extent that productivity gains arise from the increasing digitization, interconnectedness, and use of AI, new ways must be sought to sustainably finance existing or developing social security systems. This cannot be achieved by a one-sided burden on the production factor labor (payroll taxes, etc.). Energy consumption, capital income, and assets must be used to finance this. Tax-based basic funding in combination with citizens' or workers' insurance also makes it easier to switch between the tasks people devote themselves to, such as dependent employment, self-employment, involvement in family, community, and society (e.g., support for children, young people, people in need of care, and the elderly). These systems will only be successful, however, if they also consider cross-border solidarity in order to better safeguard individual wishes and the need for mobility of employees. In the EU, for example, a first sensible step would be to introduce a common European unemployment insurance scheme. Such a fund could help to arm member states against economic crises and the high unemployment that accompanies them.
The changes in gainful employment can be used to address each person's desire to be useful in the economic system, to combine this better than before with the need for socially necessary work, and to finance both adequately. The question of which resources we use for which production is in principle a social one, the answer to which can be organized in different ways in the democratic process. For example, the loss of millions of jobs leads to social distortions even if a similar number of new jobs, or even more, is created in quantitative terms, because the workers whose jobs have been rationalized away are often not qualified for the new job profiles. Without adequate social measures, which must go beyond primarily vocational training offensives, this can lead to these people feeling marginalized and becoming susceptible to anti-democratic, authoritarian propaganda and pseudo-explanations.
Regulation
In order to counteract any form of abuse of accelerated digitization and interconnectedness, as well as of the development of AI, clear, binding, and enforced ethical and legal objectives, standards, policies, and regulations are necessary that secure and expand democratic (including civil-society) control, monitoring, and participation opportunities and rights.
Essential and consequential decisions must be based on the results of technology assessments. This may also require bans in areas where the current state of research indicates a negative risk–reward balance. For example, the available research results show that the use of digital screen media at kindergarten and primary-school age has predominantly negative consequences, so that a moratorium on the use of digital screen media by such young learners is necessary.
The Asilomar Principles for the development and use of artificial intelligence, which have been endorsed by the California legislature, are only a rudimentary approach (see Footnote 2) and must be further developed in a social and democratic process, adapted to reality, tightened, and consistently applied. For example, the transparency of algorithms must be ensured: algorithms that decide on the life paths and life chances of people (e.g., in school, at work, in health care, in the justice system) must be disclosed, and the underlying calculation processes must be comprehensible.
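As a purely hypothetical sketch of what "disclosed and comprehensible calculation processes" could look like in the simplest case, the following Python fragment uses an openly published points table and returns an itemized explanation with every decision; all factor names, weights, and thresholds are invented for illustration and stem neither from the position paper nor from any real scoring system.

```python
# Hypothetical sketch of a "disclosed and comprehensible" decision procedure:
# a published points table plus an itemized explanation for every decision.
# All factor names, weights, and thresholds are invented for illustration.
from dataclasses import dataclass

PUBLISHED_WEIGHTS = {
    "completed_training": 30,   # points if a relevant qualification exists
    "years_of_experience": 5,   # points per year, capped at 10 years
    "missed_payments": -20,     # points per documented missed payment
}
PUBLISHED_THRESHOLD = 50        # the decision threshold is public as well

@dataclass
class Applicant:
    completed_training: bool
    years_of_experience: int
    missed_payments: int

def score_with_explanation(a: Applicant):
    items = [
        ("completed_training", PUBLISHED_WEIGHTS["completed_training"] * int(a.completed_training)),
        ("years_of_experience", PUBLISHED_WEIGHTS["years_of_experience"] * min(a.years_of_experience, 10)),
        ("missed_payments", PUBLISHED_WEIGHTS["missed_payments"] * a.missed_payments),
    ]
    total = sum(points for _, points in items)
    decision = "approved" if total >= PUBLISHED_THRESHOLD else "rejected"
    return total, decision, items  # the itemized breakdown travels with the decision

total, decision, items = score_with_explanation(
    Applicant(completed_training=True, years_of_experience=4, missed_payments=1)
)
print(f"decision: {decision} (score {total}, threshold {PUBLISHED_THRESHOLD})")
for factor, points in items:
    print(f"  {factor}: {points:+d} points")
```

The point of the sketch is not the particular weights but the property that every affected person and every supervisory body can recompute the score from published rules; opaque learned models would require additional disclosure and explanation mechanisms to reach a comparable level of comprehensibility.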
The further development of existing regulations for the effective enforcement of core labor standards in changing industrial relations is just as necessary as adjustments in competition law, in regulations for the protection of intellectual property rights, and in the improvement of social security systems (e.g., trade union protection in platform economies, taxation of data use, data protection and privacy protection, financial participation of people in the use of their data, the classification of certain data transfer agreements as immoral).
Ten Key Questions
1. What image of the world and humankind do we have and promote, and what influence may technological developments have on it?
2. What utopias do we aspire to in the coming decades and what role should digitization, interconnectedness, and AI play in this? For example, do we want progressively more decisions that fundamentally affect human existence to be made by machines?
3. Which (ethical) maxims and limits do we want for digital extensions and alleged "improvements" of humans (prenatal genetic interventions; monitoring and control of vital functions)?
4. How can and should the economy, state, and society guarantee the establishment and expansion of a non-commercial digital infrastructure?
5. How can social participation of all be ensured?
6. How can the use and added value of digital media in the classroom be objectively determined and democratically evaluated?
7. How can elementary cultural techniques (reading, writing, arithmetic, making music, working, drawing, etc.) and the basics of knowledge and skills (logical thinking, language skills, understanding of connections, concentration, attention, etc.) be successfully taught and thus preserved in a verifiable way?
8. How can the fourth industrial revolution be used for social cohesion (also in the Global South) and ecological renewal? What alternative models for the integration of economic and social policy are possible?
9. How can distortions in structural change (e.g., through massive changes in job profiles) be significantly reduced and made socially acceptable, and how can social systems be efficiently developed?
10. How can we effectively combat and sustainably counteract manipulation by private and state actors and interests?
Federation of German Scientists (VDW) e.V.
Since the foundation of the Federation of German Scientists (VDW) e.V. in 1959 by prominent nuclear scientists, among them Carl Friedrich von Weizsäcker, who had previously spoken out publicly as a “Göttingen 18” signatory against nuclear armament of the German Armed Forces, the VDW has felt committed to the tradition of responsible science. At annual conferences, in interdisciplinary study and project groups, scientific publications, and public statements, it takes a stand on questions of scientific orientation, technological developments, and peace and security policy. The role of science itself is also a subject of consideration, both in the genesis and the solution of problems. Around 350 natural scientists, humanities scholars, and social scientists are organized in the VDW, and work together on current and pressing issues in an inter- and transdisciplinary manner. With the results of its work, the VDW addresses the sciences, the interested public, and decision makers at all levels of politics, society, and the economy.
In accordance with its statutes from 1959, the VDW sets itself the following goals:
– to strengthen the sense of responsibility of scientists for the impact of their research on society
– to study the problems arising from the progressive development of science and technology
– to give a public voice to science and its representatives
– to influence decisions in an advisory capacity and to oppose the misuse of scientific results
– to stand up for the freedom of research and promote the free exchange of its results.
Notes
1. "Health is a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity." http://apps.who.int/gb/bd/PDF/bd47/EN/constitution-en.pdf?ua=1 (accessed 30 October 2019). The definition is set out in the Constitution of the World Health Organization (WHO), which was adopted by the International Health Conference held in New York from 19 June to 22 July 1946, signed on 22 July 1946 by the representatives of 61 states, and entered into force on 7 April 1948.
2. The VDW Study Group on Technology Assessment of Digitization has published a statement on the Asilomar principles of artificial intelligence. It is available for download on the VDW website at the following link: https://vdw-ev.de/wp-content/uploads/2018/05/Stellungnahme-SG-TA-Digitalisierung-der-VDW_April-2018.pdf (accessed 1 January 2022; German only).
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2022 The Author(s)