Abstract
The exponential growth of emerging technologies opens up new opportunities for knowledge and its application to human beings, while forcing us at the same time to rethink some traditional ethical and legal categories, such as freedom, responsibility, conscience, will and intention. The chapter focuses on neuroscience and neurotechnologies, gene-editing and genome-wide tests, the new paradigm of 4P medicine (prediction, precision, personalization, participation), citizen science, and the use of information and communication technologies, big data, mobile health and biometrics in health and healthcare, underlining the main ethical and legal challenges.
Notes
- 1.
- 2.
- 3.
Cf. Levy (2007).
- 4.
- 5.
In this sense, see the criticism of Clarke (1999), pp. 279–293. It should be noted, incidentally, that the denial of temporal-causal isomorphism highlighted by Libet’s experiments can also be understood as a denial of reductionism: the rules of the mind cannot simply (sic et simpliciter) be identified with those of the brain.
- 6.
- 7.
- 8.
- 9.
Asilomar (1975)
- 10.
Baltimore et al. (2015), pp. 36–38.
- 11.
Lanphier et al. (2015).
- 12.
The document Human Genome Editing: Science, Ethics and Governance addresses different aspects of the application of gene-editing to human subjects, from laboratory experiments on somatic cells, germ cells and embryos to possible clinical trials in adults. Committee on Human Gene Editing (2017).
- 13.
Kaplan and Roy (2002).
- 14.
Cf. Citi GPS Global Perspectives and Solutions (2016), p. 935.
- 15.
The text (p. 60) states: “Important scientific and clinical issues relevant to human fertility and reproduction require continued laboratory research on human gametes and their progenitors, human embryos and pluripotent stem cells. This research is necessary for medical and scientific purposes that are not directed at heritable genome editing, though it will also provide valuable information and techniques that could be applied if heritable genome editing were to be attempted in the future”.
- 16.
- 17.
- 18.
- 19.
Van El et al. (2013).
- 20.
Knoppers (2014), pp. 6–10.
- 21.
Christenhusz (2013), pp. 248–255; Hehir-Kwa et al. (2015), pp. 1601–1606; Hellenic National Bioethics Commission (2015). In the Meeting Report it is stressed that the expression “incidental findings” covers (1) unexpected positive findings, but also (2) the intentional search for pathogenic variants not associated with the primary diagnostic query. Since alternative terms such as “unexpected”, “secondary” or “unsolicited” findings are considered just as problematic, the report advises keeping to the most common usage, ‘incidental findings’.
- 22.
Presidential Commission for the Study of Bioethical Issues (2012, 2013). The Commission distinguishes between “primary”, “secondary” and “discovery” findings: “primary findings” are results that are actively sought, using a test or procedure designed to find them; “secondary findings” are results actively sought by a professional but which are not the primary target; “discovery findings” are the results of broad tests aimed at detecting any potentially interesting data.
- 23.
UNESCO, International Bioethics Committee (2015).
- 24.
- 25.
- 26.
National Research Council (2011).
- 27.
Commission staff working document (2013), p. 436.
- 28.
President Obama’s State of the Union Address, January 20th, 2015.
- 29.
- 30.
Bayer and Galea (2015), pp. 499–501.
- 31.
European Group on Ethics in Science and New Technologies (2015).
- 32.
- 33.
PatientsLikeMe is a platform for sharing information and disease experiences; it enables patients to connect with other patients with the same illness and encourages them to share data and information. Members (more than 300,000) may choose different privacy settings, which may be changed over time: shared data are accessible to third parties; non-shared data are not. The website reports aggregated data on symptoms and treatments that may be useful to patients. It is founded on an ‘openness philosophy’.
- 34.
PGP website: http://www.personalgenomes.org/.
- 35.
MIT Technology Review (2013).
- 36.
- 37.
- 38.
- 39.
On national sites (health ministries, scientific societies, universities) or international sites (WHO, EMEA, NIH, Medline, FDA), as well as pharmaceutical industry sites and patient association sites.
- 40.
Cf. Müller et al. (2016), pp. 172–177.
- 41.
- 42.
There is no consolidated literature on the topic. There are, however, documents and opinions of international and national ethics committees which may contribute to the development of an ethical framework for this analysis. Among the main documents, it is worthwhile recalling: European Group on Ethics in Science and New Technologies (2015) and Opinion 7/2015 from the European Data Protection Supervisor on Meeting the Challenge of Big Data; UNESCO International Bioethics Committee (2017), OECD (2013, 2017), Italian Committee for Bioethics (2016) and Nuffield Council on Bioethics (2015).
- 43.
Raghupathil and Raghupathi (2014). There is a discussion on what forms of regulation could provide for data quality monitoring (collection, storage, analysis) as a requirement of data use, with flexible and updatable tools (such as codes of conduct) ensuring control over the competence and correctness of operators (clinicians and analysts such as engineers, statisticians and bioinformaticians) and correct interactions among them. Proposals include ‘soft regulation’, such as an updated code of practice for clinicians and other professionals involved in collecting health-related data. Others aim to foster interdisciplinarity between clinicians/researchers and engineers, working together to translate and extend existing advanced data-analysis technology (including, on the one hand, the clinically trained human mind) into targeted big-data analytical approaches that achieve clinically effective outputs. Although engineers and clinicians have long collaborated successfully, development work on “Big Data Healthcare” will particularly require mutual understanding between the two disciplinary cultures, which in turn will require further cultural development in both areas.
- 44.
The information requested is of a heterogeneous nature. With specific regard to health-related data, consideration should also be given to the fact that the boundaries between the strictly medical and non-medical spheres are becoming increasingly blurred, like those between health and society; information on lifestyles and behaviours tends to become increasingly relevant to health, even from the perspective of prevention. In this sense, health information is not only the outcome of laboratory tests or epidemiological data, but also the general information that comes from social networks.
- 45.
Bock (2016), p. 9.
- 46.
Wyber et al. (2015), pp. 203–208.
- 47.
WHO (2012), p. 27. Among the most significant documents on the subject adopted by the Council of Europe: Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data, Council of Europe, 1981 and additional protocol (Additional Protocol to Convention ETS No. 108 on Supervisory Authorities and Transborder Data Flows); Recommendation CM/Rec (2010)13 of the Committee of Ministers to Member States on the Protection of Individuals with Regard to Automatic Processing of Personal Data in the Context of Profiling (23 November 2010); Recommendation CM/Rec (2012)4 on the Protection of Human Rights with Regard to Social Networking Services; Recommendation CM/Rec (2014)6 on Human Rights for Internet Users; Recommendation CM/Rec (2016)1 on Protecting and Promoting the Right to Freedom of Expression and the Right to Private Life with Regard to Network Neutrality.
- 48.
Hoeren (2014), pp. 751–754.
- 49.
mHealth is defined as the delivery of healthcare services via mobile communication devices (2010 mHealth Summit, FNIH); the definition applies to both software and hardware: “healthcare delivered wirelessly”.
- 50.
- 51.
Examples of medical applications for health: ‘Welp’, to detect the paralysis that often characterizes seizures; applications for preventing falls in older people; applications to identify the first signs of Parkinson’s disease; applications to manage certain diseases (HIV, diabetes, chronic conditions).
- 52.
On these issues, there is already a wide-ranging debate: McCartney (2013), p. 181; Buijink et al. (2013), pp. 90–92; Haffey et al. (2013), pp. 111–117; Wolf et al. (2013), pp. 422–426. A problem that remains open at the practical level is that the exponential rate at which new apps are designed makes risk/benefit evaluation difficult. Medical apps run on different operating systems, so it is not possible to test them all for safety.
- 53.
E.g. applications for asthma or food diaries; ‘my recovery’, applied to managing preparation for operations, post-surgery care and rehabilitation; games used to control panic attacks; home testing for HIV, sexually transmitted diseases and streptococcus.
- 54.
E.g. if step-count data are collected only once, they will reasonably not support any inference: they are not data in a medical context, are not correlated with other data, and are therefore not relevant for research. But if they are systematically collected and combined with other data (e.g. gender, age, habits), they can become important for research.
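The point can be made concrete with a minimal sketch (not from the chapter; the names and numbers are invented for illustration): a single step-count reading supports no inference, but systematically collected readings linked with other personal attributes allow group-level conclusions to emerge.

```python
# Hypothetical sketch: when does step-count data become research-relevant?
from statistics import mean

# One isolated reading: no context, no linkage, nothing to infer.
single_reading = 7432

# Systematic collection, linked with other attributes (age, gender).
records = [
    {"user": "u1", "age": 70, "gender": "F", "daily_steps": [2100, 1900, 2300]},
    {"user": "u2", "age": 68, "gender": "M", "daily_steps": [2500, 2400, 2600]},
    {"user": "u3", "age": 25, "gender": "F", "daily_steps": [9800, 10100, 9500]},
]

def group_average(records, predicate):
    """Average daily steps over users matching a predicate: a simple
    group-level inference that only linked, repeated data make possible."""
    selected = [mean(r["daily_steps"]) for r in records if predicate(r)]
    return mean(selected) if selected else None

older = group_average(records, lambda r: r["age"] >= 65)
younger = group_average(records, lambda r: r["age"] < 65)
print(older < younger)  # True: an activity gap visible only in combined data
```

The same linkage that makes such data valuable for research is what turns a seemingly innocuous fitness metric into health-relevant, and hence privacy-sensitive, information.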
- 55.
This is the line of thought based on the interpretation of para. 107 of the Explanatory Memorandum to Recommendation No. R (97) 5 on the Protection of Medical Data: “But even in cases where his/her consent is not required—that is, when the collection and processing of medical data follow an obligation under the law or under a contract, are provided for or authorised by law, or when the consent requirement is dispensed with—the recommendation provides that the data subject is entitled to relevant information”. The Article 29 Working Party has also recently published a document on apps on smart devices, which emphasizes the need to inform users clearly and unambiguously about how their data are used (data type, purpose, period) before installation of the app. The right to be informed is also expressed in art. 10 of Directive 95/46/EC and art. 5.3 of the ePrivacy Directive 2002/58/EC.
- 56.
- 57.
Cf. Italian Committee for Bioethics (2014).
- 58.
See Italian Committee for Bioethics (2015).
- 59.
Identification (“who is this person?”) is the determination of a subject’s identity by comparing a measured biometric against a database of records (a one-to-many comparison); verification (“is this person who he claims to be?”) corresponds to a one-to-one comparison between a measured biometric and the record of a particular person. All biometrics can be used for verification; only some may be used for identification.
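The one-to-one versus one-to-many distinction can be sketched in a few lines of code (an illustrative toy, not from the chapter; the similarity function, threshold and feature vectors are all invented for the example):

```python
# Toy sketch of biometric verification (1:1) vs identification (1:N).

def similarity(a, b):
    """Toy similarity score: inverse of Euclidean distance (1.0 = identical)."""
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + dist)

THRESHOLD = 0.8  # assumed acceptance threshold, chosen arbitrarily here

def verify(measured, claimed_id, database):
    """Verification ('is this person who he claims to be?'):
    one-to-one comparison against the single enrolled record."""
    return similarity(measured, database[claimed_id]) >= THRESHOLD

def identify(measured, database):
    """Identification ('who is this person?'):
    one-to-many comparison against every enrolled record."""
    best_id, best_score = None, 0.0
    for subject_id, template in database.items():
        score = similarity(measured, template)
        if score > best_score:
            best_id, best_score = subject_id, score
    return best_id if best_score >= THRESHOLD else None

db = {"alice": [0.1, 0.2, 0.3], "bob": [0.9, 0.8, 0.7]}
sample = [0.12, 0.21, 0.29]
print(verify(sample, "alice", db))  # True: matches the claimed identity
print(identify(sample, db))         # alice: best match above threshold
```

The sketch also shows why only some biometrics support identification: a 1:N search presupposes templates distinctive enough to single out one record in a large database, whereas 1:1 verification only has to confirm or reject a single claimed match.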
- 60.
- 61.
DNA analysis does not allow authentication in real time, unlike other biometric technologies. However, the temporal criterion is not part of the definition of biometric technologies and therefore does not prevent DNA analysis from being included among them.
- 62.
References
Araki, M., & Ishii, T. (2014). International regulatory landscape and integration of corrective genome editing into in vitro fertilization. Reproductive Biology and Endocrinology, 12, 108.
Asilomar. (1975). Asilomar Conference on Recombinant DNA.
Baltimore, D., Berg, P., Botchan, M., Carroll, D., Charo, R. A., Church, G., et al. (2015). A prudent path forward for genomic engineering and germline gene modification. Science, 348(6230), 36–38.
Bayer, R., & Galea, S. (2015). Public health in the precision-medicine era. New England Journal of Medicine, 373(6), 499–501.
Bert, F., Giacometti, M., Gualano, M. R., & Siliquini, R. (2014). Smartphones and health promotion: A review of the evidence. Journal of Medical Systems, 38(1), 9995.
Bird, S. J. (2005). Neuroethics. In C. Mitcham (Ed.), Encyclopedia of science, technology and ethics (Vol. III, p. 1310). New York: Thomson Gale.
Bock, C. (2016). Preserve personal freedom in networked societies. Broad anti-discrimination laws and practices could compensate for failing data protection and technology-linked loss of privacy. Nature, 537(7618), 9.
Bowker, G. C. (2014). Big Data, Big Questions. The Theory/Data thing. International Journal of Communication, 8, 1795–1799. Los Angeles, USC Annenberg Press.
Buijink, A. W., Visser, B. J., & Marshall, L. (2013). Medical apps for smartphones: Lack of evidence undermines quality and safety. Evidence-Based Medicine, 18(3), 90–92.
Christenhusz, G. M. (2013). To tell or not to tell? A systematic review of ethical reflections on incidental findings arising in genetics contexts. European Journal of Human Genetics, 21, 248–255.
Citi GPS Global Perspectives and Solutions. (2016). Technology at work v2.0. The future is not what it used to be, January 2016.
Clarke, T. W. (1999). Fear of mechanism: A compatibilist critique of “The Volitional Brain”. Journal of Consciousness Studies, 6, 279–293.
Cohen, G., & Lynch, H. F. (2018). In E. Vayena & U. Gasser (Eds.), Big data, health law, and bioethics. Cambridge: Cambridge University Press.
Collins, F. S., & Varmus, H. (2015). A new initiative on precision medicine. The New England Journal of Medicine, 372(9), 793–795.
Commission staff working document. (2013). Use of ‘-omics’ technologies in the development of personalised medicine, Brussels, 25 October 2013, SWD(2013) 436.
Committee on Human Gene Editing. (2017). Human genome editing: Science, ethics, and governance. Scientific, medical, and ethical considerations. A report of the National Academy of Sciences and the National Academy of Medicine. Washington, D.C.: The National Academies Press.
Coote, J. H., & Joyner, M. J. (2015). Is precision medicine the route to a healthy world? Lancet, 385, 1617.
Damasio, A. R. (2005). Descartes’ error: Emotion, reason, and the human brain. New York: Avon Books.
Damasio, A. R. (2007). Neuroscience and ethics: Intersections. American Journal of Bioethics, 7, 3–7.
Davis, A. (1997). The body as password. Wired, issue 5.07, July 1997.
De Caro, M., & MacArthur, D. (Eds.). (2004). Naturalism in question. Cambridge: Harvard University Press.
European Group of Ethics in Science and New Technologies. (2012). Ethics of information and communication technologies.
European Group on Ethics in Science and New Technologies. (2014). Ethics of security and surveillance technologies.
European Group on Ethics in Science and New Technologies. (2015). The ethical implications of new health technologies and citizen participation.
European Group on Ethics in Science and New Technologies (EGE). (1999). Ethical issues of healthcare in the information society.
Farah, M. J. (2002). Emergent ethical issues in neuroscience. Nature Neuroscience, 5, 1123–1129.
Farah, M. J. (2007). Social, legal and ethical implications of cognitive neuroscience: “Neuroethics” for short. Journal of Cognitive Neuroscience, 19, 363–364.
Farah, M. J., & Heberlein, A. S. (2007). Personhood and neuroscience: Naturalizing or nihilating. The American Journal of Bioethics, 71, 37–48.
Flores, M., Glusman, G., Brogaard, K., Price, N. D., & Hood, L. (2013). P4 medicine: How systems medicine will transform the healthcare sector and society. Personalized Medicine, 10(6), 565–576.
Frost, J., Okun, S., Vaughan, T., Heywood, J., & Wicks, P. (2011). Patient-reported outcomes as a source of evidence in off-label prescribing: Analysis of data from PatientsLikeMe. Journal of Medical Internet Research, 13, e6.
Garland, G. (Ed.). (2004). Neuroscience and the law: Brain, mind and the scales of justice. Chicago: University of Chicago Press.
Gazzaniga, M. (2008). The law and neuroscience. Neuron, 60, 412–415.
Giacometti, M., Gualano, M. R., Bert, F., & Siliquini, R. (2013). Public health accessible to all: Use of smartphones in the context of healthcare in Italy. Igiene e sanita pubblica, 69(2), 249–259.
Greene, J., & Cohen, J. (2004). For the law, neuroscience changes nothing and everything. Philosophical Transactions of the Royal Society of London, 359, 1775–1785.
Haffey, F., Brady, R. R., & Maxwell, S. (2013). A comparison of the reliability of smartphone apps for opioid conversion. Drug Safety, 36(2), 111–117.
Hehir-Kwa, J. K., Claustres, M., Hastings, R., Van Ravenswaaij-arts, C., Christenhusz, G., Genuardi, M., et al. (2015). Meeting report. Towards a European consensus for reporting incidental findings during clinical NGS testing. European Journal of Human Genetics, 23, 1601–1606.
Hellenic National Bioethics Commission. (2015). Incidental findings in research and clinical practice.
Hoeren, T. (2014). Big data and the ownership in data: Recent developments in Europe. European Intellectual Property Review, 36(12), 751–754. Sweet & Maxwell, London.
Hood, L., & Flores, M. (2012). A personal view on systems medicine and the emergence of proactive P4 medicine: Predictive, preventive, personalized and participatory. New Biotechnology, 29(6), 613–624.
Illes, J. (Ed.). (2006). Neuroethics: Defining the issues in theory, practice, and policy. New York: Oxford University Press.
Italian Committee for Bioethics. (1999). Bioethical guidelines for genetic testing.
Italian Committee for Bioethics. (2006). Ethics, health and new information technologies, 2006.
Italian Committee for Bioethics. (2014). Lifestyles and health protection.
Italian Committee for Bioethics. (2015). Mobile health apps: Bioethical aspects.
Italian Committee for Bioethics. (2016). ICT and big data: Bioethical issues.
Italian Committee for Bioethics. (2017a). Ethical issues in gene-editing and CrisprCas9 technique.
Italian Committee for Bioethics. (2017b). Managing “incidental findings” in genomic investigations in new genomic platforms.
Jain, A. K., Bolle, R., & Pankanti, S. (1998). Biometrics: Personal identification in networked society. Dordrecht: Kluwer Academic Publisher Group.
Jasanoff, S., Hurlbut, J. B., & Saha, K. (2015). CRISPR democracy: Gene editing and the need for inclusive deliberation. Issues in Science and Technology, 32(1), 37.
Joyner, M. J., & Paneth, M. (2015). Seven questions for personalised medicine. JAMA, 314(10), 999–1000.
Kaplan, M., & Roy, I. (2002). Accidental germ-line modification through somatic cell gene therapy. American Journal of Bioethics, 2, 1.
Kaye, J., Curren, L., Anderson, N., Edwards, K., Fullerton, S. M., Kanellopoulou, N., et al. (2012). From patients to partners: Participant-centric initiatives in biomedical research. Nature Reviews Genetics, 13, 371–376.
Knoppers, B. M. (2014). Introduction from the right to know to the right not to know. The Journal of Law, Medicine & Ethics, 42, 6–10. Spring 2014, Special Issue: Symposium: The Right Not to Know.
Lanphier, E., Urnov, F., Haecker, S. E., Werner, M., & Smolenski, J. (2015). Don’t edit the human germ line. Nature, 519(7544), 410.
Levy, N. (2007). Neuroethics: Challenges to the 21st century. Cambridge: Cambridge University Press.
Libet, B. (1985). Unconscious cerebral initiative and the role of conscious will in voluntary action. The Behavioral and Brain Sciences, 8, 529–566.
Libet, B. (1999). Do we have free will? Journal of Consciousness Studies, 6(8–9), 47–57.
Libet, B., Gleason, C., Wright, E., & Pearl, D. (1983). Time of unconscious intention to act in relation to onset of cerebral activity (readiness-potential). Brain, 106, 623–642.
Mantovani, E., Quinn, P., Guihen, B., Habbig, A., & Hert, P. (2013). eHealth to mHealth – A journey precariously dependent upon apps? European Journal of ePractice, 20, 48–66.
Mayer-Schönberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work and think. London: John Murray Publisher.
McCartney, M. (2013). How do we know whether medical apps work? British Medical Journal, 346, 181.
MIT Technology Review. (2013). Participants in personal genome project identified by privacy experts, 1st May 2013.
Mittelstadt, B. D., & Floridi, L. (2016). The ethics of big data: Current and foreseeable issues in biomedical contexts. Science and Engineering Ethics, 22, 303–341. Springer, New York.
Mordini, E., & Petrini, C. (2007). Ethical and social implications of biometric identification technologies. Annali Istituto Superiore di Sanità, 43(1), 5–11.
Mueller, M., Tippins, D., & Bryan, L. (2013). The future of citizen science. Democracy and Education, 10, 1.
Müller, K. W., Dreier, M., Beutel, M. E., Duven, E., Giralt, S., & Wölfling, K. (2016). A hidden type of internet addiction? Intense and addictive use of social networking sites in adolescents. Computers in Human Behavior, 55, 172–177.
National Research Council. (2011). Toward precision medicine: Building a knowledge network for biomedical research and a new taxonomy of disease. Washington DC: National Academies Press.
Nuffield Council on Bioethics. (2015). The collection, linking and use of data in biomedical research and healthcare: Ethical issues.
Nuffield Council on Bioethics. (2016). Public dialogue on genome editing. Why? When? Who? Report of a Workshop on Public Dialogue for Genome-editing.
Nuffield Council on Bioethics. (2018). Genome editing and human reproduction: Social and ethical issues.
OECD. (2013). Strengthening health information infrastructure for health care quality governance: Good practices, new opportunities and data protection challenges.
OECD. (2017). Recommendation on health data governance.
Parker, M. (2012). Ethical considerations related to mobile technology use in medical research. Journal of Mobile Technology in Medicine, 1(3), 50–52.
Pockett, S. (2004). Does consciousness cause behaviour? Journal of Consciousness Studies, 11, 23–40.
Prainsack, B. (2014). Understanding participation: The “Citizen Science” of genetics. In B. Prainsack, G. Werner-Felmayer, & G. Schicktanz (Eds.), Genetics as social practice. Farnham: Ashgate.
Prainsack, B. (2018). Personalized medicine. Empowered patients in the 21st century? New York: New York University Press.
Presidential Commission for the Study of Bioethical Issues. (2012). Privacy and progress in whole genome sequencing.
Presidential Commission for the Study of Bioethical Issues. (2013). Anticipate and communicate: Ethical management of incidental and secondary findings in the clinical, research, and direct-to-consumer contexts.
Raghupathil, W., & Raghupathi, V. (2014). Big data analytics in healthcare: Promise and potential. Health Information Science and Systems, 2, 3.
Seife, C. (2013). 23andMe is terrifying, but not for the reasons the FDA thinks. Scientific American. 27 November 2013.
Shringarpure, S. S., & Bustamante, C. D. (2015). Privacy risks from genomic data-sharing beacons. American Journal of Human Genetics, 97, 1–18.
Silberman, M. J., & Clark, L. (2012). M-Health: The union of technology and healthcare regulations. Phoenix: Greenbranch Publishing.
Spence, S. (1996). Free will in the light of neuropsychiatry. Philosophy, Psychiatry, and Psychology, 3, 75–90.
Tancredi, L. R. (2005). Hardwired behavior: What neuroscience reveals about morality. New York: Cambridge University Press.
UNESCO. (2011). Code of conduct for the information society.
UNESCO, International Bioethics Committee. (2015). Report on updating its reflection on the human genome and human rights.
UNESCO, International Bioethics Committee. (2017). Big data and health.
Van El, C. G., Cornel, M. C., Borry, P., Hastings, R. J., Fellmann, F., Hodgson, S. V., et al. (2013). Whole-genome sequencing in health care. Recommendations of the European Society of Human Genetics. European Journal of Human Genetics, 21(Suppl. 1), S1–S5.
Vayena, E., & Tasioulas, J. (2015). “We the Scientists”: A human right to citizen science. Philosophy and Technology, 28, 479–485.
Vincent, N. O. (2010). On the relevance of neuroscience to criminal responsibility. Criminal Law and Philosophy, 4, 77–98.
Weber, J. C. (2014). Personalized medicine, promises and expectations, Lettre du CEERE (Centre Européen d’Enseignement et de Recherche de l’Université de Strasbourg), (n. 80), décembre 2014, (pp. 2–3).
WHO. (2012). Legal frameworks for eHealth, based on the findings of the second global survey on eHealth. Global Observatory for eHealth series, 5, 27.
Wolf, J. A., Moreau, J. F., Akilov, O., Patton, T., Inglese, J. C., Ho, J., et al. (2013). Diagnostic inaccuracy of smartphone applications for melanoma detection. JAMA Dermatology, 149(4), 422–426.
Wolpe, S. (2004). Neuroethics. In S. G. Post (Ed.), Encyclopedia of bioethics (Vol. IV, pp. 1894–1898). New York: Thomson Gale.
Wyber, R., Vaillancourt, S., Perry, W., Mannava, P., Folaranmi, T., & Celi, L. A. (2015). Big data in global health: Improving health in low- and middle-income countries. Bulletin of the World Health Organization, 93, 203–208. WHO, Geneva.
Copyright information
© 2019 Springer Nature Switzerland AG and G. Giappichelli Editore
Cite this chapter
Palazzani, L. (2019). Emerging Technologies and Health. In: Innovation in Scientific Research and Emerging Technologies. Springer, Cham. https://doi.org/10.1007/978-3-030-16733-2_3
Print ISBN: 978-3-030-16732-5
Online ISBN: 978-3-030-16733-2
eBook Packages: Law and Criminology (R0)