It is always too early to assess a technology, until suddenly it is too late.
Martin Buxton (Buxton 1987).
Brain–computer interface (BCI) technologies are used as assistive technologies, enabling patients as well as healthy subjects to control devices solely by brain activity. Yet the risks associated with the misuse of these technologies remain largely unexplored. Recent findings have shown that BCIs are potentially vulnerable to cybercriminality. This opens the prospect of “neurocrime”: extending the range of computer crime to neural devices. This paper explores a type of neurocrime that we call brain-hacking, as it aims at the illicit access to and manipulation of neural information and computation. As neural computation underlies cognition, behavior, and our self-determination as persons, a careful analysis of the emerging risks of malicious brain-hacking is paramount, and ethical safeguards against these risks should be considered early in design and regulation. This contribution aims to raise awareness of the emerging risk of malicious brain-hacking and takes a first step toward developing an ethical and legal reflection on those risks.
The notion of biological information is used in this paper to refer broadly to information expressed in the processes characteristic of living organisms at various levels, i.e. at the levels of molecules, cells, organs, circuits, etc. This definition is in accordance with the statistical definition of information formulated by Claude Shannon and used in mathematical information theory (Shannon 1949). In Shannon’s sense, “anything is a source of information if it has a range of possible states, and one variable carries information about another to the extent that their states are physically correlated”. For a comprehensive understanding of the notion of biological information, see Godfrey-Smith and Sterelny (2007).
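Shannon’s correlational notion quoted above can be made concrete with mutual information: the number of bits one variable carries about another. The following minimal sketch (our illustration, not drawn from the paper) estimates mutual information from observed samples of two binary variables; the sample lists are hypothetical.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Estimate I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                  # joint frequencies
    px = Counter(x for x, _ in pairs)     # marginal frequencies of X
    py = Counter(y for _, y in pairs)     # marginal frequencies of Y
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# Perfectly correlated states: knowing x determines y (1 bit carried).
perfect = [(0, 0), (1, 1)] * 50

# Statistically independent states: x carries no information about y (0 bits).
independent = [(0, 0), (0, 1), (1, 0), (1, 1)] * 25
```

In Shannon’s sense, the first source “carries information” about the other variable precisely because their states are correlated; in the second, the correlation, and hence the information, is zero.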
Deep brain stimulation (DBS) is an invasive neurostimulation technique which involves the neurosurgical implantation of a medical device into the brain. This implanted device sends electrical signals into targeted subcortical areas with the aim of eliciting activity. DBS is an increasingly used therapy for several neurological conditions such as Parkinson’s disease, dystonias, essential tremor, and chronic pain syndromes when patients are not responding to less invasive approaches (Tronnier and Rasche 2015).
Transcranial direct current stimulation is a neuromodulatory intervention which uses constant, low electrical current delivered to the cortical area of interest via small electrodes placed on the scalp with the aim of changing neuronal excitability in that area (Brunoni et al. 2012). This change of neuronal excitability may influence, and in certain cases enhance, cognitive performance for a brief period of time on a number of different cognitive tasks.
See, for example, the following two magazine reviews: (Conner 2010; Strickland 2014). Although concerns expressed by popular media may at times be exaggerated, they still may require appropriate responses by scientists and ethicists, if only to diminish or forestall unrealistic worries amongst the general public.
http://www.nielsen.com/us/en.html (last accessed May 3, 2015).
It is worth noting that there are two potential meanings of input here: (1) the user provides input to the BCI through brain activity; (2) the interface provides information (e.g. a screen with commands) to the user. To disambiguate, in this section we will refer exclusively to the latter, as this type of input is the only one whose hackability has been demonstrated in an experimental setting.
The ambiguous term ‘thinking about’ is defined by the authors as ‘being primed on’. Since the priming effect occurs for many types of stimuli (e.g. words, sounds, and images), the authors assumed that a subject can prime himself by being told to think about an object. See van Vliet et al. (2010, p. 183).
In order to quantify the information leak that the BCI attack provides, the researchers compared the Shannon entropies of guessing the correct answers for the classifiers against the entropy of the random guess attack. The entropy difference directly measures the information leaked by an attack; see Martinovic et al. (2012, p. 11).
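The entropy comparison described here can be sketched in a few lines. The following is our own illustration of the principle, not the authors’ code: the distribution and number of candidates are hypothetical, and the leak is measured as the reduction in Shannon entropy relative to a uniform random-guess attack.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A random-guess attack over, say, 10 candidate answers is uniform:
random_guess = [1 / 10] * 10
h_random = shannon_entropy(random_guess)  # log2(10) ≈ 3.32 bits

# Suppose the BCI-based classifier concentrates probability on a few
# candidates (hypothetical numbers for illustration):
classifier = [0.40, 0.20, 0.10, 0.10, 0.05, 0.05, 0.04, 0.03, 0.02, 0.01]
h_classifier = shannon_entropy(classifier)

# The entropy difference is the information leaked by the attack, in bits.
leak = h_random - h_classifier
```

A positive `leak` means the attack narrows the attacker’s uncertainty about the secret compared with blind guessing, which is exactly the measure Martinovic et al. report.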
It is worth noting that the first strategy (adding noise) is similar to the one discussed in the “Measurement manipulation” section. However, at this level the consequences may differ, as the aim of the intervention here is to delay or complicate the decoding process.
Here too, there is a difference between hacking by disruption and hijacking, as the psychological stress involved in doing something different from what the user intended may differ from the traumatic experience of losing control over oneself.
Allison, B. Z., Wolpaw, E. W., & Wolpaw, J. R. (2007). Brain–computer interface systems: Progress and prospects. Expert Review of Medical Devices, 4(4), 463–474.
Anderson, J. (2013). Autonomy. In The International Encyclopedia of Ethics. Blackwell Publishing Ltd. http://dx.doi.org/10.1002/9781444367072.wbiee716
Beauchamp, T. L., & Childress, J. F. (2001). Principles of biomedical ethics. New York: Oxford University Press.
Bonaci, T., Calo, R., & Chizeck, H. J. (2014). App stores for the brain: Privacy & security in brain–computer interfaces. In IEEE international symposium on ethics in science, technology and engineering, 2014.
Brunoni, A. R., Nitsche, M. A., Bolognini, N., Bikson, M., Wagner, T., Merabet, L., et al. (2012). Clinical research with transcranial direct current stimulation (tDCS): Challenges and future directions. Brain Stimulation, 5(3), 175–195. doi:10.1016/j.brs.2011.03.002.
Buss, S. (2002). Personal autonomy. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Winter 2014 Edition). http://plato.stanford.edu/archives/win2014/entries/personal-autonomy/.
Buxton, M. (1987). Problems in the economic appraisal of new health technology: The evaluation of heart transplants in the UK (pp. 103–118). Oxford, England: Oxford Medical Publications.
Chizeck, H. J., & Bonaci, T. (2014). Brain–computer interface anonymizer. Google Patents.
Clausen, J. (2011). Conceptual and ethical issues with brain–hardware interfaces. Current Opinion in Psychiatry, 24(6), 495–501.
Conner, M. (2010). Hacking the brain: Brain-to-computer interface hardware moves from the realm of research. EDN, 55(22), 30–35.
Denning, T., Matsuoka, Y., & Kohno, T. (2009). Neurosecurity: Security and privacy for neural devices. Neurosurgical Focus, 27(1), E7.
Dupont, B. (2013). Cybersecurity futures: How can we regulate emergent risks? Technology Innovation Management Review, 3(7), 6–11.
Evans, D. (2011). The internet of things: How the next evolution of the internet is changing everything. CISCO white paper, 1.
Fazel-Rezai, R., Allison, B. Z., Guger, C., Sellers, E. W., Kleih, S. C., & Kübler, A. (2012). P300 brain computer interface: Current challenges and emerging trends. Frontiers in Neuroengineering, 5(14), 14.
Fetz, E. E. (2015). Restoring motor function with bidirectional neural interfaces. Progress in Brain Research, 218, 241–252.
Godfrey-Smith, P., & Sterelny, K. (2007). Biological information. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Summer 2016 Edition). http://plato.stanford.edu/archives/sum2016/entries/information-biological/.
Halder, D., & Jaishankar, K. (2011). Cyber crime and the victimization of women: Laws, rights, and regulations. Hershey, PA: IGI Global. ISBN 978-1-60960-830-9.
Halperin, D., Heydt-Benjamin, T. S., Ransford, B., Clark, S. S., Defend, B., Morgan, W., et al. (2008). Pacemakers and implantable cardiac defibrillators: Software radio attacks and zero-power defenses. In IEEE symposium on security and privacy, 2008, SP 2008.
Haselager, P. (2013). Did I do that? Brain–computer interfacing and the sense of agency. Minds and Machines, 23(3), 405–418.
Heisenberg, D. (2005). Negotiating privacy: The European Union, the United States, and personal data protection. Boulder, CO: Lynne Rienner Publishers.
Kotchetkov, I. S., Hwang, B. Y., Appelboom, G., Kellner, C. P., & Connolly, E. S, Jr. (2010). Brain–computer interfaces: Military, neurosurgical, and ethical perspective. Neurosurgical Focus, 28(5), E25.
Li, Q., Ding, D., & Conti, M. (2015). Brain–computer interface applications: Security and privacy challenges. In IEEE conference on communications and network security (CNS), 2015.
Martinovic, I., Davies, D., Frank, M., Perito, D., Ros, T., & Song, D. (2012). On the feasibility of side-channel attacks with brain–computer interfaces. In USENIX security symposium.
Mill, J. S. (1869). On liberty. London: Longmans, Green, Reader, and Dyer.
Miranda, R. A., Casebeer, W. D., Hein, A. M., Judy, J. W., Krotkov, E. P., Laabs, T. L., et al. (2015). DARPA-funded efforts in the development of novel brain–computer interface technologies. Journal of Neuroscience Methods, 244, 52–67.
Powell, C., Munetomo, M., Schlueter, M., & Mizukoshi, M. (2013). Towards thought control of next-generation wearable computing devices. In K. Imamura, S. Usui, T. Shirao, T. Kasamatsu, L. Schwabe & N. Zhong (Eds.), Brain and Health Informatics (pp. 427–438). Springer.
Pustovit, S. V., & Williams, E. D. (2010). Philosophical aspects of dual use technologies. Science and Engineering Ethics, 16(1), 17–31.
Rosenfeld, J. P. (2011). P300 in detecting concealed information. In Verschuere, B., Ben-Shakhar, G., & Meijer, E. (Eds.), Memory detection: Theory and application of the concealed information test (pp. 63–89). Cambridge University Press.
Rosenfeld, J. P., Biroschak, J. R., & Furedy, J. J. (2006). P300-based detection of concealed autobiographical versus incidentally acquired information in target and non-target paradigms. International Journal of Psychophysiology, 60(3), 251–259.
Shannon, C. (1949). The mathematical theory of communication. In The mathematical theory of communication (pp. 1–93). Urbana: University of Illinois Press.
Strickland, E. (2014). Brain hacking: Self-experimenters are zapping their heads. IEEE Spectrum, 51(5), 23–25. doi:10.1109/mspec.2014.6808452.
Tronnier, V. M., & Rasche, D. (2015). Deep brain stimulation. In Textbook of Neuromodulation (pp. 61–72). New York: Springer.
Vallabhaneni, A., Wang, T., & He, B. (2005). Brain–computer interface. In Neural Engineering (pp. 85–121). New York: Springer.
van Gerven, M., Farquhar, J., Schaefer, R., Vlek, R., Geuze, J., Nijholt, A., & Gielen, S. (2009). The brain–computer interface cycle. Journal of Neural Engineering, 6(4), 041001.
van Vliet, M., Mühl, C., Reuderink, B., & Poel, M. (2010). Guessing what’s on your mind: Using the N400 in brain–computer interfaces. In Y. Yao, R. Sun, T. Poggio, J. Liu, N. Zhong, & J. Huang (Eds.), Brain Informatics (pp. 180–191). Berlin Heidelberg: Springer.
Varelius, J. (2006). The value of autonomy in medical ethics. Medicine, Health Care and Philosophy, 9(3), 377–388.
Wechsler, H. (1968). Codification of criminal law in the United States: The model penal code. Columbia Law Review, 68(8), 1425–1456.
Westby, J. R. (2004). International guide to privacy. American Bar Association, Privacy & Computer Crime Committee, and American Bar Association, Section of Science & Technology Law.
Yuan, B. J., Hsieh, C.-H., & Chang, C.-C. (2010). National technology foresight research: A literature review from 1984 to 2005. International Journal of Foresight and Innovation Policy, 6(1), 5–35.
This project was partly supported by the Erasmus Mundus Scholarship (European Commission).
Conflict of interest
The authors declare that they have no competing interests.
Ienca, M., Haselager, P. Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity. Ethics Inf Technol 18, 117–129 (2016). https://doi.org/10.1007/s10676-016-9398-9