Brain–Computer Interface (BCI) research is an interdisciplinary area of study within Neural Engineering. Recent interest in end-user perspectives has led to an intersection with user-centered design (UCD). The goal of user-centered design is to reduce the translational gap between researchers and potential end users. However, while qualitative studies have been conducted with end users of BCI technology, little is known about individual BCI researchers’ experience with and attitudes towards UCD. Given the scientific, financial, and ethical imperatives of UCD, we sought to gain a better understanding of practical and principled considerations for researchers who engage with end users. We conducted a qualitative interview case study with neural engineering researchers at a center dedicated to the creation of BCIs. Our analysis generated five themes common across interviews. The thematic analysis shows that participants identify multiple beneficiaries of their work, including other researchers, clinicians working with devices, device end users, and families and caregivers of device users. Participants value experience with device end users, and personal experience is the most meaningful type of interaction. They welcome (or even encourage) end-user input, but are skeptical of limited focus groups and case studies. They also recognize a tension between creating sophisticated devices and developing technology that will meet user needs. Finally, interviewees espouse functional, assistive goals for their technology, but describe uncertainty in what degree of function is “good enough” for individual end users. Based on these results, we offer preliminary recommendations for conducting future UCD studies in BCI and neural engineering.
The “Matching Person and Technology” (MPT) model is similar to UCD, although MPT focuses on matching individual persons with disabilities to existing assistive technology (Scherer 2002; Scherer et al. 2005), whereas UCD works to incorporate persons with disabilities into the technology design process.
The distinction between practical and principled considerations captures the difference between ethical reasons that focus on instrumental outcomes (e.g., benefit or well-being) and those that focus on intrinsic duties and obligations.
“Bidirectional” BCIs both record from and stimulate the central nervous system (as opposed to stimulation alone, as in the case of a deep brain stimulator).
While researchers occasionally used the term “end users” to refer to clinicians and other researchers, in the remainder of the article we use this term to indicate persons with disabilities who are potential users of the BCI devices that the CSNE aims to create.
For example, see: https://braindanceenglish.wordpress.com/about-us/.
Birbaumer, N., Ghanayim, N., Hinterberger, T., Iversen, I., Kotchoubey, B., Kübler, A., Perelmouter, J., Taub, E., & Flor, H. (1999). A spelling device for the paralysed. Nature, 398, 297–298.
Blain-Moraes, S., Schaff, R., Gruis, K. L., Huggins, J. E., & Wren, P. A. (2012). Barriers to and mediators of brain–computer interface user acceptance: Focus group findings. Ergonomics, 55(5), 516–525.
Charlton, J. (1998). Nothing about us without us: Disability oppression and empowerment. California: University of California Press.
Chau, P. Y. K., & Tam, K. Y. (2000). Organizational adoption of open systems: A ‘technology-push, need-pull’ perspective. Information and Management, 37, 229–239.
Collinger, J. L., Boninger, M. L., Bruns, T. M., Curley, K., Wang, W., & Weber, D. J. (2013). Functional priorities, assistive technology, and brain computer interfaces after spinal cord injury. Journal of Rehabilitation Research and Development, 50(2), 145–160.
Corbin, J. M., & Strauss, A. L. (2015). Basics of qualitative research: Techniques and procedures for developing grounded theory. Los Angeles: Sage.
Grübler, G., Al-Khodairy, A., Leeb, R., Pisotta, I., Riccio, A., Rohm, M., & Hildt, E. (2014). Psychosocial and ethical aspects in non-invasive EEG-based BCI research—A survey among BCI users and BCI professionals. Neuroethics, 7, 29–41.
Hochberg, L., & Anderson, K. (2012). BCI users and their needs. In J. R. Wolpaw & E. W. Wolpaw (Eds.), Brain–computer interfaces (pp. 317–323). New York: Oxford University Press.
Hochberg, L. R., Serruya, M. D., Friehs, G. M., Mukand, J. A., Saleh, M., Caplan, A. H., Branner, A., Chen, D., Penn, R. D., & Donoghue, J. P. (2006). Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature, 442, 164–172.
Holz, E. M., Kaufmann, T., Desideri, L., Malavasi, M., Hoogerwerf, E.-J., & Kübler, A. (2012). User centred design in BCI development. In B. Allison, S. Dunne, R. Leeb, J. D. R. Millan & A. Nijholt (Eds.), Towards practical brain–computer interfaces (pp. 155–172). Berlin: Springer.
Huggins, J. E., Wren, P. A., & Gruis, K. L. (2011). What would brain–computer interface users want? Opinions and priorities of potential users with amyotrophic lateral sclerosis. Amyotrophic Lateral Sclerosis, 12(5), 318–324.
ISO 9241-210. (2008). Ergonomics of human–system interaction—Part 210: Human-centred design for interactive systems (formerly known as ISO 13407). International Organization for Standardization (ISO), Switzerland.
Kübler, A., Mattia, D., Rupp, R., & Tangermann, M. (2013). Editorial: Facing the challenge: Bringing brain-computer interfaces to end users. Artificial Intelligence in Medicine, 59, 55–60.
Kübler, A., Müller-Putz, G., & Mattia, D. (2015). User-centred design in brain-computer interface research and development. Annals of Physical Rehabilitation and Medicine, 58(5), 312–314.
Kübler, A., Holz, E. M., Riccio, A., Zickler, C., Kaufmann, T., Kleih, S. C., Staiger-Salzer, P., Desideri, L., Hoogerwerf, E. J., & Mattia, D. (2014). The user-centered design as novel perspective for evaluating the usability of BCI-controlled applications. PLoS ONE. https://doi.org/10.1371/journal.pone.0112392.
Liberati, G., Pizzimenti, A., Simione, L., Riccio, A., Schettini, F., Inghilleri, M., Mattia, D., & Cincotti, F. (2015). Developing brain–computer interfaces from a user-centered perspective: Assessing the needs of persons with amyotrophic lateral sclerosis, caregivers, and professionals. Applied Ergonomics, 50, 139–146.
Lotte, F., Larrue, F., & Mühl, C. (2013). Flaws in current human training protocols for spontaneous brain-computer interfaces: Lessons learned from instructional design. Frontiers in Human Neuroscience, 7, 568.
McCullagh, P., Lightbody, G., Zygierewicz, J., & Kernohan, W. G. (2014). Ethical challenges associated with the development and deployment of brain computer interface technology. Neuroethics, 7, 109–122.
Murphy, M. D., Guggenmos, D. J., Bundy, D. T., & Nudo, R. J. (2016). Current challenges facing the translation of brain computer interfaces from preclinical trials to use in human patients. Frontiers in Cellular Neuroscience. https://doi.org/10.3389/fncel.2015.00497.
Nijboer, F. (2015). Technology transfer of brain–computer interfaces as assistive technology: Barriers and opportunities. Annals of Physical and Rehabilitation Medicine, 58, 35–38.
Nijboer, F., Clausen, J., Allison, B. Z., & Haselager, P. (2013). The Asilomar survey: Stakeholders’ opinions on ethical issues related to brain–computer interfacing. Neuroethics, 6, 541–578.
Powers, J. C., Bieliaieva, K., Wu, S., & Nam, C. S. (2015). The human factors and ergonomics of P300-based brain–computer interfaces. Brain Sciences, 5, 318–356.
Rao, R. (2013). Brain computer interfacing: An introduction. New York: Cambridge University Press.
Scherer, M. J. (2002). The change in emphasis from people to person: Introduction to the special issue on assistive technology. Disability and Rehabilitation, 24(1), 1–4.
Scherer, M. J., Sax, C., Vanbiervliet, A., Cushman, L. A., & Scherer, J. V. (2005). Predictors of assistive technology use: The importance of personal and psychosocial factors. Disability and Rehabilitation, 27(21), 1321–1331.
Schicktanz, S., Amelung, T., Rieger, J. W. (2015). Qualitative assessment of patients’ attitudes and expectations toward BCIs and implications for future technology development. Frontiers in Systems Neuroscience, 9.
Schon, D. (1967). Technology and social change. New York: Delacorte.
Shih, J., Krusienski, D. J., & Wolpaw, J. R. (2012). Brain–computer interfaces in medicine. Mayo Clinic Proceedings, 87(3), 268–279.
Silvers, A. (2010). Better than new! Ethics for assistive technologists. In M. M. K. Oishi, I. M. Mitchell, & H. F. M. Van der Loos (Eds.), Design and use of assistive technology: Social, technical, ethical, and economic challenges (pp. 3–15). New York: Springer.
Specker Sullivan, L., & Illes, J. (2016). Beyond “communication and control”: Towards ethically complete rationales for brain–computer interface research. Brain–Computer Interfaces, 3(3), 156–163.
Williamson, T., Kenney, L., Barker, A. T., Cooper, G., Good, T., Healey, J., Heller, B., Howard, D., Matthews, M., Prenton, S., Ryan, J., & Smith, C. (2015). Enhancing public involvement in assistive technology design research. Disability and Rehabilitation: Assistive Technology, 10(3), 258–265.
Wolbring, G., & Diep, L. (2016). Cognitive/neuroenhancement through an ability studies lens. In F. Jotterand & V. Dubljevic (Eds.), Cognitive enhancement: Ethical and policy implications in international perspectives (pp. 57–75). New York: Oxford University Press.
Wolpaw, J. R., & Wolpaw, E. W. (2012). Brain–computer interfaces: Principles and practice. New York: Oxford University Press.
Yuan, H., & He, B. (2014). Brain–computer interfaces using sensorimotor rhythms: Current state and future perspectives. IEEE Transactions on Biomedical Engineering, 61(5), 1425–1435.
Zickler, C., Halder, S., Kleih, S. C., Herbert, C., & Kübler, A. (2013). Brain painting: Usability testing according to the user-centered design in end users with severe motor paralysis. Artificial Intelligence in Medicine, 59(2), 99–110. https://doi.org/10.1016/j.artmed.2013.08.003.
The authors would like to thank all PIs at the CSNE for participating in this interview project. We also thank Judy Illes and the National Core for Neuroethics at the University of British Columbia for their assistance with the conceptualization of this project.
This work was supported by Award Number EEC-1028725 from the National Science Foundation. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Science Foundation.
All procedures performed in studies involving human participants were in accordance with the ethical standards of the University of Washington Human Subjects Division and with the 1964 Helsinki declaration and its later amendments. Verbal informed consent was obtained from all individual participants included in the study.
Appendix: Interview Guide
Ethical Issues in the Lab
Have any ethical issues come up in your lab?
Do you foresee any ethical issues arising:
in the short term or near future?
in the long term, or likely in the more distant future?
How have you dealt with past ethical problems?
How do you foresee dealing with future ethical problems?
Do you think public policy or mass media shapes your work? If so, in what way?
Prompt: (think about FDA standards or policies, or movies or newspaper articles, or even potential interviews with the press about your work; the opening kick of the World Cup last summer came from a man with an exoskeleton)
How should CSNE researchers talk about and label implantable (or prosthetic) neural devices (in scientific articles or with the public) such that we accurately convey the limitations and actual function of such devices?
Prompt: Are there metaphors or shorthand language that we can (or should) use to present the technology in relatively accessible ways? (maybe mind-as-machine? something else?)
Prompt: Given that our discussions/representation of the brain and emerging neural technology can affect wider societal/cultural norms, how careful should we be in choosing metaphors and explanations of those things? (think about mind-as-machine metaphors, etc.).
Do you have any concerns about broader social forces that might shape this research (e.g., about how the money gets allocated or where it comes from - e.g., invested corporations, about reasons for investing in these technologies)? If so, can you describe them?
Who do you consider the end users of your research?
Prompt: Is there a specific group of people with disabilities that you think of as potential beneficiaries or end users for your research?
We recently conducted a focus group with individuals with spinal cord injury and asked them about their impressions of various kinds of neural technologies (including BBCI for spinal stimulation and BBCI for control of a robotic device). Participants in the focus group expressed several concerns, including ones we think may have relevance across different neurological conditions, and we want to explore some of these with you.
Are you optimistic that the kind of work going on in neural engineering will significantly improve the quality of life of:
…someone who currently has a spinal cord injury.
…someone who experiences a spinal cord injury in the next 5 years.
…someone who experiences a spinal cord injury in the next 15 years.
Why or why not?
Are you optimistic that the CSNE’s current research testbeds will significantly improve the quality of life of someone with a spinal cord injury? Why or why not?
If your lab’s work is related to spinal cord injury, how optimistic are you that your lab’s current research will significantly improve the quality of life of someone from that patient population? Why or why not? (If your lab’s work is not related to spinal cord injury, skip this question.)
Do you have any experience interacting with people with disabilities who might benefit from using these devices?
If yes, based on your interactions, do you think persons with spinal cord injuries are optimistic or pessimistic about the potential benefits of neural devices?
If no, do you think the views of persons with spinal cord injuries are optimistic or pessimistic about the potential benefits of neural devices?
In your experience, do you think persons with spinal cord injuries are concerned about privacy of information collected in use of neural devices to a greater, lesser, or about the same degree as the general public is concerned about privacy in use of technology? Why do you think this?
In what way do you think the views of potential end users are shaped by media portrayals of neurotechnologies?
What about cost? When, if ever, should the eventual cost of a neural device be considered—at what part of the research pathway?
Do you think that incorporating the views of persons with spinal cord injuries (or other neurological conditions) could improve the overall design of neural devices? Why or why not?
Do you think that incorporating the views of persons with spinal cord injuries (or other neurological conditions, if your project is directed to a different end user group) could improve your current contribution to the design of neural devices? Why, or why not?
If yes—i.e., if the views of persons with spinal cord injuries (or other neurological conditions) are valuable to the design process—what is the best way to…
obtain these views (e.g., focus groups or people with disabilities in the lab)?
make use of these views?
Incorporating End User Feedback
Sometimes end users’ perspectives, their values and priorities, do not match up with a research program in neural engineering. For example, consider the controversy around cochlear implants. And sometimes the general public is concerned about issues that aren’t priorities within a research program.
Focusing on potential end users, is it feasible to adapt your research to better account for the values or needs they have expressed?
If so, how do you think it can be done?
If not, why not?
More generally, are you free to spend time or resources on adjusting for any social or ethical implications of your research?
If not, what constrains you?
Are you in full control of the direction of research in your lab? What are the constraints on the direction of the research?
If the Center’s testbeds are successfully translated into technology that improves lives, would you feel responsible for the successes, at least in part? What if they are not successfully translated?
If no one is individually responsible for the end products of neural engineering (successful or not), should engineers and scientists strive to create conditions where researchers have more individual freedom to direct their research and its translation? How?
To what extent are your individual decisions in day-to-day research the cause of particular technologies that the Center develops? In other words, if someone asked you to explain the existence of a CSNE-created technology, would your day-to-day research decisions probably be part of that explanation?
Specific Ethical Issues
Have you considered issues related to safety? If so, which?
Prompt: What kinds of safety concerns should potential users of the technology be concerned about? How do you think these concerns should be addressed, and by whom?
Prompt: An implantable device and/or its components may have limited lifespan or be superseded by new technology (electrodes, power source, software). If an implantable device becomes outdated or replaced by better technology, who should be responsible for maintenance, repair or replacement of the “outdated” device?
Have you considered issues related to security? If so, which?
Prompt: How vulnerable is the technology you are working on to unauthorized access, and what measures can be taken to ensure that your technology is not open to attack?
Have you considered issues related to privacy? If so, which?
Prompt: Does the technology you are working on produce or collect sensitive data? If so, what measures ought we take to ensure that this data is handled appropriately?
Have you considered issues related to responsibility (i.e., who is responsible if using a BBCI leads to an accident)? If so, which?
Prompt: Do you think your technology will alter our understanding of responsibility for thoughts, moods, motivation, action, etc.?
Have you considered issues related to authority (i.e., how should control be shared between the individual and the device)? If so, which?
Prompt: Do you think it is important to ensure that the user is the ultimate controller of the information/device? If so, why? Would you have any concerns about the user having full control of the device?
Have you considered issues related to identity (i.e., will using a device potentially change a person’s sense of her self, of who she is)? If so, which?
Prompt: Have you considered how the use of your technology may alter individuals’ sense of identity? In designing your technology, have you considered issues such as how it looks to the end user, how interacting with the technology may alter a person’s sense of herself or her authentic self, or how it may alter others’ perception of her (even as “human”)? Is the aim for her to integrate the technology into her identity, or to consider it a tool that she uses, or something else?
Have you considered issues related to justice (i.e., will access to devices be fair)?
Prompt: How affordable do you envision the BBCI devices being? Have you considered ways that the technology might be made more affordable—even to low-income or low-resource communities—by design rather than as a result of market forces, government assistance, or philanthropy?
Have you considered issues related to what constitutes normal functioning and enhancement of normal functioning (i.e., where enhanced functioning puts someone at an advantage relative to what most people have)?
Prompt: Some people may prefer not to be “normalized” in the sense of standard functioning. What, if any, of the emerging technologies emphasize such normal functioning? Do you think there are concerns about how the standards of “normal” functioning are created and implemented in this research?
Prompt: Are there concerns about how a standard model of function might apply to a diverse range of people?
Prompt: Do you think of your technology as a form of treatment, or as an enhancement (or does that distinction not make sense in this arena)?
Sullivan, L.S., Klein, E., Brown, T. et al. Keeping Disability in Mind: A Case Study in Implantable Brain–Computer Interface Research. Sci Eng Ethics 24, 479–504 (2018). https://doi.org/10.1007/s11948-017-9928-9
- Brain–machine interface
- Brain–computer interface
- Research ethics
- User-centered design