Abstract
This critique explores the implications of integrating artificial intelligence (AI) technology, specifically OpenAI's advanced language model GPT-4 and its interface, ChatGPT, into the field of spinal surgery. It examines the potential effects of algorithmic bias, unique challenges in surgical domains, access and equity issues, cost implications, global disparities in technology adoption, and the concept of technological determinism. It posits that biases present in AI training data may impact the quality and equity of healthcare outcomes. Challenges related to the unique nature of surgical procedures, including real-time decision-making, are also addressed. Concerns over access, equity, and cost implications underscore the potential for exacerbated healthcare disparities. Global disparities in technology adoption highlight the importance of global collaboration, technology transfer, and capacity building. Finally, the critique challenges the notion of technological determinism, emphasizing the continued importance of human judgment and the doctor-patient relationship in healthcare. The critique calls for a comprehensive evaluation of AI technology integration in healthcare to ensure equitable and quality care.
Dear Editor,
I am writing to provide a critique and analysis of the recent paper titled “Will ChatGPT/GPT-4 be a Lighthouse to Guide Spinal Surgeons?” by He et al. [1]. While the authors discuss potential applications of OpenAI’s advanced large language model GPT-4 and its chatbot-style interface ChatGPT in spinal surgery, it is crucial to examine the broader implications of technological determinism, biases, challenges in surgical domains, access and equity issues, cost implications, and global disparities in technology adoption within the context of clinical medicine and the biomedical enterprise.
Firstly, it is essential to address the issue of algorithmic bias in AI applications. Algorithms, including language models like GPT-4, can inherit and perpetuate biases present in the data they are trained on. These biases may disproportionately impact marginalized populations, leading to unequal healthcare outcomes [2, 3]. It is crucial to thoroughly evaluate and mitigate biases to ensure fair and equitable treatment for all patients. For example, Ahsen et al.'s study on breast cancer diagnosis demonstrated that biases inherited by algorithms from human-generated data can diminish diagnostic performance [4]. However, a bias-aware algorithm integrated into a clinical decision support system was able to mitigate the adverse impact of bias, improving the accuracy of decisions based on mammography. Specific examples of algorithmic biases relevant to spinal surgery must be explored to assess their potential impact on diagnostic accuracy, treatment recommendations, and surgical decision-making. Once identified, such biases must be mitigated in AI applications, including language models like GPT-4, to ensure equitable care in spinal surgery.
Furthermore, when considering the integration of AI technologies in surgical domains, unique challenges arise. Surgical procedures require real-time decision-making, precise manual dexterity, and adaptability to unforeseen circumstances. While ChatGPT/GPT-4 may assist with information retrieval and decision support, its ability to account for dynamic surgical situations and respond appropriately remains unverified. The limitations and challenges faced by AI in surgical domains should be acknowledged, including the need for rigorous validation, addressing technical limitations, and ensuring seamless integration with existing surgical workflows (Table 1).
The discussion of ChatGPT/GPT-4 in spinal surgery cannot overlook access and equity issues. Implementing AI-driven technologies requires robust infrastructure, adequate training, and financial resources [5]. However, access to such resources may be limited in certain regions or healthcare settings, exacerbating healthcare disparities. The potential cost implications associated with the adoption and maintenance of AI systems, as well as training healthcare professionals, should be carefully considered to ensure equitable access for all patients, regardless of socioeconomic status or geographical location.
Global access to advanced technologies like ChatGPT/GPT-4 faces significant obstacles. Technological advancements are often concentrated in high-income regions and prestigious institutions, creating a digital divide that hampers equal access to cutting-edge tools. Developing countries and underserved communities may struggle to overcome barriers related to infrastructure, funding, and expertise. Efforts should be directed toward promoting global collaboration, technology transfer, and capacity building to address these disparities and foster equitable healthcare delivery.
The notion of technological determinism in clinical medicine and the biomedical enterprise must also be critically examined. While AI technologies like ChatGPT/GPT-4 hold promise, they should not be viewed as a panacea for all healthcare challenges. Human judgment, experience, and the doctor-patient relationship remain integral to medical practice. Emphasizing technological determinism can undermine the essential role of healthcare professionals, potentially leading to overreliance on AI systems and neglecting critical aspects of patient care that require human expertise and compassion [6].
In conclusion, I urge a comprehensive evaluation of the potential implications of ChatGPT/GPT-4 in spinal surgery, considering algorithmic biases, challenges in surgical domains, access and equity issues, cost implications, global disparities, and the impact of technological determinism. Responsible integration of AI technologies in healthcare should prioritize patient well-being, address biases, ensure equitable access, and maintain the essential role of healthcare professionals in decision-making and patient care.
Sincerely,
Aaron Lawson McLean
References
He, Y., H. Tang, D. Wang, S. Gu, G. Ni, and H. Wu. Will ChatGPT/GPT-4 be a lighthouse to guide spinal surgeons? Ann. Biomed. Eng. 51(7):1362–1365, 2023.
DeCamp, M., and C. Lindvall. Latent bias and the implementation of artificial intelligence in medicine. J. Am. Med. Inform. Assoc. 27(12):2020–2023, 2020.
Favaretto, M., E. De Clercq, and B. S. Elger. Big data and discrimination: perils, promises and solutions. A systematic review. J. Big Data. 6(1), 2019.
Ahsen, M. E., M. U. S. Ayvaci, and S. Raghunathan. When algorithmic predictions use human-generated data: a bias-aware classification algorithm for breast cancer diagnosis. Inf. Syst. Res. 30(1):97–116, 2019.
Mikalef, P., and M. Gupta. Artificial intelligence capability: conceptualization, measurement calibration, and empirical study on its impact on organizational creativity and firm performance. Inf. Manag. 58(3):103434, 2021.
Henwood, F., and B. Marent. Understanding digital health: Productive tensions at the intersection of sociology of health and science and technology studies. Sociol. Health Illness. 41(S1):1–15, 2019.
Funding
Open Access funding enabled and organized by Projekt DEAL. The author did not receive support from any organization for the submitted work.
Ethics declarations
Competing interests
The author has no relevant financial or non-financial interests to disclose.
Additional information
Associate Editor Stefan M. Duma oversaw the review of this article.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Lawson McLean, A. Towards Precision Medicine in Spinal Surgery: Leveraging AI Technologies. Ann Biomed Eng 52, 735–737 (2024). https://doi.org/10.1007/s10439-023-03315-w