1 Introduction

The COVID-19 pandemic has clearly exposed the discrepancies between countries in terms of prevention, preparedness, and response (PPR) to disease outbreaks, often stretching health systems to their limits. Even though the pandemic demonstrated that all countries need to strengthen their PPR capacity, many low- and middle-income countries (LMICs) were particularly heavily affected due to their historical limitations in healthcare provisioning, including an insufficient number of well-trained professionals, inadequate hospital bed and laboratory capacity, low investment in technology, and limited crisis management expertise, including supply chain logistics for equipment and medication (Krubiner et al. 2020).

Furthermore, the pandemic exacerbated the inefficiencies of weaker health systems, affecting not only the prevention and treatment of communicable and non-communicable diseases, but also the delivery of primary health care (PHC) services: health education, prevention and control of infectious diseases, childhood vaccination campaigns, prenatal care, etc. (Pillay et al. 2021). In some countries, decisions taken during the pandemic were not science-based, but rather politically biased or based on misinformation about treatments. For example, in Brazil the federal administration recommended drugs that were ineffective for COVID-19 treatment, even after the WHO had refuted their effectiveness (World Health Organization 2020). The same administration also rejected social isolation measures and mask use (World Health Organization 2020; Ferigato et al. 2020; Médecins Sans Frontières 2021).

The pandemic also contributed to a sharper understanding of the role of PHC within the wider healthcare system, not only as an important interface for basic community service provisioning but also as a surveillance and predictive reporting frontline for the whole healthcare system. Taking the above into account, the digital transformation could be a game changer for healthcare, especially in PHC: the economies of scale associated with a clear picture of end-to-end processes and the latest technologies could result in a state-of-the-art, frictionless, patient-centered care system. In particular, digital transformation could have a revolutionizing impact on PHC provisioning in LMICs (Schwarz et al. 2020). By leveraging digital opportunities, LMICs could focus on universal accessibility and improved service quality, rather than on bureaucratic and, at times, erroneous decision-making.

It is also important to note that public investment in PHC has become a mounting priority in the context of introducing Universal Health Coverage in LMICs (Garg et al. 2021; Tilley-Gyado et al. 2016). In the long run this brings additional competitive benefits for LMICs: the digital transformation could start from the edges (PHC), i.e., closer to community practice, migrating later to the central and more consolidated secondary and tertiary care facilities.

From a patient perspective, it also makes sense to have access to a more technologically advanced PHC. A recent US report shows that the average cost of a patient visit to the emergency room (ER) was USD 2143 in 2020, up from USD 1704 in 2016. This reflects the increase in healthcare provision for secondary and tertiary care, as emergency functions have been concentrated within fewer, consolidated facilities (Healthcare Cost Institute (HCCI) 2020). ER crowding is also considered a marker of hospital health, indicating a failure to provide adequate PHC (Barish et al. 2012). While these figures are informative and presumed to reflect an overall trend within healthcare, consistently collected comparable data from LMICs are missing.

2 Digital Transformation: The Technologies Behind the Revolution in the Health Sector

Digital transformation brings a common framework that increases efficiency, productivity, and overall user experience (Bitton et al. 2017; Nambiar et al. 2020). In healthcare systems, it streamlines processes and automates routine tasks, resulting in a major reduction in operating costs (OPEX) while in certain cases it can also improve patient outcomes (Bardhan and Thouin 2013). The so-called ‘Iron Triangle dilemma in Healthcare’ (Callahan 2014), a challenge faced by all health systems around the globe, could be completely transformed in the long run through digital technologies, as accessibility, quality, and affordability (the three cornerstones of the ‘Iron Triangle’) could all be provided without compromising patient outcomes and well-being. However, an adequate implementation of a digital health system must integrate several digital technologies under the same framework, not as isolated technology silos, but rather as a global puzzle, where ideally every piece counts for solving a local demand, and success relies on the seamless integration of all the constituent pieces.

The key technologies for digital transformation in healthcare systems are the Internet of Things (IoT), Artificial Intelligence (AI), Blockchain, and Cloud. These will be reviewed in the following sections in relation to their implementation within healthcare in LMICs. This is a narrative review, aiming to identify key published information and provide an overview of the field, rather than a more detailed view on any single aspect, which should be addressed by a systematic review methodology. Manuscripts were selected for being seminal to their respective technical sections, while additional manuscripts were identified through 'snowballing', i.e., using reverse citation tracking to find articles that cited articles already deemed relevant to the review (Callahan 2014).

2.1 Internet of Things

The Internet of Things (IoT) comprises devices, appliances, and all types of equipment that have built-in sensors, software, and network connectivity (Xia et al. 2012). IoT plays an essential part in digital healthcare systems, as it brings PHC closer to patients, with the potential to create a so-called "Virtual Healthcare space". IoT devices are commonly used for monitoring patients' health via wearables, collecting data in real time and transmitting the information to the cloud for further processing and analysis. This is particularly relevant for routine monitoring of vital signs, such as heart rate, blood pressure, body temperature, glucose, and blood oxygenation levels. Medical teams could receive real-time alerts from patients that need to be monitored (e.g., post-operatively), considerably improving patient assistance and prioritizing urgent cases as they emerge. For patients, this could mean a reduction in unnecessary visits to emergency rooms (ER). A recent European Union report described a patient monitoring solution implemented in Chile that reduced ER visits by 42%, resulting in a 50% saving to insurance companies (Xia et al. 2012). A second study, on a targeted implementation, showed a three-fold increase in the risk of acute infection for elderly people after an ER visit (Andersen 1994).
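The alerting mechanism described above can be sketched in a few lines. The following is a minimal illustration, not a clinical tool: all thresholds, field names, and the example reading are invented assumptions for the sake of the sketch.

```python
# Illustrative sketch of server-side vital-sign screening for IoT wearables.
# All thresholds and field names are hypothetical examples, not clinical guidance.

NORMAL_RANGES = {
    "heart_rate_bpm": (50, 110),
    "spo2_percent": (92, 100),
    "temp_celsius": (35.0, 38.0),
}

def screen_reading(reading: dict) -> list[str]:
    """Return a list of alert messages for values outside their normal range."""
    alerts = []
    for vital, (low, high) in NORMAL_RANGES.items():
        value = reading.get(vital)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{vital}={value} outside [{low}, {high}]")
    return alerts

# Example: a post-operative patient's wearable uploads one reading;
# heart rate and temperature fall outside their ranges and raise alerts.
reading = {"heart_rate_bpm": 128, "spo2_percent": 95, "temp_celsius": 38.6}
print(screen_reading(reading))
```

In a real deployment the screening would run continuously on streamed data and route alerts to the responsible care team, but the core logic of comparing readings against per-vital ranges remains the same.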

It is important to note that significant technical challenges remain that could be addressed through further sensor miniaturization and an increase in sensor efficiency, resulting in higher processing power and decreased power consumption (Kim et al. 2019). The resulting wearable devices will be lighter, more precise, and with new functionalities, favoring overall patient experience and treatment outcomes. Additionally, there are infrastructural challenges—especially in LMICs—as the IoT relies on the seamless integration of several technical components all working together. The lack of consistent technical infrastructure to support IoT in LMICs remains a major barrier in introducing these technologies beyond a limited number of centers of excellence. It is also necessary to consider that, for LMICs, different types of infrastructure/infrastructure connections may be necessary to produce the intended results (Dinh et al. 2020).

2.2 Artificial Intelligence

Artificial Intelligence (AI) provides the most disruptive element for a complete transformation of healthcare. AI refers to a field of computer science focused on the creation of intelligent machines that mimic human behavior through interconnected algorithms designed to analyze and process data, recognize patterns and relationships in the data, and make predictions or decisions based on it (Dinh et al. 2020). AI algorithms therefore rely on long-term knowledge (disease-specific datasets) that creates a clear understanding of the disease and minimizes the risk of wrong decisions. As such, the positive impact of AI implementations is correlated with the quality and quantity of these datasets, the understanding of existing clinical workflows and intended outcomes, and the representativeness of the target population (de Hond et al. 2022). However, the general goal of AI is not well defined, because there is no consensus on what specifically constitutes 'intelligence'.

From a long-term economic perspective, AI will drive down the costs of high-volume, repetitive tasks in healthcare and is therefore anticipated to have a major impact on healthcare economics at a macroeconomic level. Since AI implementation may also improve the early diagnosis of diseases, treatment could be simpler, less invasive, and, potentially, more successful. It therefore makes complete sense to bring AI to the frontline of PHC. A recent report from the World Bank describes the notable healthcare achievements in China due to increased investments in PHC, including in information systems, in response to intense population pressures. Even though the report does not focus on specific technologies, the improvements are potentially related to an integrated technological framework, including IoT and AI (European Institute of Innovation and Technology 2021).

AI could be applied to clinical notes in PHC Electronic Health Records (EHRs) for predictive analytics. For example, the US National Institute on Aging is funding AI research for detecting early stages of Alzheimer's disease based on the analysis of PHC EHRs, as some physicians might not be specialized in identifying the potential disease symptoms (Quach et al. 2012). Furthermore, the prediction of adverse neonatal outcomes in newborns, based on deep learning models that use EHR data from different early-life stages, ranging from preconception to a few months after birth, is another relevant use of AI in PHC records (World Bank 2022). AI virtual assistants and chatbots bring further possibilities in PHC, especially in LMICs, where telemedicine is very relevant due to the acute shortage of health professionals (Naseem et al. 2020). For example, in an ideal scenario, a pre-defined questionnaire could be used to understand a patient's symptoms and reach a preliminary diagnosis. Subsequently, the virtual assistant could connect the patient to a human doctor or, depending on the case severity, recommend a nearby hospital. Several implementations of AI virtual assistants have been deployed globally, reducing diagnosis delivery times and minimizing human errors (European Institute of Innovation and Technology 2021).

A recent report from the World Bank Group regarding PPR addresses the weakness of global disease surveillance networks during the COVID-19 pandemic (Science and Enterprise 2023). AI algorithms could be used for identifying changes in patient disease profiles arriving in the PHC. This would have a major impact on the response of regional authorities, isolating the regional source of disease before it spreads to other regions or globally. This implies that AI for the PHC must be observed both as a global strategy, as well as a country-based decision. Thus, a common global AI framework for PHC should be considered and further discussed as a global public healthcare priority.

Finally, an area with enormous potential to benefit from AI is image processing and analysis, such as X-rays and Magnetic Resonance Imaging (MRI). For example, it has been argued that AI can substantially streamline radiologists' work while improving the detection of breast cancer (De Francesco et al. 2023). Using AI on imaging can have a dramatic impact on several diseases that affect the aging population, including cancer, cardiovascular and pulmonary diseases (World Bank 2023). Looking further into the future, if combined with other technologies, AI could further increase its overall impact. For example, CRISPR-Cas9 gene-editing tools (Leibig et al. 2022), which have revolutionized genome editing, could be combined with AI algorithms to automate genomic editing procedures. The gene selection process could also be streamlined, as specifically designed algorithms could be used to identify diseases in stored patient biological samples (e.g., blood, urine, etc.) kept within biobanks (https://www.arterys.com/clinical-evidence).

2.3 Blockchain

Blockchain technology can be described as an immutable ledger in which data entries are registered in a decentralized manner. This means that users or entities can interact without the presence of a central authority, thus allowing more transparency around these interactions (Ledford and Callaway 2020). Cryptocurrencies are a good example of the use of blockchain technology: here, the blockchain acts as a decentralized database that keeps track of all coin transactions. The continuously increasing number of data entries is packaged into blocks of data, which are securely maintained within the blockchain by cryptographic protocols and cannot be tampered with (National Institutes of Health (NIH) 2023). Hence, the data security aspect is ensured, while the added-value argument beyond security remains to be convincingly demonstrated.
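The tamper-evidence property described above rests on hash chaining: each block stores the cryptographic hash of its predecessor, so altering any earlier entry invalidates every subsequent hash. The following is a minimal sketch of that single idea; it omits consensus, signatures, and the decentralized network that a real blockchain requires.

```python
# Minimal sketch of hash chaining, the core of a blockchain's immutability.
# Simplified illustration only: no consensus, signatures, or peer network.
import hashlib
import json

GENESIS_PREV = "0" * 64  # placeholder predecessor hash for the first block

def make_block(data: dict, prev_hash: str) -> dict:
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify(chain: list[dict]) -> bool:
    """Recompute every hash; any tampering anywhere breaks the chain."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else GENESIS_PREV
        payload = json.dumps({"data": block["data"], "prev": prev}, sort_keys=True)
        if block["prev"] != prev or \
           block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
    return True

genesis = make_block({"tx": "A pays B 1 coin"}, GENESIS_PREV)
second = make_block({"tx": "B pays C 1 coin"}, genesis["hash"])
chain = [genesis, second]
print(verify(chain))                            # True: chain is intact
chain[0]["data"]["tx"] = "A pays B 100 coins"   # tamper with an old entry
print(verify(chain))                            # False: tampering detected
```

In a real blockchain, the same recomputation is performed independently by many nodes, which is what removes the need for a trusted central authority.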

Healthcare has the potential to benefit from the use of blockchain technology as it is a data- and personnel-intensive domain where the ability to access, edit and have trust in the data emerging from its activities is critical. In that context, blockchain could improve data management by connecting different systems and increasing the accuracy and security of electronic health records (EHRs) (Hasselgren et al. 2020; Hölbl et al. 2018). Moreover, this technology could be used in e-health applications, where patients and healthcare professionals are required to identify themselves, by allowing for an efficient digital identity management which is not possible with current internet protocols that were not originally designed for that purpose (Satybaldy et al. 2022). In the pharmaceutical industry, blockchain can help identify and avoid the dissemination of counterfeit and unapproved drugs, and it is possible to define smart contracts to automate the technical processes, improve supply chain management, and verify the quality of pharmaceutical products (Hölbl et al. 2018).

Blockchain also has the potential to increase the transparency and integrity of data in the context of clinical trials by maintaining records of patient consents and clinical data that cannot be modified, thereby ensuring that the trials meet all relevant regulations and that problems of fraudulent results and removal of data by individuals are avoided (Bell et al. 2018). Despite this, blockchain technology is not yet widely used in healthcare in LMICs, and the stated benefits remain in the sphere of future achievements or limited to specific users in high-income settings.

2.4 The Example of Non-Fungible Tokens (NFTs)

Health information is highly valued, especially as big data and machine learning are increasingly adopted in health care. Within this context, non-fungible tokens (NFTs) can help incentivize a more transparent and efficient system for health information exchange, in which patients can participate in decisions about how and with whom their personal health information is shared, and where data access and control can be automated through smart contracts (Kostick-Quenet et al. 2022).

NFTs are created by uploading digital content to a blockchain and having other computers verify and timestamp the content, its location, and its owner. NFTs are digital contracts composed of metadata that specify access rights to, and terms of exchange of, a given piece of content. They represent the point of access to digital content, not the content itself, and allow for its secure storage and sharing, for example of medical health records, through the use of pseudonyms that maintain anonymity while ensuring transparency and accountability (Kostick-Quenet et al. 2022). NFTs are used in the entertainment (Regner et al. 2019) and commercial (Ali and Bagui 2021) sectors on platforms that provide collectibles, access keys or event tickets, ensuring the uniqueness of the items exchanged and securing their ownership. However, their use in healthcare has yet to materialize, and for LMICs it remains a distant prospect.
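The "token points at content, not the content itself" idea can be sketched as a record holding a content hash, a pseudonymous owner, and access terms. This is a hypothetical illustration: the identifiers and the access rule are invented, and a real NFT would live on a blockchain with smart-contract enforcement rather than in a Python dictionary.

```python
# Hypothetical sketch of an NFT-style record for a medical document:
# the token stores metadata and a content hash, never the content itself.
# All identifiers are invented pseudonyms for illustration.
import hashlib

def mint_token(content: bytes, owner: str, allowed: set[str]) -> dict:
    """Create a token referencing content via its hash, with access terms."""
    return {
        "content_hash": hashlib.sha256(content).hexdigest(),
        "owner": owner,                    # pseudonymous patient identifier
        "allowed_readers": set(allowed),   # terms of exchange, in miniature
    }

def can_access(token: dict, requester: str) -> bool:
    """Smart-contract-like check gating access to the underlying content."""
    return requester == token["owner"] or requester in token["allowed_readers"]

record = b"encrypted medical record bytes"
token = mint_token(record, owner="patient-7f3a", allowed={"clinic-0042"})
print(can_access(token, "clinic-0042"))   # True: listed in access terms
print(can_access(token, "insurer-9b1c"))  # False: not authorized
```

The content hash lets any party verify that a retrieved document matches the token without the token ever exposing the document itself.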

2.5 Cloud Computing

The fast development of the Internet of Things (IoT) technology, commonly used in medical settings to monitor patients’ vital signs through a wide range of devices, has greatly improved treatment and health outcomes for patients, but has also led to more stringent requirements for data analysis and data storage (Dang et al. 2019). The increasing amount of clinical, analytical laboratory and ‘-omics’ data due to the integration of IoT technology brings several challenges in terms of data storage, management, and sharing, as well as data confidentiality, security, and high-performance computing (Calabrese and Cannataro 2015). So far, cloud computing technology has been the preferred solution to address these issues by providing the ability for health professionals, and to a much lesser extent patients, to access shared medical data and other resources at any time and anywhere within a given digital environment, e.g., healthcare providing organizations (Griebel et al. 2015). In addition, with cloud computing, data sharing and storage can be performed at scale in a more structured and organized way with full transparency, thus minimizing the risk of data loss.

This technology provides access to higher computing power and storage capacities at a lower cost than using regular grid technology and so can improve the scalability of healthcare activities and resources (Dang et al. 2019). However, in the case of LMICs, the implementation of cloud computing remains piecemeal and limited only to specific clinical centers of excellence, typically belonging to tertiary healthcare (Clifford 2016). For example, in India, electronic medical record (EMR) systems in tertiary healthcare facilities are linked to a remote health cloud which allows for a direct entry of orders and notes, as well as desktop sharing since EMRs can be accessed from anywhere (Agrawal et al. 2013). In another instance, Zambia was able to set up a local data server that communicates directly with monitoring mobile devices on patients and the cloud. Skin-integrated sensors on patients collect physiological data that are encrypted and transmitted to a local server. The data is then securely transferred to the cloud for broader access and monitoring. Access to this data in the cloud can be granted to authorized individuals through a system of identifiers (Xu et al. 2021).
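The Zambian data path described above (sensor, local server, cloud, identifier-based access) can be sketched end to end. This is an illustrative assumption of the architecture, not the deployed system; in particular, the "encrypt" step below is a toy XOR placeholder standing in for a vetted cipher, and all identifiers are invented.

```python
# Illustrative sketch of the sensor -> local server -> cloud data path with
# identifier-based access control. The placeholder_encrypt function is a toy
# XOR stand-in for real encryption; never use it in production.
import hashlib

def placeholder_encrypt(data: bytes, key: bytes) -> bytes:
    # XOR with a key-derived stream; applying it twice restores the data.
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))

class CloudStore:
    """Toy cloud endpoint: stores ciphertext, gates access by identifier."""
    def __init__(self):
        self._records = {}     # patient_id -> ciphertext
        self._authorized = {}  # patient_id -> set of clinician identifiers

    def upload(self, patient_id, ciphertext, authorized_ids):
        self._records[patient_id] = ciphertext
        self._authorized[patient_id] = set(authorized_ids)

    def fetch(self, patient_id, requester_id):
        if requester_id not in self._authorized.get(patient_id, set()):
            raise PermissionError("requester not authorized for this patient")
        return self._records[patient_id]

key = b"shared-device-key"
reading = b'{"spo2": 96, "hr": 72}'            # sensor reading, as JSON bytes
cloud = CloudStore()
cloud.upload("patient-001", placeholder_encrypt(reading, key), {"nurse-17"})
ciphertext = cloud.fetch("patient-001", "nurse-17")   # authorized clinician
print(placeholder_encrypt(ciphertext, key) == reading)  # True: round-trips
```

The design choice worth noting is that the cloud never needs the plaintext: it stores ciphertext and enforces who may fetch it, while decryption happens at the authorized endpoint.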

3 Challenges for the Implementation of Digital Health

The pandemic has demonstrated that even during times of extreme pressure on healthcare systems, it is practically impossible to establish a single global technological solution to a given problem. The acceptability and implementation of digital technologies during the pandemic were driven by context, depending on the different infrastructural, financial, societal, legal and ethical backgrounds of end-users. Digital technologies in healthcare are received and operate in very different ways when implemented in high-, medium-, or low-income countries, or when deployed in ‘individualistic’ versus ‘collectivist’ societies (Ferretti et al. 2020). The following paragraphs focus on the technical and design challenges of digital healthcare implementation in LMICs.

4 Technical Challenges

4.1 Infrastructure

The lack of infrastructure for healthcare in LMICs has been highlighted in numerous publications, such as the Lancet Oncology Commission (Ngwa et al. 2022) and many others. The limited availability of communication networks, electrical networks and equipment has been well documented and remains a major challenge for improving healthcare in resource-restricted settings worldwide. However, from a digital healthcare implementation viewpoint, additional aspects also have an influence. For example, in LMICs healthcare systems are often fragmented, with individual health units providing services that are not integrated into a national network, and universal healthcare coverage is unavailable. Furthermore, a parallel private system of individual health units often exists, catering for a different portion of the local population, i.e., those with higher income. The result is very high fragmentation, with multiple smaller systems co-existing, as in the case of Ecuador (Carlo 2020).

The investment and maintenance costs of the required infrastructures are prohibitive in settings where many critical and competing priorities exist. Suboptimal device use (including digital health applications) is directly linked to incomplete costing and inadequate consideration of connectivity (e.g., data transfer speed), maintenance services (including digital and physical infrastructure) and user training. The accurate estimation of life-cycle cost and careful consideration of device servicing are of crucial importance, however, there is currently no consensus approach for achieving this within LMIC settings (Diaconu et al. 2017).

4.2 Human Capital

The availability of technical infrastructure, equipment costs and past performance of similar equipment are the primary deciding factors in the procurement of medical devices (including digital technologies) in LMICs (Diaconu et al. 2017). However, maintenance services and user/staff training programs are often limited or even entirely absent in LMICs, leading to equipment under-performing and having a reduced lifespan and in some cases, unsafe device handling practices. It is estimated that 40–70% of medical devices in resource-restricted settings are either broken, unused or unfit for purpose, largely due to the absence of appropriately trained staff and preventive maintenance (Perry and Malkin 2011). This is further compounded in the case of digital healthcare applications, where training schemes for staff are less accessible, the time it takes to become fully trained can be considerable and the availability of expertise and post-training support is often limited (Browning et al. 2020).

4.3 Data Quantity, Quality, Representativeness

Introducing electronic health data systems in LMICs, as a first step towards a wider set of digital health applications, could improve data quality and efficiency in service delivery. A small number of studies have demonstrated that such health information systems can also provide small annual cost savings to the public health system (Krishnan et al. 2010; Fenenga and de Jager 2007). However, the questions on data quantity, quality and representativeness remain addressed only in isolated silos, e.g., for specific clinical trials or within individual tertiary healthcare units, and not as part of a wider healthcare system development. There are however notable exceptions with national or regional healthcare and system-wide investment (including in digital healthcare applications) in countries such as Tanzania (Vasudevan et al. 2020), Egypt (Noby 2022), Rwanda (Ippoliti et al. 2021), Mexico (Uc et al. 2020), Indonesia (Aisyah et al. 2022) and others. The recent G20 meeting in 2020 highlighted the need and value of digital health investment and development in LMICs, as described in the Riyadh Declaration (Knawy et al. 2020; Al Knawy et al. 2022). Therefore, an increase of investment in digital health applications is anticipated, which will eventually result in a better understanding of the data questions mentioned here.

5 Design Challenges

5.1 Design and Evaluation Frameworks for Digital Health Interventions (DHI)

The technical challenges described previously are the more visible aspects of the challenges facing digital health in LMICs. The less visible challenges are those relating to the design. Specifically, the objective of design as defined here, is not about the technology per se, but about the overall quality and configuration of service delivery that might result from the comprehensive adoption of a new technology (Holmlid 2007, 2009). Hence, the design challenge is to take a more holistic view, considering the impact of digital health applications as opposed to simply improving existing processes and workflows. Furthermore, a design perspective inherently acknowledges that technologies are not fixed and immutable (Barrett et al. 2015), but subject to cyclical revision and refinement based on emerging insights about their efficiency and effectiveness within the context(s) where they have been introduced (Nambisan 2013; McCool et al. 2020).

Digital health implementations are complex and can alter as the technology matures, in LMICs often in parallel with the healthcare system itself. One of the most significant issues influencing the effectiveness of such implementations concerns the existing evaluation frameworks. Previous research showed a lack of knowledge related to the development of frameworks for the evaluation of digital health implementations, which must be sensitive to the local context, acknowledging diverse technical, social and cultural perspectives and settings (Maar et al. 2017). The lack of evaluation frameworks and the rapid advancement of digital technology make it difficult to compare the accessibility and affordability of digitally enabled healthcare across communities, within and between countries, in LMICs. For this reason, published evaluations of digital health implementations tend to be quite heterogeneous, and the evidence concerning evaluation frameworks is inconsistent. In response to this need for a common framework, a few recent LMIC-centered solutions have been proposed (Kowatsch et al. 2019; Wilkinson et al. 2023; Dodd et al. 2019; Nadhamuni et al. 2021; Marchal et al. 2010); however, these remain to be extensively tested in the field.

5.2 Implementation Barriers

ROI (return on investment) definition: As a result of the above-mentioned barriers to the implementation and sustainability of digital health implementations, there are a limited number of successful case studies that went beyond the pilot or feasibility stage. In some fields, the results are mixed, or the existing studies can only demonstrate impact in the short term (Marcolino et al. 2018; Aisyah et al. 2020). Thus, understanding how the tool generates value for the healthcare system is a core element in framing the ROI argument. In LMICs, where multiple acutely competing priorities are considered for funding, the ROI is critical. It should be clear which priority a digital health application addresses, as well as its capital expense, its operational expense (the latter incurring costs in perpetuity), staffing requirements, technical support, maintenance, and hosting. Aligning these costs with the evaluation framework and ROI would provide a transparent and holistic understanding of the costs and requirements of the digital health application, and remove the implementation barriers due to hidden costs that such applications or singular initiatives are often associated with (Chen et al. 2023; McCool et al. 2020).
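The distinction between a one-off capital expense and an operational expense incurred in perpetuity can be made concrete with a toy payback calculation. All figures below are invented for illustration; a real ROI analysis would also discount future cash flows and include staffing, support, and hosting lines.

```python
# Toy ROI sketch: when do cumulative savings overtake the one-off capital
# expense plus the recurring operational expense? Figures are illustrative.

def payback_year(capex: float, annual_opex: float, annual_savings: float,
                 horizon_years: int = 10):
    """Return the first year in which cumulative savings cover cumulative
    costs, or None if payback never occurs within the horizon."""
    for year in range(1, horizon_years + 1):
        cumulative_cost = capex + annual_opex * year
        if annual_savings * year >= cumulative_cost:
            return year
    return None

# Example: USD 50k up front, USD 10k/year to run, USD 30k/year saved.
# Net savings of 20k/year overtake the 50k capital cost in year 3.
print(payback_year(capex=50_000, annual_opex=10_000, annual_savings=30_000))
```

The sketch makes the text's point visible: the operational expense never stops accruing, so an application whose annual savings do not clearly exceed its annual running cost can never pay back its capital expense, regardless of the horizon.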

Clinical value demonstration: There are several parameters through which the value of digital health applications can be demonstrated. In this book, there are specific national examples from Vietnam, Egypt, and Saudi Arabia, demonstrating the plurality of options when measuring value. A meaningful approach to measuring value has been proposed in the Institute for Healthcare Improvement (IHI)'s Quadruple Aim: (a) improving the health of populations, (b) enhancing the experience of care for individuals, (c) reducing the per-capita cost of health care, and (d) improving the experience of clinicians and staff (Bodenheimer and Sinsky 2014). These are generic enough to fit most contexts, although their respective definitions and evaluations may still differ considerably. Evaluating a digital health application's direct impact on an operational outcome, rather than a higher-level one, may be a better way to capture its value. For example, it may be difficult to measure "total staff time saved", and easier instead to measure the "reduction in the number of unnecessary appointments/consultations". A successful digital health application is one that can stratify population groups and/or individuals along specific pathways and, where needed, trigger a human-led intervention. This is a measurable approach and one likely to be able to demonstrate clinical value.

6 Limitations

This narrative review has presented an overview of the different digital healthcare technologies and described their technical and design challenges; nevertheless, it has some inherent limitations. The method of identifying manuscripts was not exhaustive, and it is likely that some relevant publications have not been considered. For a more detailed view, a systematic review would need to be performed for each of the sub-sections above individually. Furthermore, given the rapid pace of technological advancement, some of the anticipated outcomes may materialize earlier than predicted.

7 Conclusion

The advancement of digital technologies in healthcare is not a new phenomenon; however, it was accelerated by the COVID-19 pandemic, when healthcare needs across all settings forced institutions to consider the inclusion of digital health applications in their routine operations. The need for digital healthcare applications to deliver solutions is greatest in LMICs and will continue to be so in the near future. This chapter has presented an overview of the technologies driving the digital transformation of healthcare, including IoT, AI, Blockchain, NFTs, and cloud computing. The challenges to the implementation of digital healthcare applications were also presented (infrastructure, human capital, and data quality), with a particular focus on design and evaluation aspects.