Introduction

The burgeoning field of digital health represents a fertile frontier for the application of rapidly evolving digital technologies. Healthcare analytics has become increasingly powerful: large language models (e.g., ChatGPT) can now answer patients' questions about gastrointestinal (GI) conditions [1], and machine learning-based models can predict personalized glycemic responses to certain foods [2]. There has also been a proliferation of health and wellness apps, health-monitoring wearables, and health data tools and services. With the integration of real-time multimodal data capture (e.g., heart rate, physical activity, blood oxygen saturation) and advanced data analytics that can distill large volumes of health data into meaningful feedback, mobile devices have become sophisticated healthcare tools readily accessible to the consumer. These capabilities also lend themselves well to nutrition-related health. The promise of these digital technologies for clinical nutrition is expansive. This review discusses potential applications of digital health, their implications for patient care, and future directions at the intersection of digital health and nutrition for GI conditions.

Mobile Applications

Common features of nutrition-focused mobile applications (apps) include symptom monitoring, recording of food intake, meal planning, health education, and communication with the care team or support groups [3, 4]. Monitoring symptoms and tracking potential food triggers are particularly valuable for patients with GI conditions, as diet can strongly influence symptoms and quality of life [5,6,7]. For example, the FoodMaestro app helps users follow a low FODMAP (fermentable oligo-, di-, monosaccharides, and polyols) diet by providing lists of suitable foods and helping track dietary triggers. More than half of app users reported symptom improvement following FODMAP restriction [4]. MyIBDCoach is another mobile app that provides a nutritional assessment and other features for disease monitoring. In a randomized controlled trial (RCT) of 909 patients with inflammatory bowel disease (IBD), app users had significantly fewer outpatient visits and hospitalizations, although there was no difference in the number of flares [8, 9].

Many nutrition mobile apps assist with diet assessment by interpreting food intake and serving sizes to calculate calorie counts [10]. Additionally, some nutrition apps are equipped with a barcode scanning feature for packaged foods to help users instantly check product components and tailor their diets. For example, an RCT involving 66 participants with cardiovascular disease found that the salt-reduction SaltSwitch app significantly reduced purchases of salt [11]. Another study demonstrated that use of the SwapSHOP app led to lower sugar and saturated fatty acid consumption [12]. Moreover, food scanning apps, such as the Change4Life Food Scanner, have been associated with mean reductions in healthcare costs and workplace productivity losses [10]. Similarly, numerous GI nutrition apps, such as Fig and Spoonful: Food Scanner, assist users with celiac disease, specific dietary restrictions, or food intolerances through a barcode scanning feature and a variety of nutritious recipes [13, 14]. However, there are no RCTs that specifically evaluate the impact of the barcode scanning function of GI nutrition apps on users' health. These features have nonetheless empowered users with a vast amount of information at their fingertips, enabling smarter and safer dietary choices.

Mobile apps can also serve as a platform for communication with clinicians and support groups. For example, users of TECCU (Telemonitoring of Crohn's Disease and Ulcerative Colitis), a mobile app targeting users with IBD, can communicate with their health team through the platform. In a 3-arm RCT of 63 patients with IBD, app-monitored patients had greater improvement in disease activity and higher rates of remission than the groups that received nurse-assisted telephone care or standard in-person visits [15]. Moreover, users of HealthPROMISE, another mobile app for IBD patients, can directly message their care team and follow their treatment plan within the platform. A study showed that app users reported more equitable decision-making and improved quality of life and care [16]. These features have helped decrease the number of outpatient clinic visits and hospitalizations [8, 9]. In addition, the apps provide a social and educational platform where users communicate with each other and learn about their conditions. Table 1 lists examples of currently available commercial mobile apps for different GI conditions. However, it is important to recognize that the clinical significance or utility of most apps remains unstudied.

Table 1 Examples of Currently Available Mobile Apps for Various GI Conditions

Wearable Devices

Remote sensor technologies provide a means for continuous real-time data collection. Sensors can be embedded in wearable, implantable, or ingestible devices. VitalPatch, a multimodal wearable, was used to demonstrate increased heart rate variability prior to ulcerative colitis flares [17]. EnLisense, a sweat-sensing device, continuously monitors users' interleukin (IL)-1 beta and C-reactive protein concentrations in sweat to identify IBD flares [18]. Wearable technologies also offer great opportunities for nutrition monitoring through pressure, auditory, visual, motion, and metabolic biomarker sensors. Patients with colorectal or gastric GI conditions often suffer from suboptimal calorie intake, whether from symptoms that hinder oral intake or from iatrogenic causes such as certain medications or surgical interventions. Only 25% of hospitalized patients receive sufficient calories, and 20% are at risk of malnutrition [19]. However, overnutrition is also a challenge for some patients, such as those with metabolic dysfunction-associated steatotic liver disease (MASLD). Considerable effort has therefore gone into using wearables to monitor and support appropriate nutritional intake.

Food Intake Monitoring Technologies

A novel approach to measuring food intake involves a force sensor embedded in a tablecloth or underneath a table. Researchers have demonstrated that a dining table augmented with two layers of weight and radio frequency identification (RFID) antenna sensor surfaces can track food consumption by tracking changes in food weight and detecting movement patterns of food containers or plates labeled with RFID tags. Although this approach is not wearable, the weight-RFID matching algorithm could measure the amount of food consumed by each person at the table with an accuracy of approximately 80% [20]. Acoustic-based food intake wearable devices use the sounds produced by chewing and swallowing events to estimate the amount of food ingested with each bite. A study that examined 504 habitual bites of eight healthy participants via an ear-pad chewing sound sensor demonstrated an average food-classification accuracy of 94% using sound-based chewing recognition. However, the study was limited in that acoustic chewing recognition applied only to solid foods [21]. The eButton is an example of a visual approach that uses a wearable camera embedded in a button. It captures images of the food in front of the wearer and estimates the calories and nutrients of the meal through a linked dietary database [19].

Other studies have developed dietary assessment systems that detect food portions and quantitative nutrition information through three-dimensional (3D) reconstruction using a smartphone camera [22,23,24]. These systems estimate food volume before and after a meal via 3D reconstruction and then calculate the nutrients consumed based on the United States Department of Agriculture Food and Nutrient Database for Dietary Studies. The advancement of artificial intelligence (AI) has markedly improved image-based food recognition (see section on AI). There have also been efforts to analyze dietary intake by monitoring the motion of the eating process, such as wrist motion. Gyroscope-equipped wearables that track wrist motion had 94% sensitivity in detecting bites in a controlled meal setting and 86% sensitivity in an uncontrolled setting [25]. Another innovative method uses accelerometers embedded in wrist-worn and head-mounted devices, such as an eyewear frame. Using two accelerometers simultaneously, one capturing hand-to-mouth motion and the other capturing head motion from chewing, achieved accuracies of 89.5%-95.1% in detecting eating duration [26, 27]. Furthermore, electromyography has expanded the field of automatic dietary assessment by measuring physiological changes [28]. Motion-induced electrical impedance between the electrodes measures skeletal muscle activity to sense chewing and biting events [29]. Similarly, piezoelectric devices measure the voltage generated in response to the mechanical stress of chewing solid food [30,31,32].
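To make the motion-based approach concrete, the sketch below counts bites from a single wrist-gyroscope channel using simple peak detection. It is a minimal illustration, not the published algorithms behind the devices cited above; the sampling rate, threshold, and inter-bite interval are assumptions chosen for the synthetic example.

```python
"""Illustrative bite counting from wrist-worn gyroscope data (not the
algorithms from refs [25-27]): a hypothetical 1-D wrist roll-velocity
signal is scanned for prominent peaks separated by a refractory period."""
import numpy as np
from scipy.signal import find_peaks

FS_HZ = 15             # assumed gyroscope sampling rate (Hz)
MIN_BITE_GAP_S = 8.0   # assumed minimum interval between bites (s)
ROLL_THRESHOLD = 30.0  # assumed roll-velocity threshold (deg/s)

def count_bites(roll_velocity: np.ndarray) -> int:
    """Count candidate bites as prominent, well-separated peaks."""
    peaks, _ = find_peaks(
        roll_velocity,
        height=ROLL_THRESHOLD,
        distance=int(MIN_BITE_GAP_S * FS_HZ),
    )
    return len(peaks)

if __name__ == "__main__":
    # Synthetic 5-minute recording: noise plus a few simulated wrist rolls.
    rng = np.random.default_rng(0)
    t = np.arange(0, 300, 1 / FS_HZ)
    signal = rng.normal(0, 5, t.size)
    for bite_time in (30, 75, 120, 180, 240):   # simulated bite events (s)
        idx = int(bite_time * FS_HZ)
        signal[idx:idx + FS_HZ] += 60 * np.hanning(FS_HZ)
    print("Detected bites:", count_bites(signal))
```

In practice, published systems combine multiple sensor channels and learned classifiers rather than a single fixed threshold, but the peak-and-refractory logic above conveys the basic idea.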

Despite the current utilization and growing potential of wearable devices, challenges remain in wearable technology for nutrition monitoring. Notably, most of these approaches have only been studied in laboratory settings, and investigation of real-life applications is needed. For example, devices that rely on sound signals will need to demonstrate their utility in environments with background noise. Accuracy is also a serious obstacle, as the performance of signal processing schemes tends to decline as the number of foods increases. For instance, using acoustic chewing recordings, researchers were able to classify three types of food with 94% accuracy; however, the accuracy decreased to 87% when the number of foods increased to four [21]. Lastly, enhancing the comfort of the wearables and the security of continuously monitored data are also areas for further development.

Nutrients Monitoring Technologies

Wearable biochemical sensors can non-invasively provide nutritional information at the molecular level. Integrated with electrochemical assays, wearable devices have been introduced to measure macro- and micronutrients. Multiple studies have shown success in tracking glucose levels in interstitial fluid and sweat [33,34,35]. A novel flexible, tattoo-like iontophoretic system was able to measure interstitial fluid glucose levels that followed the same trend as blood glucose levels [33]. Similarly, microfluidic sweat-sensing devices have been able to detect a rise in sweat glucose concentrations after meals [36, 37]. Scientists have also measured tyrosine in sweat using a laser-engraved multimodal wearable sweat sensor capable of efficient and sensitive molecular sensing from microfluidically sampled sweat; the device detected higher tyrosine levels in sweat after a high-protein diet [38]. A patch sensor has also been developed to measure vitamin C concentrations in sweat, taking advantage of vitamin C's electron transfer capability [39]. The sensor reliably tracks vitamin C concentrations in sweat, which correlate with blood vitamin C levels [40]. A sensor fabricated on the nose bridge of eyeglasses demonstrated a stable rise in vitamins B2 and B6 in tears using square wave voltammetry, an electrochemical technique that measures the redox activity of electroactive molecules [41, 42].
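To illustrate how a raw electrochemical readout might be turned into a nutrient concentration, the sketch below fits a simple linear calibration curve and inverts it. It is a generic illustration with invented calibration pairs, not the signal processing used by any of the sensors cited above.

```python
"""Illustrative linear calibration of a sweat biosensor: a hypothetical
amperometric vitamin C sensor whose current (nA) scales roughly linearly
with concentration (uM); all calibration values are invented."""
import numpy as np

# Hypothetical calibration standards: concentration (uM) -> current (nA)
cal_conc_uM = np.array([0.0, 25.0, 50.0, 100.0, 200.0])
cal_current_nA = np.array([1.2, 8.9, 17.4, 33.8, 66.5])

# Least-squares fit of current = slope * concentration + intercept
slope, intercept = np.polyfit(cal_conc_uM, cal_current_nA, deg=1)

def current_to_concentration(current_nA: float) -> float:
    """Invert the calibration line to estimate analyte concentration."""
    return (current_nA - intercept) / slope

print(f"Sensitivity: {slope:.3f} nA/uM")
print(f"Estimated concentration at 40 nA: {current_to_concentration(40.0):.1f} uM")
```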

Gastrointestinal Monitoring Technologies

An electrogastrogram (EGG) is a non-invasive device that continuously measures gastric electrical activity, analogous to an electrocardiogram. By monitoring GI smooth muscle contractions initiated and coordinated by underlying rhythmic bioelectric patterns, an EGG can provide useful information to clinicians and patients for managing disorders such as functional dyspepsia, gastroparesis, and irritable bowel syndrome (IBS) [43, 44]. G-Tech Medical uses a wireless patch system to acquire gastric myoelectrical activity and has successfully identified patients who might be at increased risk of developing delayed gastric emptying after pancreaticoduodenectomy; decreased gastric myoelectrical activity was associated with delayed tolerance of a regular diet. Additionally, preliminary studies have shown potential benefit of acoustic gastrointestinal biosensors for monitoring postoperative ileus and delayed gastric emptying following abdominal surgery. An embedded microphone adheres to the abdominal wall and monitors bowel sounds via vibration and sound signals [45]. These monitoring technologies could help clinicians identify individuals who might have difficulty advancing their diet after surgery and allow for individualized feeding regimens [46].
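As a simple illustration of how continuously recorded gastric myoelectrical data might be summarized, the sketch below band-pass filters a synthetic EGG trace around the normal gastric slow-wave range (roughly 3 cycles per minute) and reports the dominant frequency. It is a generic signal-processing example, not the algorithm used by G-Tech Medical or any other vendor; the sampling rate and band limits are assumptions.

```python
"""Illustrative extraction of the dominant gastric slow-wave frequency
from a synthetic EGG trace (normal gastric slow wave is ~3 cycles/min)."""
import numpy as np
from scipy.signal import butter, filtfilt, periodogram

FS_HZ = 2.0  # assumed EGG sampling rate; ample for ~0.05 Hz activity

def dominant_cpm(egg: np.ndarray, fs: float = FS_HZ) -> float:
    """Return the dominant frequency (cycles/min) in the 1-9 cpm band."""
    low, high = 1 / 60, 9 / 60                       # 1-9 cpm expressed in Hz
    b, a = butter(2, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, egg)
    freqs, power = periodogram(filtered, fs=fs)
    band = (freqs >= low) & (freqs <= high)
    return freqs[band][np.argmax(power[band])] * 60

if __name__ == "__main__":
    t = np.arange(0, 600, 1 / FS_HZ)                 # 10-minute synthetic trace
    egg = np.sin(2 * np.pi * (3 / 60) * t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
    print(f"Dominant slow-wave frequency: {dominant_cpm(egg):.1f} cpm")
```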

Ingestible sensors have also demonstrated their value in various clinical applications. They settle on the GI mucosal lining and monitor pH, pressure, temperature, and luminal contents, providing valuable data on ingestion and motility [47]. Several studies have shown the advantages of ingestible electronic devices equipped with probiotic biosensors. By providing information on gases produced by the gut microbiota, they can advance diagnostic and monitoring tools, especially for assessing intestinal responses to dietary changes in patients with diverse GI conditions. These features, coupled with the direct-to-consumer home tests discussed below, could potentially advance the sophistication of personalized diets [48,49,50,51].

Artificial Intelligence

AI is a technology that enables computers to mimic human cognitive processes, such as learning, problem-solving, and self-correction [52]. Machine learning (ML) is a subset of AI that uses algorithms to learn from data and apply the learned knowledge to new data without explicit instruction (Fig. 1A) [53]. In supervised learning, a model learns from pre-labeled data to make predictions or classifications on unseen data, which forms the basis of most ML applications [52]. In contrast, unsupervised learning uses unlabeled data, allowing a model to autonomously identify commonalities within datasets and form clusters, which is particularly valuable for researchers exploring unknown patterns within data [53]. Semi-supervised learning uses both labeled and unlabeled data and is particularly beneficial in scenarios where labeled data are difficult to obtain but unlabeled data are plentiful [54]. Deep learning (DL) is a subset of ML that uses neural networks as its backbone [55]. The neural network architecture resembles the human brain, where each node represents a neuron and passes signals to the next layer once it is activated (Fig. 1B). The flexibility and depth of these layers allow DL models to process large datasets and perform more complex tasks such as image recognition and natural language processing [56]. AI has clinical relevance for GI nutrition in four main areas: diagnostic assistance, risk prediction, personalized nutrition, and dietary assessment [53, 57].

Fig. 1 A Subcategories of artificial intelligence; B Deep learning model architecture
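For readers less familiar with the taxonomy in Fig. 1A, the toy example below contrasts supervised and unsupervised learning using scikit-learn on synthetic data; the features merely stand in for nutritional or clinical inputs and are not drawn from any real dataset.

```python
"""Toy contrast between supervised and unsupervised learning on
synthetic data; features are placeholders for any nutritional inputs."""
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score

# Synthetic dataset: 500 samples, 10 features, binary label
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Supervised learning: fit on labeled data, predict labels for unseen data
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Supervised test accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Unsupervised learning: no labels, the model groups samples into clusters
clusters = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(X)
print("Unsupervised cluster sizes:", [int((clusters == k).sum()) for k in (0, 1)])
```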

Diagnostic Assistance

One diagnostic application of AI in GI nutrition is facilitating prompt malnutrition screening. Due to GI symptoms and fear of food triggers, individuals with GI issues are vulnerable to malnutrition and micronutrient deficiencies, which may exacerbate their conditions. Timely detection of and intervention for malnutrition in patients with GI issues, such as IBD, is associated with improved clinical outcomes [58]. In current practice, several nutritional assessment tools are used for malnutrition screening, including the Malnutrition Universal Screening Tool (MUST), Nutritional Risk Screening 2002, and Malnutrition Screening Tool. However, these tools may underreport malnutrition cases due to the limited variables they assess [59]. With AI's capacity to process large and complex datasets, AI-based malnutrition screening tools could incorporate various data sources to achieve more timely and accurate results. Timsina et al. developed an ML model named MUST-Plus, trained on a broad array of clinical data, including physiologic data, laboratory results, electrocardiogram readings, and more, to predict malnutrition in adult patients admitted to a large tertiary hospital between January 2017 and July 2018; its performance was compared with the classic MUST nutritional assessment tool [59]. MUST-Plus outperformed the classic MUST, with 30% higher sensitivity, 6% higher specificity, and a 17% higher area under the receiver-operating characteristic curve. In the future, similar models could be developed and evaluated in patients with GI conditions. In addition, malnutrition signs can be detected during a nutrition-focused physical exam, such as fat loss in the orbital region, muscle loss in the temple region, and other abnormal findings, including cyanosis, red scaly rash, nail clubbing, and sparse hair [60]. Given AI's impressive performance in GI diagnosis through endoscopic image analysis (e.g., polyp detection, diagnosis of celiac disease, and IBD severity assessment), future research could also explore developing AI-based malnutrition screening tools that analyze GI patients' facial or full-body images [53].
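A hypothetical sketch of the kind of model described above is shown below: a gradient-boosting classifier trained on synthetic records to flag malnutrition risk, with its cross-validated AUC compared against a single-feature baseline. The feature names and labels are invented for illustration and do not reproduce MUST-Plus [59].

```python
"""Illustrative ML malnutrition-screening classifier on synthetic data;
feature names (BMI, albumin, weight change, etc.) are assumptions."""
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 1000
df = pd.DataFrame({
    "bmi": rng.normal(24, 5, n),
    "albumin_g_dl": rng.normal(3.8, 0.6, n),
    "weight_change_pct_6mo": rng.normal(-1, 4, n),
    "crp_mg_l": rng.lognormal(1.0, 1.0, n),
    "age": rng.integers(18, 90, n),
})
# Synthetic label loosely tied to low BMI, low albumin, and weight loss
risk = (-0.3 * df["bmi"] - 1.5 * df["albumin_g_dl"]
        - 0.2 * df["weight_change_pct_6mo"] + rng.normal(0, 2, n))
y = (risk > np.percentile(risk, 75)).astype(int)

model = GradientBoostingClassifier(random_state=0)
auc_full = cross_val_score(model, df, y, cv=5, scoring="roc_auc").mean()
auc_bmi = cross_val_score(model, df[["bmi"]], y, cv=5, scoring="roc_auc").mean()
print(f"AUC, all features: {auc_full:.2f}; AUC, BMI only: {auc_bmi:.2f}")
```

Real screening models would of course be trained on curated electronic health record data with clinically validated labels, but the comparison of a multivariable model against a narrow baseline mirrors the MUST-Plus evaluation strategy.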

Risk Prediction

Diet is a well-recognized contributor to the development of many chronic diseases, such as cardiovascular disease and type 2 diabetes [61]. Research suggests that the overall dietary pattern has a greater influence on health outcomes than individual food items, largely due to the synergistic effects of foods [62]. However, when studying diet-disease associations, traditional statistical methods usually focus on single nutrients and food items or use composite nutrition indices, such as the Alternate Healthy Eating Index, the Mediterranean Diet Score, and the Dietary Approaches to Stop Hypertension diet score, to represent dietary patterns, which may underestimate nutrition's impact on health outcomes [63]. With the advancement of AI technology capable of processing large datasets, it becomes feasible to incorporate comprehensive nutrition data (e.g., all the macronutrients and micronutrients in a 24-h dietary recall) when investigating diet-disease relationships [64]. Panaretos et al. compared the performance of traditional statistical methods and ML models in investigating the association between dietary patterns and 10-year cardiometabolic risk [63]. Data from 2,020 participants in the ATTICA study (2001–2002 and 2011–2012) were used for model training and testing. The ML models outperformed linear regression, suggesting that AI could offer a novel and more reliable approach to nutritional epidemiology. Additionally, Rigdon and Basu compared how the amount and type of nutritional input data affected an ML model's ability to predict cardiovascular mortality risk [64]. In this study, complete nutrition data from the National Health and Nutrition Examination Survey (NHANES) dietary recall were compared with composite nutrition indices, including the Healthy Eating Index, the Alternate Healthy Eating Index, the Mediterranean Diet Score, and the Dietary Approaches to Stop Hypertension diet score. Six waves of NHANES data (1999–2000 through 2009–2010) were used for model training. Adding the composite nutrition indices produced no significant improvement, whereas prediction ability significantly improved when the complete nutrition data were added to the ML models. Currently, in clinical practice, the use of nutritional indices is preferred over the analysis of raw nutrition data in nutritional risk assessment. In the future, AI models may reveal more intricate relationships between nutrition and disease development by leveraging comprehensive nutritional data, thus enhancing risk prediction accuracy.
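The contrast Rigdon and Basu draw between composite indices and complete nutrient data can be illustrated with a small synthetic experiment: below, the same classifier is cross-validated once with only a crude composite diet score and once with the full simulated nutrient vector. All variables and effect sizes are invented; the point is only to show the mechanics of such a comparison, not to reproduce the NHANES analysis.

```python
"""Illustrative comparison of composite-index vs full-nutrient inputs
for risk prediction on synthetic data (not the analysis in ref [64])."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n, n_nutrients = 2000, 30
nutrients = rng.normal(size=(n, n_nutrients))            # simulated nutrient intakes
# Outcome depends on a nonlinear combination of a few nutrients
risk = nutrients[:, 0] * nutrients[:, 1] - nutrients[:, 2] ** 2 + rng.normal(0, 1, n)
y = (risk > np.quantile(risk, 0.8)).astype(int)
diet_index = nutrients[:, :5].mean(axis=1, keepdims=True)  # crude composite score

model = RandomForestClassifier(n_estimators=200, random_state=0)
auc_index = cross_val_score(model, diet_index, y, cv=5, scoring="roc_auc").mean()
auc_full = cross_val_score(model, nutrients, y, cv=5, scoring="roc_auc").mean()
print(f"AUC with composite index only: {auc_index:.2f}")
print(f"AUC with full nutrient vector:  {auc_full:.2f}")
```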

Personalized Nutrition

With the recent development of next-generation sequencing and increased interest in the impact of the microbiome on gut health, there has been growing attention to personalized nutrition. It entails tailoring nutritional advice by taking into account a multitude of personal data, such as genetics/biomarkers, physical activity/stress levels, sleep patterns, and diet [65]. Multiple studies have shown that the gut microbiome is extensively associated with disorders of gut-brain interaction, such as IBS, functional dyspepsia, and functional constipation [66,67,68,69,70,71]. Hence, more direct-to-consumer products, such as wearable blood sugar sensors, breath testers, and stool collection kits, have become available on the market. They are often integrated into commercial mobile apps and provide users with insight about their gut health, along with personalized recommendations. However, independent and well-designed studies are still needed to validate the utility of these suites of products.

Given their cost-effectiveness and impact on the growth of particular microbial species, various diets have been formulated to manage IBS symptoms [72, 73]. Unfortunately, the outcomes of these dietary interventions are often inconsistent [73]. One explanation is the uniqueness of each individual's gut microbiota composition and the individual-specific responses to diet, suggesting that a single generic dietary intervention may not work for all and that precision nutrition tailored to each person's gut microbiome profile is preferable [72]. Additionally, a retrospective study demonstrated that gut microbiome taxa are a stronger predictor of disorders of gut-brain interaction status than genomic biomarkers [74]. However, traditional computational approaches are limited in their ability to process extensive microbiota data for diet personalization; thus, many diets have been designed either to avoid common food triggers or to supply nutrients that feed beneficial taxa, such as gluten-free and dairy-free diets, the low FODMAP diet, low-inflammatory diets, and high-fiber diets [73]. Meydan et al. conducted a metagenome-guided study to modulate participants' gut microbiota composition using interventions including a gluten-free and dairy-free diet; the interventions were not associated with significant symptomatic improvement [75]. On the other hand, in a study by Karakan et al., ML-designed diets based on individuals' distinct gut microbiome profiles were superior to standard IBS diets for improving symptoms [76]. With the advent of AI technology that can process vast amounts of data, personalized diets formulated from an individual's distinct gut microbiota composition become feasible. This approach holds promise not only for managing IBS but also for addressing a broader range of GI conditions.
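As a schematic of what microbiome-informed diet personalization might look like computationally, the sketch below trains a classifier on hypothetical relative-abundance features to predict response to a low FODMAP diet. The taxa, labels, and outcome are synthetic placeholders and do not reflect the method of Karakan et al. [76].

```python
"""Illustrative microbiome-based diet-response classifier on synthetic
relative-abundance data; taxa and labels are placeholders."""
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)
taxa = ["Bacteroides", "Prevotella", "Faecalibacterium", "Bifidobacterium", "Ruminococcus"]
X = pd.DataFrame(rng.dirichlet(np.ones(len(taxa)), size=400), columns=taxa)
# Synthetic label: "responders" loosely enriched in two taxa
y = ((X["Bacteroides"] + X["Bifidobacterium"] + rng.normal(0, 0.1, 400)) > 0.45).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("Held-out AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 2))
print("Top predictive taxa:", dict(zip(taxa, clf.feature_importances_.round(2))))
```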

Dietary Assessment

As the initial step of nutrition therapy, dietary assessment lays the foundation for nutritional diagnosis and subsequent intervention. For patients with GI issues, such as IBD or IBS, periodic dietary assessment is necessary since they are more prone to malnutrition or macro- and micronutrient deficiency [77]. An accurate dietary assessment is crucial for ensuring prompt and effective nutritional intervention. Current assessment methods rely primarily on questionnaires, including the 24-h dietary recall, food frequency questionnaire, and diet records, which depend on the individual's memory and estimation of portion sizes and are susceptible to bias. Moreover, manually recording dietary items and portions can be tedious and burdensome, potentially leading patients to intentionally avoid certain foods for easier recording, thereby exacerbating their health conditions. With the development of AI, especially the impressive performance of DL models in image classification and their ability to handle large datasets, several researchers have explored developing objective and accurate tools to assist dietary assessment. Arslan et al. developed a DL model for automatic food recognition and tested its performance on a public database of 12,740 food images, UEC Food-100 [78]. The DL model outperformed other previously tested models, with best-shot and average 5-trial accuracies of 90.0% and 88.9%, respectively.
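A common recipe for this kind of food-image classifier is transfer learning from a pretrained backbone. The sketch below fine-tunes a ResNet-18 on an assumed folder of food photos; the directory layout, class count, and training schedule are placeholders, and this is not the architecture reported by Arslan et al. [78].

```python
"""Illustrative transfer-learning food-image classifier: a pretrained
ResNet-18 fine-tuned on an assumed ImageFolder-style photo directory."""
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 100         # e.g., UEC Food-100 covers 100 food categories
DATA_DIR = "food_images"  # assumed layout: food_images/<class_name>/*.jpg

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder(DATA_DIR, transform=transform)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

# Pretrained backbone; only the final layer is replaced for food classes
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                       # short fine-tuning run
    for images, labels in train_dl:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.3f}")
```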

Liu et al. introduced a multiple-dish food recognition DL model [79]. Setting itself apart from other models, this system is designed to identify not only single-food or single-dish images but also multiple-dish images [80, 81]. The researchers constructed a novel database comprising images of Taiwanese cuisine sourced from local eateries to train and evaluate their model. Results showed an accuracy of 87% and a mean average precision of 90%. When assessed with an open dataset of Indian cuisine, the model maintained an accuracy of 80%, suggesting consistent performance across different datasets. Lu et al. went further to develop an AI-based system called goFOOD™, which not only identifies foods from a meal image but also estimates the food items' volumes, calories, and macronutrients using a nutrient database [82]. In goFOOD™, the dietary content is analyzed from two pictures taken at different angles, 90° and 75° from the table's plane. After the food items are recognized and their volumes estimated, each item's nutrient information is retrieved from the Nutritionix Database, and the whole meal's calories and macronutrient amounts are provided. The model was tested with the MADiMa and fast-food databases, and its nutrient estimation was compared against two experienced dietitians. The results showed that the dietitians did not significantly outperform goFOOD™ on the MADiMa dataset.
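The final step of such a pipeline, converting a recognized food item and its estimated volume into calories and macronutrients, can be sketched as a simple lookup and scaling, as below. The density and composition values are rough placeholders rather than entries from the Nutritionix Database.

```python
"""Illustrative volume-to-nutrient step of an image-based pipeline:
recognized label + estimated volume -> calories and macros via an
assumed per-100 g composition table (values are rough placeholders)."""
from dataclasses import dataclass

@dataclass
class FoodComposition:
    density_g_per_ml: float   # converts estimated volume to mass
    kcal_per_100g: float
    protein_g_per_100g: float
    carbs_g_per_100g: float
    fat_g_per_100g: float

# Hypothetical lookup table keyed by the recognized food label
COMPOSITION = {
    "cooked_rice": FoodComposition(0.90, 130, 2.7, 28.0, 0.3),
    "grilled_chicken": FoodComposition(1.05, 165, 31.0, 0.0, 3.6),
}

def nutrients_for(label: str, volume_ml: float) -> dict:
    """Convert an estimated portion volume into calories and macros."""
    comp = COMPOSITION[label]
    grams = volume_ml * comp.density_g_per_ml
    scale = grams / 100.0
    return {
        "grams": round(grams, 1),
        "kcal": round(comp.kcal_per_100g * scale, 1),
        "protein_g": round(comp.protein_g_per_100g * scale, 1),
        "carbs_g": round(comp.carbs_g_per_100g * scale, 1),
        "fat_g": round(comp.fat_g_per_100g * scale, 1),
    }

# Example: a recognized 200 ml portion of cooked rice
print(nutrients_for("cooked_rice", 200.0))
```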

Several other studies have developed diverse DL models for automatic food recognition and volume estimation, with accuracies ranging from roughly 60% to 90% [81, 83]. Despite the promising results, research in food recognition remains limited by several obstacles: a lack of large, high-quality food datasets; the fact that the same food can take different forms; the difficulty of detecting combined ingredients from images; and intraclass (similar foods with different appearances under different conditions) and interclass (different foods with similar appearances) variance [78, 84]. While a fully automatic dietary assessment tool may be hard to achieve in the foreseeable future, ongoing efforts to construct food datasets and enhance AI algorithms make a more objective and accurate assistive tool likely, from which GI patients can benefit through more prompt, accurate, and less time-consuming dietary assessments. In addition, such tools may help dietitians identify food triggers and determine alternative foods. Furthermore, the enhanced accuracy of dietary assessments and records can contribute to high-quality nutritional datasets, which, in turn, can be used in other applications such as risk prediction.

Telehealth in Gastroenterology and Nutrition

Telehealth, or telemedicine, is the use of telecommunication technologies and electronic information to provide or extend care when the patient and the clinician are not physically in the same place at the same time [85]. Telehealth includes both synchronous and asynchronous communications [86, 87]. Examples of synchronous telehealth activities are live or “real-time” videoconferencing and audio-only phone communications [86]. Examples of asynchronous telehealth activities are sending messages via healthcare apps and remote monitoring using digital technologies such as wearable devices [86]. Because of the rapid adoption of telehealth during the coronavirus disease 2019 (COVID-19) pandemic, telehealth has become a mainstream treatment modality. However, best practices for telehealth are currently undefined and non-standardized [5].

While most patients perceive the quality of virtual care to be similar to that of in-person care, the main concern regarding telehealth has been the lack of physical examinations [88, 89]. Adaptations and strategies have been proposed to effectively perform a nutrition-focused physical exam via telehealth [90, 91]. For example, a nutrition-focused physical exam performed in a videoconference requires the clinician to ask pointed, probing questions along with verbal instructions for effective visual inspection of body assessment sites. It has been suggested that telehealth may be better suited for return or follow-up visits, especially for patients who had an initial in-person visit with their care providers.

It has been proposed that telehealth and remote patient monitoring could help ease financial burdens for patients with GI diseases [92]. Remote technologies remove the need for missed time at work and the cost of travel for repeated clinic visits over time. Remote patient monitoring frequently uses web and mobile applications, with common features including symptom tracking, medication logs/reminders, dietary intake records, and provision of patient education materials. In a review of studies examining various telehealth models that included remote patient monitoring features in the treatment of IBD, the authors found that the use of web-based applications improved patient-reported quality of life and medication adherence and decreased patient healthcare costs [92]. While telehealth and remote technologies improve care and decrease healthcare costs for patients, the benefits for clinicians are not as well defined. There are concerns about extra workload, such as non-reimbursable work created by remote monitoring. In addition, the integration of monitoring data into electronic health records is inconsistent or non-existent, creating inherent inefficiencies in current telehealth systems [92]. Table 2 summarizes the advantages and disadvantages of telehealth care delivery in gastroenterology and nutrition care [5, 91,92,93,94,95,96,97].

Table 2 Telehealth use in gastroenterology: advantages and disadvantages [5, 89,90,91,92,93,94,95]

Policies regarding healthcare financing and payment models (especially those determined by the Centers for Medicare and Medicaid Services) strongly influence how care is delivered [87]. Currently, much of healthcare is paid through a fee-for-service reimbursement model in which procedural services (e.g., colonoscopy) and direct patient-facing time are reimbursable [87, 98]. On the other hand, cognitive services (e.g., asynchronous work such as responding to patient messages and coordination of care) are either poorly reimbursed or not reimbursed at all [98]. The long-term impacts of such financial policies can be seen in the progressive divestment from traditional nutrition support teams (typically made up of a physician, nurse, dietitian, and pharmacist) over the decades [98, 99]. Patients with severe, chronic GI conditions, such as intestinal failure, require intravenous fluids and parenteral nutrition. Ideally, a multidisciplinary nutrition support team, often led by a gastroenterologist, would manage the safe prescribing, compounding, delivery, monitoring, and adjustment of these therapies. Because the management of nutrition support has few sources of direct revenue-generating activity and poor reimbursement rates, administrative backing for nutrition support teams has declined over the years, often resulting in the restructuring, consolidation, or closure of service lines [98].

Potential Concerns with Digital Health

Balanced against the many benefits of digital health for GI nutrition are several potential drawbacks. For one, mobile phones and applications give third-party organizations virtually unfettered access to our daily activities, preferences, and lives. Data harvesting practices, particularly by companies that rely on individualized data for targeted advertisements, have led to privacy concerns and multiple lawsuits surrounding data privacy [100]. Because most health or nutrition applications are not entities covered by the Health Insurance Portability and Accountability Act (HIPAA), users' health information collected and stored by third parties is at increased risk of misuse, as well as intentional and unintentional access by others. There are additional questions of data ownership: who owns the health data collected by mobile developers, cloud-based servers, or companies that assay biospecimens as a non-clinical service (e.g., microbiome, genealogy)?

As powerful as AI-based technologies may be, they carry ethical and practical risks of bias. The quality of AI models is highly dependent on the quality of the data used to train them. Training datasets that contain inherently biased data could lead to AI models that perpetuate or even amplify those biases. For example, if developers make biased assumptions about how symptom prevalence varies with ethnicity, the resulting AI algorithms would produce unfair and inaccurate results for the affected population. Besides race-, sex-, or culture-driven biases, these can include compromised generalizability related to other biological parameters (e.g., anthropometrics, nutrient intake, laboratory values). As noted by the United States Office of Minority Health, "healthcare algorithms and AI bias can contribute to existing health disparities for certain populations based on race, ethnicity, gender, age, or other demographic factors [101]." As new technologies are developed and implemented, novel problems, often ethical in nature, will arise. Healthcare providers using digital health technology should be aware of these concerns and weigh the risks and benefits of these technology-driven treatment options.

Conclusions

Similar to how computers revolutionized the practice of medicine, advances in digital health have created incredible opportunities to disrupt care delivery in GI nutrition. The ubiquity of smartphones and other mobile devices provides widespread and direct access to patients. Mobile applications have evolved beyond passive sources of health information into highly sophisticated health tools that now enable interactive healthcare, even independently of the clinician expert. Their capabilities are significantly enhanced by on-device or cloud-based AI, which gives devices the ability to analyze very large and complex arrays of data to aid with diagnostics, risk prediction, and dietary assessments. Moreover, the advent of wearables, with their continuous, multimodal, and real-time collection of health data (e.g., temperature, heart rhythm, glucose, sweat, activity, sleep), now offers unprecedented insight into an individual's health and lifestyle. As digital health for GI nutrition continues to evolve, the future holds much promise for boosting healthcare access, enhancing patient care, improving clinical outcomes, and translating the concept of personalized nutrition into reality.