A Gratifying Step forward for the Application of Artificial Intelligence in the Field of Endoscopy: A Narrative Review

Xu, Yixin MD*; Tan, Yulin MD*; Wang, Yibo MD*; Gao, Jie MD; Wu, Dapeng MD; Xu, Xuezhong MD*

Surgical Laparoscopy, Endoscopy & Percutaneous Techniques: April 2021 - Volume 31 - Issue 2 - p 254-263
doi: 10.1097/SLE.0000000000000881

INTRODUCTION

Global cancer statistics indicated that, in 2018 alone, there were over 3 million new cases of and 2 million deaths from gastrointestinal (GI) malignancies, including esophageal, gastric, and colorectal cancers.1 Given the characteristics of the hollow viscera, endoscopy is the optimal diagnostic method for GI diseases.2 With advances in medical technology, numerous novel endoscopic techniques have emerged, including magnifying narrow-band imaging endoscopy, magnifying chromoendoscopy, endocytoscopy, confocal endomicroscopy, laser-induced fluorescence spectroscopy, autofluorescence endoscopy, and white light endoscopy.3 Endoscopy is now used extensively in clinical gastroenterology.4 Earlier diagnosis results in a better prognosis; consequently, although the incidence of GI cancer is rising, mortality is declining.5

Artificial intelligence (AI) was first introduced in 1956.6 AI involves the application of computer systems to simulate and extend human intelligence. However, traditional machine learning (ML), one means of achieving AI, has limitations, including inefficient and incomplete feature extraction. The advent of deep learning (DL) methods, such as the convolutional neural network (CNN), has partially compensated for these disadvantages.7,8 Compared with traditional ML, CNNs extract more details and features at the pixel level and adapt better to the complex clinical environment.8 The development of AI is shown in Figure 1. Currently, the application of CNNs in endoscopy consists primarily of computer-assisted detection (CADe) and diagnosis (CADx) systems that assist endoscopists with polyp detection, early neoplasia detection, and Helicobacter pylori identification.9

FIGURE 1: The development of artificial intelligence. Since an early flush of optimism in the 1950s, successively smaller subsets of artificial intelligence, first machine learning and then deep learning (a subset of machine learning), have created ever larger disruptions.
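
For readers unfamiliar with the terminology above, the sketch below shows, in a few lines of Python (PyTorch), what a CNN-based frame classifier of the CADe/CADx type looks like in its simplest form. It is purely illustrative: the architecture, class labels, and input size are arbitrary assumptions and do not correspond to any model cited in this review.

```python
# Minimal, illustrative CNN for frame-level endoscopic image classification
# (lesion vs. normal). A sketch of the general CADe/CADx idea only; it is NOT
# the architecture of any study cited in this review.
import torch
import torch.nn as nn

class TinyEndoscopyCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Convolutional layers learn pixel-level features (edges, texture,
        # vascular patterns) directly from the image.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)
        return self.classifier(x)

if __name__ == "__main__":
    model = TinyEndoscopyCNN()
    frame = torch.randn(1, 3, 224, 224)        # one RGB endoscopic frame
    probs = torch.softmax(model(frame), dim=1)
    print(probs)                               # e.g. [[p_normal, p_lesion]]
```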

Despite the widespread use of endoscopy, endoscopists remain insufficient in both number and skill, and poorly performed endoscopy can result in missed GI cancers.10 A considerable number of the studies reviewed here show that CAD systems have high sensitivity and specificity in the diagnosis of GI diseases (Fig. 2). However, controversy remains over whether the performance of AI is better than that of human endoscopists, and whether it can truly improve their work efficiency and alleviate their fatigue.

FIGURE 2: The application of artificial intelligence in the field of endoscopy.

In this review, we assess the current status of AI applications in the field of endoscopy and provide perspectives on this technology.

SEARCH STRATEGY

Online databases (PubMed, Web of Science, and EMBASE) were searched for relevant studies published between 2000 and 2020. The following subject terms were used as keywords: artificial intelligence, computer-aided, machine learning, convolutional neural networks, deep learning, and endoscopy. All included studies were published within the last 3 years.

In addition, we briefly listed all relevant studies about AI technology applied in endoscopy in Table 1.

TABLE 1 - Relevant Studies About AI Technology Applied in Endoscopy
Field References Country Model Disease Training Material Diagnostic Performance of AI System Diagnostic Performance of Human Endoscopists Increase of Diagnostic Accuracy after AI Training
Esophagus Sehgal et al11 UK ML BE Videos Accuracy: 92%; sensitivity: 97%; specificity: 88% Accuracy: 60%; sensitivity: 76%; specificity: 48% Accuracy: 66%; sensitivity: 83%; specificity: 54%
Ebigbo et al12 Germany CAD-DL BE Images Database1: sensitivity: 97%; specificity: 88%. Database2: sensitivity: 92%; specificity: 100% Database1: sensitivity: 76%; specificity: 80%. Database2: sensitivity: 99%; specificity: 78% NA
de Groof et al13 The Netherlands DL Neoplasia in BE Images Database1: accuracy: 89%; sensitivity: 90%; specificity: 88%. Database2: accuracy: 88%; sensitivity: 93%; specificity: 83% NA NA
Hashimoto et al14 Japan CNN Neoplasia in BE Images Accuracy: 95.4%; sensitivity: 96.4%; specificity: 94.2% NA NA
Ebigbo et al15 Germany CNN Cancer in BE Images Accuracy: 89.9%; sensitivity: 83.7%; specificity: 100.0% NA NA
Cai et al16 China CNN ESCC Images Accuracy: 91.4%; sensitivity: 97.8%; specificity: 85.4%; PPV: 86.4%; NPV: 97.6% Accuracy: 81.7%; sensitivity: 74.2%; specificity: 88.8%; PPV: 87.0%; NPV: 79.3% Accuracy: 91.1%; sensitivity: 89.2%; specificity: 92.9%; PPV: 92.3%; NPV: 90.4%
Everson et al17 UK CNN ESCC Images Accuracy: 93.7%; sensitivity: 89.3%; specificity: 98% NA NA
Nakagawa et al18 Japan CNN Invasion depth of SCC Images Accuracy: 91.0%; sensitivity: 90.1%; specificity: 95.8%; PPV: 99.2%; NPV: 63.9% Accuracy: 89.6%; sensitivity: 89.8%; specificity: 88.3%; PPV: 97.9%; NPV: 65.5% NA
Tokai et al19 Japan CNN Invasion depth of SCC Images Accuracy: 80.9%; sensitivity: 84.1%; specificity: 73.3% Accuracy: 73.5%; sensitivity: 78.8%; specificity: 61.7% NA
Guo et al20 China CAD ESCC Images/videos Images: accuracy: 98.9%; sensitivity: 98.04%; specificity: 95.03%. Videos: per-frame specificity: 99.9%; per-lesion sensitivity: 90.9% NA NA
Stomach Shichijo et al21 Japan CNN HP infection Images HP positive: accuracy: 80%; HP negative: accuracy: 84%; HP eradicated: accuracy: 48% HP positive: accuracy: 88.9%; HP negative: accuracy: 55.8%; HP eradicated: accuracy: 62.1% NA
Yasuda et al22 Japan CAD HP infection Images Accuracy: 87.6%; sensitivity: 90.4%; specificity: 85.7%; PPV: 80.9%; NPV: 93.1%. NA NA
Zheng et al23 China CNN HP infection Images Single image: accuracy: 84.5% sensitivity: 81.4%; specificity: 90.1%; multiple images: accuracy: 93.8%; sensitivity: 91.6%; specificity: 98.6% NA NA
Zhang et al24 China CNN Gastric polyp Images Small polyp: accuracy: 66.67%; Medium polyp: accuracy: 90.79%. Large polyp: accuracy: 85.71% NA NA
Yoon et al25 Korea CNN EGC and invasion depth Images AUROC: 0.851; sensitivity: 79.2%; specificity: 77.8%; PPV: 79.3%; NPV: 77.7% NA NA
Zhu et al26 China CNN Invasion depth Images Accuracy: 89.16%; sensitivity: 76.47%; specificity: 95.59%; PPV: 89.66%; NPV: 88.97% Accuracy: 71.49%; sensitivity: 87.80%; specificity: 63.31%; PPV: 55.86%; NPV: 91.01% NA
Small intestine Leenhardt et al27 USA CNN GIA CE images Sensitivity: 100%; specificity: 96%; PPV: 96%; NPV: 100% NA NA
Aoki et al28 Japan CNN Erosions and ulcerations CE images AUROC: 0.958; accuracy: 90.8%; sensitivity: 88.2%; specificity: 90.9% NA NA
Colorectum Chen et al29 China CNN Diminutive polyp Images Accuracy: 90.1%; sensitivity: 96.3%; specificity: 78.1%; PPV: 89.6%; NPV: 91.5% Expert1/2: accuracy: 90.5%/87.0%; sensitivity: 97.3%/97.9%; specificity:77.1%/65.6%; PPV: 89.3%/84.8%; NPV: 93.7%/94.0%. Novice 1/2/3/4: accuracy: 88.0%/84.2%/80.3%/85.6%; sensitivity: 97.3%/93.6%/81.9%/84.0%; specificity: 69.8%/65.6%/77.1%/88.5%; PPV: 86.3%/84.2%/87.5%/93.5%; NPV: 93.1%/84.0%/68.5%/73.9% NA
Gong et al30 China CAD Adenoma Video ADR: 16% ADR: 8% NA
Su et al31 China CNN Adenoma Video ADR: 28.9% ADR: 16.5% NA
Wang et al32 China CAD Adenoma Video ADR: 29.1% ADR: 20.3% NA
Wang et al33 China CAD Adenoma Video ADR: 34% ADR: 28% NA
Renner et al34 Germany CAOB Polyp Images Accuracy: 78.0%; sensitivity: 92.3%; specificity: 62.5%; PPV: 72.7%; NPV: 88.2% Expert 1/2: accuracy: 84.0%/77.0%; sensitivity: 92.3%/73.1%; specificity: 75.0%/81.3%; PPV: 80.0%/80.9%; NPV: 90.0%/73.6% NA
ADR indicates adenoma detection rate; AI, artificial intelligence; AUROC, area under receiver operating characteristic curves; BE, Barrett’s esophagus; BLI, blue laser imaging; CAD, computer-aided; CAOB, computer-assisted optical biopsy; CE, capsule endoscopy; CNN, convolutional neural network; DL, deep-learning; ESCC, early squamous cell carcinoma; GIA, gastrointestinal angiectasia; LCI, linked color imaging; ML, machine learning; NPV, negative predictive value; PDR, polyp detection rate; PPV, positive predictive value; SCC, squamous cell carcinoma; WLI, white light imaging.

ESOPHAGUS

The main advantages of AI in esophageal endoscopy are improved diagnostic accuracy for dysplasia in Barrett’s esophagus (BE) and for esophageal squamous cell carcinoma (SCC).35

BE

BE is the most significant risk factor for the development of early-stage esophageal adenocarcinoma (ESEA).36 The diagnosis of ESEA in BE relies on endoscopic screening and biopsy. However, early neoplasia in BE is usually flat and difficult to distinguish from the surrounding normal mucosa.37 Accurate diagnosis is therefore challenging, even for the most experienced endoscopists.38 Even when BE experts perform the endoscopy, the diagnostic sensitivity and specificity for high-grade dysplasia and esophageal adenocarcinoma (EA) are only 80% and 89%, respectively.39 When less experienced endoscopists perform this labor-intensive and time-intensive task, the sensitivity is significantly lower, at 64%.40 In 2012, the American Society for Gastrointestinal Endoscopy set the thresholds for the optical diagnosis of high-grade dysplasia and EA at a sensitivity of 90%, a specificity of 80%, and a negative predictive value (NPV) of 98%.41 However, these thresholds are rarely reached, even by BE experts. Therefore, CAD systems have been developed to help ordinary endoscopists obtain better diagnostic performance in clinical practice.
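
The sensitivity, specificity, NPV, and related figures quoted throughout this review are all derived from a 2×2 confusion matrix. The short Python helper below makes the definitions explicit; the counts in the example are arbitrary illustrative numbers, not data from any cited study.

```python
# How the performance figures quoted in this review are derived from a 2x2
# confusion matrix. The counts below are arbitrary illustrative numbers.
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),           # true-positive rate
        "specificity": tn / (tn + fp),           # true-negative rate
        "ppv": tp / (tp + fp),                   # positive predictive value
        "npv": tn / (tn + fn),                   # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

print(diagnostic_metrics(tp=90, fp=20, tn=80, fn=10))
```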

With the use of a simple endoscopic classification system built with ML, nonexpert endoscopists significantly improved their dysplasia detection performance (sensitivity, specificity, and accuracy) in BE.11 Moreover, a CAD system using DL also improved the detection and diagnosis of BE and ESEA.12 Assessment of still images from 2 databases [the Augsburg data and the Medical Image Computing and Computer-Assisted Intervention (MICCAI) dataset] revealed sensitivities/specificities of the CAD-DL system of 97%/88% and 92%/100%, respectively. Recently, de Groof et al13 developed a deep-learning system to detect neoplasia in patients with BE. Their system achieved strong performance (an accuracy of 88%, a sensitivity of 93%, and a specificity of 83%), which was much better than that of general endoscopists (a sensitivity of 72%, a specificity of 74%, and an accuracy of 73%).

Furthermore, a system based on the real-time use of AI has been developed for the evaluation of ESEA in BE.15,42 This equipment significantly improves the diagnostic performance of nonexpert endoscopists in real time; its sensitivity, specificity, and overall accuracy are 83.7%, 100%, and 89.9%, respectively. In addition, an AI system developed by Hashimoto et al14 showed satisfactory performance in the diagnosis of ESEA in BE, with a sensitivity of 96.4%, a specificity of 94.2%, and an accuracy of 95.4%.

Esophageal SCC

The incidence of EA is rapidly increasing worldwide.1 However, SCC is still the most common type of esophageal malignancy, accounting for ~80% of cases.1 It is often confirmed histologically only once it presents as an evident mass-like lesion. Because of inadequately experienced endoscopists, low-quality equipment, and other factors, the rate of missed SCC diagnoses remains unsatisfactory.43 Therefore, there is an urgent need for equipment that can help endoscopists achieve better diagnostic performance.

The findings of 2 studies showed that CAD systems based on DL techniques have satisfactory diagnostic performance, with sensitivities between 89.3% and 97.8% and specificities between 85.4% and 98%.16,17 After training with the CAD-DL system, the average diagnostic performance of inexperienced endoscopists improved (sensitivity: 74.2% vs. 89.2%; accuracy: 81.7% vs. 91.1%; NPV: 79.3% vs. 90.4%).16

Among SCC characteristics, tumor invasion depth is the most important risk factor, believed to be closely associated with the risk of metastasis and the curability of endoscopic resection.44 A CAD-DL system based on the Single Shot MultiBox Detector architecture successfully classified superficial SCC into pathologic mucosal, submucosal microinvasive (SM1), and deep submucosal invasive (SM2/3) cancer with a sensitivity of 90.1%, a specificity of 95.8%, a positive predictive value (PPV) of 99.2%, an NPV of 63.9%, and an overall accuracy of 91.0%.18 In contrast, the diagnoses made by 16 experienced endoscopists were relatively poorer (sensitivity: 89.8%, specificity: 88.3%, PPV: 97.9%, NPV: 65.5%, and accuracy: 89.6%). Similar findings were reported in a second study, in which the accuracy of the CAD-DL system exceeded that of 12 of 13 experienced endoscopists.19 Moreover, the area under the receiver operating characteristic curve (AUROC) of the CAD-DL system was better than that of all the endoscopists.

Moreover, Guo et al20 developed a CAD-DL system that performs real-time automated diagnosis of precancerous lesions and early SCC, with high sensitivity and specificity for both image and video databases. Based on image databases, the sensitivity and specificity are 98.04% and 95.03%, respectively. For video databases of precancerous lesions or early SCC, the system achieved a per-frame sensitivity of 60.8% and a per-lesion sensitivity of 100% in 27 nonmagnifying videos, and a per-frame sensitivity of 96.1% and a per-lesion sensitivity of 100% in 20 magnifying videos. In 33 unaltered full-range videos of normal esophagus, the per-frame specificity was 99.9% and the per-case specificity was 90.9%.

STOMACH

Gastric cancer (GC) is the fifth most frequently diagnosed malignancy worldwide, with ~1 million new cases and ~800,000 deaths in 2018 alone.1 Common risk factors for GC include H. pylori infection, obesity, consumption of food preserved with high levels of salt, low fruit intake, and use of alcohol and tobacco.1 GC is most frequently diagnosed using gastroscopy, which offers invaluable assistance in the management of GC through (1) prediction of H. pylori infection, (2) detection of precancerous lesions, (3) diagnosis of early gastric cancer (EGC), (4) assessment of tumor metastasis and invasion depth, and (5) guidance of endoscopic submucosal dissection (ESD). Recently, with advances in medical technology, AI systems have been incorporated into endoscopy and play a vital role in helping endoscopists achieve better diagnostic performance.

H. pylori Infection

Among the risk factors for GC, H. pylori infection is the most important, responsible for ~90% of new cases of noncardia GC.45 In addition, research evidence shows that treating H. pylori infection decreases the incidence of GC.46,47 Therefore, H. pylori infection is categorized as a definite carcinogen by various international agencies, and H. pylori eradication therapy has become more prevalent in recent years.48 The typical white-light endoscopic features of different gastric mucosal statuses are as follows: (1) H. pylori-positive gastric mucosa: atrophy, diffuse redness, mucosal swelling, enlarged folds, and nodularity; (2) H. pylori-negative gastric mucosa: a regular arrangement of collecting venules and fundic gland polyps; (3) H. pylori-eradicated gastric mucosa: map-like redness.49 However, diagnosis based on these representative endoscopic features is subjective and inaccurate.50 The false-positive and false-negative rates remain high, even for expert endoscopists. Better diagnostic accuracy has recently been achieved by incorporating AI technology into endoscopy.

Shichijo et al21 constructed a CNN to classify gastric mucosal status. On the basis of nonmagnified endoscopic images, the CNN system achieved diagnostic results comparable to those of expert endoscopists (diagnostic accuracy for H. pylori-positive cases: 80% vs. 88.9%; H. pylori-negative cases: 84% vs. 55.8%; H. pylori-eradicated cases: 48% vs. 62.1%). The accuracy for H. pylori-eradicated cases is low because the gastric mucosa retains a gastritis-like appearance after H. pylori eradication therapy.

Yasuda et al22 developed an AI system using the linked color imaging technique to ascertain H. pylori infection status. The sensitivity, specificity, accuracy, PPV, and NPV of this system are 90.4%, 85.7%, 87.6%, 80.9%, and 93.1%, respectively. Its diagnostic performance is significantly higher than that of inadequately experienced endoscopists, whereas there is no significant difference between this system and experienced endoscopists.

Moreover, Zheng et al23 developed a CAD support system incorporating a CNN model based on endoscopic images for the diagnosis of H. pylori infection. The novelty of this system is its use of multiple images, rather than a conventional single image, to provide the diagnostic result, which significantly increases accuracy. With a single image, the sensitivity, specificity, and accuracy of the CAD system are 81.4%, 90.1%, and 84.5%, respectively; with multiple images, they are 91.6%, 98.6%, and 93.8%, respectively. Moreover, the multiple-image CAD system has significantly higher sensitivity and specificity than other direct H. pylori testing methods (sensitivity: histology 88% to 92%, breath test 96%, stool antigen 94%; specificity: histology 89% to 98%, breath test 93%, stool antigen 97%). Therefore, this system has the potential to be a readily available, efficient, and noninvasive tool for the diagnosis of H. pylori infection, which could promote its use in clinical practice.
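
The gain from using multiple images can be understood as simple probability aggregation: individual frames give noisy predictions, and combining several frames per patient stabilizes the estimate. The sketch below illustrates the idea with a plain average of hypothetical per-image probabilities; it is not the aggregation rule actually used by Zheng et al.

```python
# Illustrative sketch of why combining several images per patient can raise
# accuracy: per-image probabilities are noisy, and averaging them gives a
# steadier per-patient estimate. The probabilities below are made up.
from statistics import mean

def patient_level_prediction(per_image_probs: list[float], threshold: float = 0.5) -> bool:
    """Return True if the patient is predicted H. pylori-positive."""
    return mean(per_image_probs) >= threshold

single_image = patient_level_prediction([0.48])                    # one borderline frame misses
multi_image = patient_level_prediction([0.48, 0.90, 0.85, 0.70])   # several frames flip the call
print(single_image, multi_image)
```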

Gastric Polyps

Zhang et al24 designed a CNN system based on the Single Shot MultiBox Detector (SSD) for gastric polyp detection. The novel SSD for Gastric Polyps network (SSD-GPNet) allows real-time detection of gastric polyps. Compared with the conventional SSD, the SSD-GPNet has a slight time-performance disadvantage (8.05 vs. 6.48 s) but better diagnostic precision (small polyps: 66.67% vs. 54.55%; medium polyps: 90.79% vs. 80.26%; large polyps: 85.71% vs. 85.71%). Thus, it has considerable potential to help endoscopists avoid misdiagnosis.

GC

EGC, defined as a tumor whose invasion depth is confined to the mucosa or submucosa, accounts for ~20% of all gastric cancer cases and has a better prognosis than advanced gastric cancer.51 Owing to advances in endoscopic therapeutic techniques, most EGC cases are treated with ESD, which is minimally invasive and requires only a short hospital stay; it is therefore well accepted by endoscopists and patients.52 However, according to guidelines, ESD is recommended only for EGC patients whose tumor invasion depth is restricted to the mucosa (M) or the superficial portion of the submucosa (SM1).53 Therefore, precise determination of tumor invasion depth is critical, because it is closely associated with the therapeutic strategy and cancer-related prognosis. Currently, endoscopic ultrasonography (EUS) is the preferred method for assessing invasion depth and has moderate diagnostic value for EGC.54 However, the method has 2 limitations: first, because it uses a microprobe, it has difficulty providing high-quality images of some parts of the stomach; second, it is highly operator-dependent. Therefore, the diagnostic performance of EUS is not always satisfactory, and in some cases it is inferior to conventional endoscopy performed by experienced endoscopists.55 Some macroscopic features, including remarkable redness, stiffness of the gastric mucosa, disappearance of the mucosal layer, and abrupt cutting of converging folds, are closely related to submucosal or deeper invasion.56 However, diagnosis by conventional endoscopy is subjective, highly experience-dependent, and therefore unstable. Hence, it is crucial to develop appropriate equipment or techniques to increase the diagnostic accuracy of determining tumor invasion depth.

Yoon et al25 developed a CNN based on the visual geometry group (VGG)-16 architecture for diagnosing the tumor invasion depth of EGC. The diagnostic performance of this system includes a sensitivity of 79.2%, a specificity of 77.8%, a PPV of 79.3%, an NPV of 77.7%, and an AUROC of 0.851. This moderate performance can be considered satisfactory given the difficulty of identifying the tumor invasion depth of EGC, and the diagnostic accuracy is better than that previously reported for EUS.57
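
For orientation, the snippet below sketches the general transfer-learning pattern of adapting a VGG-16 backbone to a 2-class endoscopic task such as invasion-depth prediction, using torchvision's generic VGG-16. The head size, class labels, and the omission of pretraining and preprocessing details are assumptions for illustration; this is not the training pipeline of Yoon et al.

```python
# Minimal sketch of adapting a VGG-16 backbone to a binary endoscopic
# classification task (e.g., mucosal vs. deeper invasion). Not the model or
# training pipeline of any cited study.
import torch
import torch.nn as nn
from torchvision import models

model = models.vgg16()                      # optionally load ImageNet weights here
model.classifier[6] = nn.Linear(4096, 2)    # replace the 1000-class head with 2 classes

frame = torch.randn(1, 3, 224, 224)         # one preprocessed gastroscopy frame
logits = model(frame)
print(torch.softmax(logits, dim=1))         # [[p_mucosal, p_submucosal_or_deeper]]
```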

Moreover, Zhu et al26 developed a CNN-CAD system for determining the tumor invasion depth of GC. The diagnostic performance of the CNN-CAD system versus 17 endoscopists was as follows: sensitivity (76.47% vs. 87.80%), specificity (95.56% vs. 63.31%), PPV (89.66% vs. 55.86%), NPV (88.97% vs. 91.01%), and accuracy (89.16% vs. 71.49%). The specificity and accuracy of the CNN-CAD system were significantly higher than those of the endoscopists, regardless of their level of experience.

SMALL INTESTINE (SI)

The detection of abnormalities of the SI, including erosions, ulcerations, angiodysplasias, erythema, edema, and changes of the villi, is difficult because of the anatomic characteristics of the small bowel and the limitations of diagnostic equipment. Recently, however, the development of capsule endoscopy (CE), a diagnostic, monitoring, and management tool, has provided a solution to this problem.58 Moreover, European guidelines recommend the use of CE for suspected Crohn’s disease with negative colonoscopy findings, suspected SI tumors, and inherited polyposis syndromes.59 However, CE still has some drawbacks. First, and most importantly, pathologic changes may appear in only a few frames, sometimes even a single frame. Second, a full CE video lasts 8 to 10 hours, so the endoscopist needs 1 to 2 hours to review it and report the result; the approach is therefore time-consuming and labor-intensive, and abnormalities are easily missed because of oversight or inexperience. Recently, however, AI has been integrated into CE to improve diagnostic efficiency and accuracy and to significantly reduce the burden on the endoscopist.
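
The time savings that AI brings to CE reading come from frame-level triage: every frame is scored automatically and only suspicious frames are surfaced for human review. The sketch below illustrates this workflow with a stand-in scoring function; the scores and threshold are hypothetical.

```python
# Illustrative sketch of frame-level AI triage for capsule-endoscopy reading:
# score every frame and surface only the suspicious ones. `score_frame` stands
# in for any trained per-frame classifier; the numbers are made up.
def triage_video(frames, score_frame, threshold: float = 0.8):
    """Return (index, score) pairs for frames the reader should review."""
    flagged = []
    for i, frame in enumerate(frames):
        p = score_frame(frame)          # probability of an abnormality
        if p >= threshold:
            flagged.append((i, p))
    return flagged

# Toy usage: pretend scores for a 10-frame clip.
fake_scores = [0.05, 0.1, 0.92, 0.3, 0.85, 0.2, 0.1, 0.05, 0.95, 0.4]
print(triage_video(range(10), lambda i: fake_scores[i]))
```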

Gastrointestinal Angiectasia (GIA)

GIA, defined as a bright-red, flat lesion consisting of tortuous, clustered capillary dilatations, is the most common small-bowel vascular abnormality60 and is closely related to the occurrence of GI hemorrhage.61 To increase labor efficiency and the diagnostic accuracy of GIA detection, Leenhardt et al27 developed a CAD system that shows satisfactory diagnostic performance, with a sensitivity of 100%, a specificity of 96%, a PPV of 96%, and an NPV of 100%. In addition, with the CAD system, the time an endoscopist needs to review an entire CE video is significantly reduced from more than 1 hour to an average of 39 minutes.

Erosions and Ulcerations

Mucosal breaks, the most common finding in small-bowel CE videos, may indicate overuse of nonsteroidal anti-inflammatory drugs (NSAIDs), Crohn’s disease, or malignancy.62 They are, however, difficult to detect on CE because of the small difference in color between the mucosal breaks and the surrounding normal mucosa.63 Aoki et al28 designed a deep neural network architecture based on the Single Shot MultiBox Detector, a CNN consisting of 16 or more layers, for the diagnosis of mucosal breaks. The diagnostic performance of this system comprises an AUROC of 0.958, a sensitivity of 88.2%, a specificity of 90.9%, and an accuracy of 90.8%.
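
Both SSD-GPNet and the detector of Aoki et al are object-detection networks that return bounding boxes rather than a single per-image label. The sketch below shows the same general idea using torchvision's generic ssd300_vgg16 with randomly initialized weights; the 2-class setup (background plus a single "lesion" class) and the 0.5 confidence cutoff are illustrative assumptions, not parameters from those studies.

```python
# Sketch of using an off-the-shelf single-shot detector to localize lesions
# with bounding boxes. This uses torchvision's generic ssd300_vgg16, not the
# networks from the cited studies.
import torch
from torchvision.models.detection import ssd300_vgg16

# 2 classes: background + "lesion" (e.g., polyp, erosion, ulceration).
detector = ssd300_vgg16(num_classes=2)
detector.eval()

frame = torch.rand(3, 300, 300)             # one endoscopic frame
with torch.no_grad():
    output = detector([frame])[0]           # dict with boxes, labels, scores

# Keep only confident detections for the endoscopist to review.
keep = output["scores"] > 0.5
print(output["boxes"][keep], output["scores"][keep])
```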

COLORECTUM

Colorectal cancer is the third most common malignancy worldwide, accounting for ~1,800,000 new cases and 881,000 deaths (about 1 in 10 cancer cases and deaths) in 2018 alone.1 Colonoscopy is effective and essential in the early diagnosis and prevention of colorectal cancer through the detection and removal of neoplastic lesions.64 However, it is far from perfect and has several limitations. First, it has a relatively high miss rate for precancerous lesions.65 Second, some neoplastic lesions are challenging to detect, even for expert endoscopists.66 Lastly, the endoscopist’s task is time-intensive and labor-intensive; because it is mentally demanding, endoscopists are prone to lapses in attention, leading to missed diagnoses. At the same time, the diagnostic performance of colonoscopy depends heavily on the experience of the endoscopist, which varies among individuals, so the diagnostic accuracy of colonoscopy is unstable.

Recently, CAD systems based on AI technology have shown potential to boost efficiency and accuracy and to serve as an almost instantaneous supportive tool for endoscopists. The American Society for Gastrointestinal Endoscopy published the Preservation and Incorporation of Valuable Endoscopic Innovations (PIVI) thresholds in 2015.67 The threshold for a diagnose-and-leave strategy for small colorectal polyps is an NPV of ≥90%, whereas the threshold for a resect-and-discard strategy is >90% agreement with histopathology for postpolypectomy surveillance intervals. Some CAD systems already meet or exceed these standards. Moreover, this novel technology has been applied not only to colorectal polyp detection but also to assist in the diagnosis of other colorectal diseases (early-stage colorectal cancer, ulcerative colitis, invasion depth of cancer, and iron deficiency anemia).

Computer-Aided Detection for Colorectal Polyps

The detection and diagnosis of colorectal polyps is the field in which AI technology has been applied most extensively to assist endoscopists. Wang et al68 developed and validated a deep-learning algorithm for the detection of polyps during colonoscopy. In a prospective study involving 1100 patients, the diagnostic performance of colonoscopy with and without CAD assistance was compared.69 Notably, colonoscopy without CAD assistance detected 248 polyps, whereas colonoscopy with CAD assistance detected 486 polyps and, on careful evaluation, did not miss any polyp. These findings indicate that the CAD system significantly increases diagnostic accuracy regardless of the quality of bowel preparation.

Narrow-band imaging is an image-enhancement technique that is popular among endoscopists for visualizing microstructures and microvascular abnormalities of the mucosal epithelium.70 However, the optical diagnosis of hyperplastic versus adenomatous polyps based on narrow-band imaging depends heavily on the experience of the endoscopist, and the results are often unsatisfactory.71,72 Chen et al29 designed a CNN-CAD system to overcome this challenge. The system has a sensitivity of 96.3%, a specificity of 78.1%, a PPV of 89.6%, and an NPV of 91.5%. Moreover, the diagnostic time of the CNN-CAD system is significantly shorter than that of endoscopists (0.45±0.07 s vs. 1.54±1.30 s for experts and 1.77±1.37 s for nonexperts). In addition, the system has perfect intraobserver agreement (κ score of 1 vs. 0.67 for experts and 0.48 to 0.77 for nonexperts).

Computer-Aided Diagnosis of Colorectal Polyps

Renner et al34 developed a computer-assisted optical biopsy (CAOB) approach based on a CNN to distinguish neoplastic from non-neoplastic polyps, with the pathologic diagnosis as the gold standard. The system was trained on unmagnified white-light and narrow-band imaging endoscopic images. The CAOB system achieved a sensitivity of 92.3%, a specificity of 62.5%, and an NPV of 88.2%, whereas the 2 expert endoscopists achieved overall accuracies of 84.0% and 77.0%, respectively; the diagnostic performance of the CAOB approach was thus comparable to that of the human experts, with no statistically significant difference. However, a discrepancy often occurs between the high-confidence optical diagnosis of expert endoscopists and the subsequent pathologic diagnosis, especially for diminutive adenomas.73 Shahidi et al74 developed a real-time clinical decision support solution (CDSS) based on AI technology to evaluate discrepancies between endoscopic and pathologic diagnoses of diminutive lesions (diameter <3 mm). In a study involving 644 lesions, only 458 (71.1%) had concordant pathologic diagnoses, whereas 186 (28.9%) had differing results. Among the discordant cases, the CDSS and the endoscopists reached the same diagnosis for 168 (90.3%) lesions. This suggests that pathology may not always serve as the gold standard in diagnosing diminutive colorectal lesions.

Application in the Field of Adenoma Detection

The adenoma detection rate (ADR) is inversely associated with the risk of developing colorectal cancer (CRC): each 1.0% increase in ADR is associated with a 3.0% decrease in the risk of interval CRC.75 Although many measures have been taken to improve ADR, including new colonoscopy equipment, minimum withdrawal times, retroflexion, and split-dose bowel preparation, some diminutive adenomas are still missed. Several hypotheses may explain this phenomenon: lack of experience and training, differences in visual tracking patterns, and distraction caused by fatigue or emotional factors.76 Several studies have suggested that AI systems can effectively help human endoscopists overcome the disadvantage of unstable performance, thus improving the ADR. Gong et al30 found that the ADR in their CAD system group (16%) was better than that in the control group (8%), and similar findings have been published in other articles.31–33 Most importantly, these studies are all randomized controlled trials, and the real-time application of the AI systems makes them valuable in clinical practice.
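
As a concrete illustration of the ADR figures above, the short calculation below applies the definition of ADR and then extrapolates the 1% to 3% association reported by Corley et al75 linearly to hypothetical trial-arm numbers; the counts and the linear extrapolation are for illustration only.

```python
# Worked example of the adenoma detection rate (ADR). ADR is the fraction of
# screening colonoscopies in which at least one adenoma is found; the 1% -> 3%
# relationship is applied linearly to hypothetical numbers for illustration.
def adr(colonoscopies_with_adenoma: int, total_colonoscopies: int) -> float:
    return colonoscopies_with_adenoma / total_colonoscopies

baseline = adr(80, 1000)                           # 8.0%, e.g., an unassisted control arm
assisted = adr(160, 1000)                          # 16.0%, e.g., a CAD-assisted arm
adr_gain_points = (assisted - baseline) * 100      # +8 percentage points
interval_crc_risk_change = -3.0 * adr_gain_points  # ~ -24% relative risk (illustrative)
print(baseline, assisted, interval_crc_risk_change)
```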

LIMITATIONS AND PERSPECTIVE OF AI TECHNOLOGY APPLIED IN THE FIELD OF ENDOSCOPY

Recently, several commercially available AI systems have been introduced, including GI Genius (Medtronic), CAD EYE (Fujifilm), DISCOVERY (Hoya), and EndoBRAIN (Cybernet and Olympus), suggesting that routine clinical application is not far off. Although the integration of AI technology into endoscopy has shown its potential to help endoscopists achieve high diagnostic performance, several limitations remain.

First, most of the images and videos extracted from databases to train AI systems are of high quality. These ideal images introduce selection bias: AI systems trained on them are often unable to recognize lesions in low-quality images or videos, performing excellently on the training set but poorly in clinical practice, a problem described as overfitting. Therefore, further studies should be conducted to address this shortcoming and to adapt to unsatisfactory bowel preparation, including residue and bubbles in the colon, which are common in real-world practice.
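
One practical way to narrow the gap between curated training images and messy clinical frames is aggressive data augmentation, so the model also sees blur, brightness shifts, and rotations during training. The sketch below uses standard torchvision transforms; the specific transform values are arbitrary examples, not a validated protocol.

```python
# Augment curated, high-quality training images so the model also sees
# degradation that mimics routine clinical frames. Transform values are
# arbitrary examples only.
import torch
from torchvision import transforms

degrade = transforms.Compose([
    transforms.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.3),
    transforms.GaussianBlur(kernel_size=5),     # simulate defocus/motion blur
    transforms.RandomRotation(degrees=15),      # simulate scope orientation
])

clean_frame = torch.rand(3, 224, 224)           # a high-quality training image
noisy_frame = degrade(clean_frame)              # what the model trains on
print(noisy_frame.shape)
```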

Second, most studies of AI technology applied in endoscopy are retrospective, which introduces selection bias. Therefore, studies with different designs, such as randomized controlled trials and single-arm studies, should be conducted.

Third, small sample sizes lead to class imbalance and poor diagnostic accuracy. For instance, some images needed for AI training are difficult to obtain, such as subtle flat colonic lesions or uncommon morphologic types, which are underrepresented in endoscopy image databases. Substantial learning material is vital to increase the diagnostic accuracy of an AI system: it should be trained with a balanced proportion of neoplastic and non-neoplastic, polyp and nonpolyp, and high-quality and low-quality images. However, this big-data problem for AI training remains a challenge and an obstacle to the widespread adoption of this novel technology.
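
Until balanced datasets are available, class imbalance is commonly mitigated at training time by weighting the loss so that the rare class contributes more per example. The sketch below shows one standard way to do this in PyTorch; the class counts are hypothetical.

```python
# Mitigate class imbalance by weighting the loss so the rare class (e.g.,
# subtle flat lesions) contributes more per example. Counts are hypothetical.
import torch
import torch.nn as nn

n_normal, n_lesion = 9000, 500                              # imbalanced training set
class_weights = torch.tensor([1.0, n_normal / n_lesion])    # up-weight the lesion class

criterion = nn.CrossEntropyLoss(weight=class_weights)

logits = torch.randn(4, 2)                                  # model outputs for 4 frames
labels = torch.tensor([0, 0, 1, 0])                         # 1 = lesion
print(criterion(logits, labels))
```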

Finally, additional information, such as sex, age, family history, and laboratory test results, is a crucial resource for clinicians in making an accurate diagnosis, yet most current AI algorithms focus on images only. Some studies have begun to address this issue: Hornbrook et al77 and Hilsden et al78 developed ML algorithms based on patients’ basic information and laboratory test results to predict colorectal cancer early. If such ML algorithms were combined with endoscopic images, the diagnostic performance might be significantly higher.
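
One plausible way to combine such clinical data with endoscopic images is late fusion: concatenate an image embedding from any endoscopy CNN with a small vector of clinical features before the final classifier. The sketch below is a hypothetical architecture illustrating this idea, not the models of Hornbrook et al77 or Hilsden et al78.

```python
# Hypothetical late-fusion sketch: join an image embedding with simple
# clinical features (age, sex, blood counts) before the final classifier.
import torch
import torch.nn as nn

class ImagePlusClinical(nn.Module):
    def __init__(self, image_dim: int = 64, clinical_dim: int = 4):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(image_dim + clinical_dim, 32), nn.ReLU(), nn.Linear(32, 2)
        )

    def forward(self, image_embedding, clinical_features):
        fused = torch.cat([image_embedding, clinical_features], dim=1)
        return self.head(fused)

model = ImagePlusClinical()
img_emb = torch.randn(1, 64)                         # from any image backbone
clinical = torch.tensor([[67.0, 1.0, 11.2, 210.0]])  # e.g., age, sex, Hb, platelets
print(model(img_emb, clinical))
```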

With further advances in AI technology, an ideal AI system could be developed to overcome these limitations. It would precisely distinguish different lesions, including rare lesions, from the surrounding normal mucosa; assist endoscopists in real time during endoscopy with almost undetectable latency; and provide the type, location, size, depth, and other relevant information of lesions.

CONCLUSION

The integration of AI into endoscopy has dramatically improved the diagnostic performance for GI diseases. AI can provide immediate assistance to inadequately experienced endoscopists; it also compensates for unsteady performance, alleviates fatigue, and increases the working efficiency of human endoscopists, thereby significantly reducing the rate of missed diagnosis of subtle lesions. Although AI systems still have some limitations, robust clinical trials, increasing industry involvement, and governmental incentives will open the door to their continuous evolution, providing support for lesion detection, technical quality assessment, and therapeutic decision-making.

ACKNOWLEDGMENTS

The authors thank Dr Peng Jiang and Dr Hai-Feng Tang for their critical reading and informative advice during the study process. Meanwhile, the authors thank Freescience for language polishing.

REFERENCES

1. Bray F, Ferlay J, Soerjomataram I, et al. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin. 2018;68:394–424.
2. Lambert R. Prevention of gastrointestinal cancer by surveillance endoscopy. EPMA J. 2010;1:473–483.
3. Akarsu M, Akarsu C. Evaluation of new technologies in gastrointestinal endoscopy. JSLS. 2018;22:e2017.00053.
4. DeWitt J, Van Dam J. Development of endoscopy-gastroenterology diamond jubilee review. Gastroenterology. 2018;155:237–240.
5. Arnold M, Sierra MS, Laversanne M, et al. Global patterns and trends in colorectal cancer incidence and mortality. Gut. 2017;66:683–691.
6. Minsky ML, Rochester N, Shannon CE. A proposal for the Dartmouth Summer Research Project on artificial intelligence. AI Magazine. 2006;27:12–14.
7. Suzuki K. Overview of deep learning in medical imaging. Radiol Phys Technol. 2017;10:257–273.
8. LeCun Y, Bengio Y, Hinton G. Deep learning. Nature. 2015;521:436–444.
9. Liedlgruber M, Uhl A. Computer-aided decision support systems for endoscopy in the gastrointestinal tract: a review. IEEE Rev Biomed Eng. 2011;4:73–88.
10. Shinozaki S, Osawa H, Hayashi Y, et al. Linked color imaging for the detection of early gastrointestinal neoplasms. Therap Adv Gastroenterol. 2019;12:1756284819885246.
11. Sehgal V, Rosenfeld A, Graham DG, et al. Machine learning creates a simple endoscopic classification system that improves dysplasia detection in Barrett’s Oesophagus amongst non-expert endoscopists. Gastroenterol Res Pract. 2018;2018:1872437.
12. Ebigbo A, Mendel R, Probst A, et al. Computer-aided diagnosis using deep learning in the evaluation of early oesophageal adenocarcinoma. Gut. 2019;68:1143–1145.
13. de Groof AJ, Struyvenberg MR, van der Putten J, et al. Deep-learning system detects neoplasia in patients with Barrett’s esophagus with higher accuracy than endoscopists in a multistep training and validation study with benchmarking. Gastroenterology. 2020;158:915–929.
14. Hashimoto R, Requa J, Dao T, et al. Artificial intelligence using convolutional neural networks for real-time detection of early esophageal neoplasia in Barrett’s esophagus (with video). Gastrointest Endosc. 2020;91:1264–1271.
15. Ebigbo A, Mendel R, Probst A, et al. Real-time use of artificial intelligence in the evaluation of cancer in Barrett’s oesophagus. Gut. 2020;69:615–616.
16. Cai S-L, Li B, Tan W-M, et al. Using a deep learning system in endoscopy for screening of early esophageal squamous cell carcinoma (with video). Gastrointest Endosc. 2019;90:745–753.
17. Everson M, Herrera L, Li W, et al. Artificial intelligence for the real-time classification of intrapapillary capillary loop patterns in the endoscopic diagnosis of early oesophageal squamous cell carcinoma: a proof-of-concept study. United European Gastroenterol J. 2019;7:297–306.
18. Nakagawa K, Ishihara R, Aoyama K, et al. Classification for invasion depth of esophageal squamous cell carcinoma using a deep neural network compared with experienced endoscopists. Gastrointest Endosc. 2019;90:407–414.
19. Tokai Y, Yoshio T, Aoyama K, et al. Application of artificial intelligence using convolutional neural networks in determining the invasion depth of esophageal squamous cell carcinoma. Esophagus. 2020;17:250–256.
20. Guo L, Xiao X, Wu C, et al. Real-time automated diagnosis of precancerous lesions and early esophageal squamous cell carcinoma using a deep learning model (with videos). Gastrointest Endosc. 2020;91:41–51.
21. Shichijo S, Endo Y, Aoyama K, et al. Application of convolutional neural networks for evaluating Helicobacter pylori infection status on the basis of endoscopic images. Scand J Gastroenterol. 2019;54:158–163.
22. Yasuda T, Hiroyasu T, Hiwa S, et al. Potential of automatic diagnosis system with linked color imaging for diagnosis of Helicobacter pylori infection. Dig Endosc. 2020;32:373–381.
23. Zheng W, Zhang X, Kim JJ, et al. High accuracy of convolutional neural network for evaluation of Helicobacter pylori infection based on endoscopic images: preliminary experience. Clin Transl Gastroenterol. 2019;10:e00109.
24. Zhang X, Chen F, Yu T, et al. Real-time gastric polyp detection using convolutional neural networks. PLoS One. 2019;14:e0214133.
25. Yoon HJ, Kim S, Kim J-H, et al. A lesion-based convolutional neural network improves endoscopic detection and depth prediction of early gastric cancer. J Clin Med. 2019;8:1310.
26. Zhu Y, Wang Q-C, Xu M-D, et al. Application of convolutional neural network in the diagnosis of the invasion depth of gastric cancer based on conventional endoscopy. Gastrointest Endosc. 2019;89:806–815.
27. Leenhardt R, Vasseur P, Li C, et al. A neural network algorithm for detection of GI angiectasia during small-bowel capsule endoscopy. Gastrointest Endosc. 2019;89:189–194.
28. Aoki T, Yamada A, Aoyama K, et al. Automatic detection of erosions and ulcerations in wireless capsule endoscopy images based on a deep convolutional neural network. Gastrointest Endosc. 2019;89:357–363.
29. Chen P-J, Lin M-C, Lai M-J, et al. Accurate classification of diminutive colorectal polyps using computer-aided analysis. Gastroenterology. 2018;154:568–575.
30. Gong D, Wu L, Zhang J, et al. Detection of colorectal adenomas with a real-time computer-aided system (ENDOANGEL): a randomised controlled study. Lancet Gastroenterol Hepatol. 2020;5:352–361.
31. Su J-R, Li Z, Shao X-J, et al. Impact of a real-time automatic quality control system on colorectal polyp and adenoma detection: a prospective randomized controlled study (with videos). Gastrointest Endosc. 2020;91:415–424.
32. Wang P, Berzin TM, Glissen Brown JR, et al. Real-time automatic detection system increases colonoscopic polyp and adenoma detection rates: a prospective randomised controlled study. Gut. 2019;68:1813–1819.
33. Wang P, Liu X, Berzin TM, et al. Effect of a deep-learning computer-aided detection system on adenoma detection during colonoscopy (CADe-DB trial): a double-blind randomised study. Lancet Gastroenterol Hepatol. 2020;5:343–351.
34. Renner J, Phlipsen H, Haller B, et al. Optical classification of neoplastic colorectal polyps - a computer-assisted approach (the COACH study). Scand J Gastroenterol. 2018;53:1100–1106.
35. Thakkar SJ, Kochhar GS. Artificial intelligence for real-time detection of early esophageal cancer: another set of eyes to better visualize. Gastrointest Endosc. 2020;91:52–54.
36. Weismüller J, Thieme R, Hoffmeister A, et al. Barrett-Screening: rational, current concepts and perspectives. Z Gastroenterol. 2019;57:317–326.
37. Eluri S, Shaheen NJ. Barrett’s esophagus: diagnosis and management. Gastrointest Endosc. 2017;85:889–903.
38. Davis-Yadley AH, Neill KG, Malafa MP, et al. Advances in the endoscopic diagnosis of barrett esophagus. Cancer Control. 2016;23:67–77.
39. Sharma P, Bergman JJGHM, Goda K, et al. Development and validation of a classification system to identify high-grade dysplasia and esophageal adenocarcinoma in Barrett’s esophagus using narrow-band imaging. Gastroenterology. 2016;150:591–598.
40. Curvers WL, van Vilsteren FG, Baak LC, et al. Endoscopic trimodal imaging versus standard video endoscopy for detection of early Barrett’s neoplasia: a multicenter, randomized, crossover study in general practice. Gastrointest Endosc. 2011;73:195–203.
41. Sharma P, Savides TJ, Canto MI, et al. The American Society for Gastrointestinal Endoscopy PIVI (Preservation and Incorporation of Valuable Endoscopic Innovations) on imaging in Barrett’s esophagus. Gastrointest Endosc. 2012;76:252–254.
42. Hashimoto R, Requa J, Tyler D, et al. Artificial intelligence using convolutional neural networks for real-time detection of early esophageal neoplasia in Barrett’s esophagus (with video). Gastrointest Endosc. 2020;91:1264–1274.
43. Rodríguez de Santiago E, Hernanz N, Marcos-Prieto HM, et al. Rate of missed oesophageal cancer at routine endoscopy and survival outcomes: a multicentric cohort study. United European Gastroenterol J. 2019;7:189–198.
44. Chang AC, Ji H, Birkmeyer NJ, et al. Outcomes after transhiatal and transthoracic esophagectomy for cancer. Ann Thorac Surg. 2008;85:424–429.
45. Plummer M, Franceschi S, Vignat J, et al. Global burden of gastric cancer attributable to Helicobacter pylori. Int J Cancer. 2015;136:487–490.
46. Ford AC, Forman D, Hunt RH, et al. Helicobacter pylori eradication therapy to prevent gastric cancer in healthy asymptomatic infected individuals: systematic review and meta-analysis of randomised controlled trials. BMJ. 2014;348:g3174.
47. Choi IJ, Kook M-C, Kim Y-I, et al. Helicobacter pylori therapy for the prevention of metachronous gastric cancer. N Engl J Med. 2018;378:1085–1095.
48. Suzuki H, Mori H. World trends for H. pylori eradication therapy and gastric cancer prevention strategy by H. pylori test-and-treat. J Gastroenterol. 2018;53:354–361.
49. Kato M, Terao S, Adachi K, et al. Changes in endoscopic findings of gastritis after cure of H. pylori infection: multicenter prospective trial. Dig Endosc. 2013;25:264–273.
50. Redéen S, Petersson F, Jönsson KA, et al. Relationship of gastroscopic features to histological findings in gastritis and Helicobacter pylori infection in a general population sample. Endoscopy. 2003;35:946–950.
51. Everett SM, Axon AT. Early gastric cancer in Europe. Gut. 1997;41:142–150.
52. Chung I-K, Lee JH, Lee S-H, et al. Therapeutic outcomes in 1000 cases of endoscopic submucosal dissection for early gastric neoplasms: Korean ESD Study Group multicenter study. Gastrointest Endosc. 2009;69:1228–1235.
53. Ono H, Yao K, Fujishiro M, et al. Guidelines for endoscopic submucosal dissection and endoscopic mucosal resection for early gastric cancer. Dig Endosc. 2016;28:3–15.
54. Shi D, Xi X-X. Factors affecting the accuracy of endoscopic ultrasonography in the diagnosis of early gastric cancer invasion depth: a meta-analysis. Gastroenterol Res Pract. 2019;2019:8241381.
55. Choi J, Kim SG, Im JP, et al. Comparison of endoscopic ultrasonography and conventional endoscopy for prediction of depth of tumor invasion in early gastric cancer. Endoscopy. 2010;42:705–713.
56. Cheng J, Wu X, Yang A, et al. Model to identify early-stage gastric cancers with deep invasion of submucosa based on endoscopy and endoscopic ultrasonography findings. Surg Endosc. 2018;32:855–863.
57. Kim J, Kim SG, Chung H, et al. Clinical efficacy of endoscopic ultrasonography for decision of treatment strategy of gastric cancer. Surg Endosc. 2018;32:3789–3797.
58. Aktas H, Mensink PB. Small bowel diagnostics: current place of small bowel endoscopy. Best Pract Res Clin Gastroenterol. 2012;26:209–220.
59. Rondonotti E, Spada C, Adler S, et al. Small-bowel capsule endoscopy and device-assisted enteroscopy for diagnosis and treatment of small-bowel disorders: European Society of Gastrointestinal Endoscopy (ESGE) Technical Review. Endoscopy. 2018;50:423–446.
60. Leenhardt R, Li C, Koulaouzidis A, et al. Nomenclature and semantic description of vascular lesions in small bowel capsule endoscopy: an international Delphi consensus statement. Endosc Int Open. 2019;7:E372–E379.
61. Becq A, Rahmi G, Perrod G, et al. Hemorrhagic angiodysplasia of the digestive tract: pathogenesis, diagnosis, and management. Gastrointest Endosc. 2017;86:792–806.
62. Goenka MK, Majumder S, Kumar S, et al. Single center experience of capsule endoscopy in patients with obscure gastrointestinal bleeding. World J Gastroenterol. 2011;17:774–778.
63. Iakovidis DK, Koulaouzidis A. Automatic lesion detection in capsule endoscopy based on color saliency: closer to an essential adjunct for reviewing software. Gastrointest Endosc. 2014;80:877–883.
64. Winawer SJ, Zauber AG, Ho MN, et al. Prevention of colorectal cancer by colonoscopic polypectomy. The National Polyp Study Workgroup. N Engl J Med. 1993;329:1977–1981.
65. Anderson R, Burr NE, Valori R. Causes of post-colonoscopy colorectal cancers based on World Endoscopy Organization System of Analysis. Gastroenterology. 2020;158:1287–1299.
66. Yamada M, Sakamoto T, Otake Y, et al. Investigating endoscopic features of sessile serrated adenomas/polyps by using narrow-band imaging with optical magnification. Gastrointest Endosc. 2015;82:108–117.
67. Chandrasekhara V, Desilets D, Falk GW, et al. The American Society for Gastrointestinal Endoscopy PIVI (Preservation and Incorporation of Valuable Endoscopic Innovations) on peroral endoscopic myotomy. Gastrointest Endosc. 2015;81:1087–1100.
68. Wang P, Xiao X, Glissen Brown JR, et al. Development and validation of a deep-learning algorithm for the detection of polyps during colonoscopy. Nat Biomed Eng. 2018;2:741–748.
69. Liu W-N, Zhang Y-Y, Bian X-Q, et al. Study on detection rate of polyps and adenomas in artificial-intelligence-aided colonoscopy. Saudi J Gastroenterol. 2020;26:13–19.
70. Tanaka S, Sano Y. Aim to unify the narrow band imaging (NBI) magnifying classification for colorectal tumors: current status in Japan from a summary of the consensus symposium in the 79th Annual Meeting of the Japan Gastroenterological Endoscopy Society. Dig Endosc. 2011;23(suppl 1):131–139.
71. Hewett DG, Kaltenbach T, Sano Y, et al. Validation of a simple classification system for endoscopic diagnosis of small colorectal polyps using narrow-band imaging. Gastroenterology. 2012;143:599–607.
72. Kuiper T, Marsman WA, Jansen JM, et al. Accuracy for optical diagnosis of small colorectal polyps in nonacademic settings. Clin Gastroenterol Hepatol. 2012;10:1016–1020.
73. Rex DK. Narrow-band imaging without optical magnification for histologic analysis of colorectal polyps. Gastroenterology. 2009;136:1174–1181.
74. Shahidi N, Rex DK, Kaltenbach T, et al. Use of endoscopic impression, artificial intelligence, and pathologist interpretation to resolve discrepancies between endoscopy and pathology analyses of diminutive colorectal polyps. Gastroenterology. 2020;158:783–785.
75. Corley DA, Levin TR, Doubeni CA. Adenoma detection rate and risk of colorectal cancer and death. N Engl J Med. 2014;370:2541.
76. Rogart JN, Siddiqui UD, Jamidar PA, et al. Fellow involvement may increase adenoma detection rates during colonoscopy. Am J Gastroenterol. 2008;103:2841–2846.
77. Hornbrook MC, Goshen R, Choman E, et al. Correction to: early colorectal cancer detected by machine learning model using gender, age, and complete blood count data. Dig Dis Sci. 2018;63:270.
78. Hilsden RJ, Heitman SJ, Mizrahi B, et al. Prediction of findings at screening colonoscopy using a machine learning algorithm based on complete blood counts (ColonFlag). PLoS One. 2018;13:e0207848.
Keywords:

artificial intelligence; gastrointestinal diseases; endoscopy; detection; diagnosis

Copyright © 2020 The Author(s). Published by Wolters Kluwer Health, Inc.