European Journal of Gastroenterology & Hepatology
doi: 10.1097/MEG.0b013e32835cb864
Paper alert

Section Editor(s): Hayes, Peter; Plevris, John

The Royal Infirmary, Edinburgh EH3 9YW, UK

A selection of interesting papers that were published in the month before our press date in major journals likely to report important results in gastroenterology and hepatology.

Hormone therapy increases risk of ulcerative colitis but not Crohn's disease

Hamed Khalili, Leslie M. Higuchi, Ashwin N. Ananthakrishnan, JoAnn E. Manson, Diane Feskanich, James M. Richter, Charles S. Fuchs, Andrew T. Chan

Background and aims: Estrogen has been proposed to modulate gut inflammation through an effect on estrogen receptors found on gastrointestinal epithelial and immune cells. The influence of postmenopausal hormone therapy on the risk of Crohn's disease (CD) and ulcerative colitis (UC) is unclear.

Methods: We conducted a prospective cohort study of 108 844 postmenopausal US women (median age, 54 years) enrolled in 1976 in the Nurses’ Health Study without a prior history of CD or UC. Every 2 years, we have updated information on menopause status, postmenopausal hormone use, and other risk factors. Self-reported diagnoses of CD and UC were confirmed through medical record review by 2 gastroenterologists who were blinded to exposure information. We used Cox proportional hazards models to calculate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs).

Results: Through 2008, with more than 1.8 million person-years of follow-up, we documented 138 incident cases of CD and 138 cases of UC. Compared with women who never used hormones, the multivariate-adjusted HR for UC was 1.71 (95% CI, 1.07–2.74) among women who currently used hormones and 1.65 (95% CI, 1.03–2.66) among past users. The risk of UC appeared to increase with longer duration of hormone use (P for trend=0.04) and to decrease with time since discontinuation. There was no difference in risk according to the type of hormone therapy used (estrogen vs estrogen plus progestin). In contrast, we did not observe an association between current use of hormones and risk of CD (multivariate-adjusted HR, 1.19; 95% CI, 0.78–1.82). The effect of hormones on risk of UC and CD was not modified by age, body mass index, or smoking.

Conclusion: In a large prospective cohort of women, postmenopausal hormone therapy was associated with an increased risk of UC but not CD. These findings indicate that pathways related to estrogens might mediate the pathogenesis of UC.

Gastroenterology 2012; 143(5):1199–1206.
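For readers who want to see how adjusted hazard ratios and 95% confidence intervals of the kind quoted above are typically obtained, the minimal sketch below fits a Cox proportional hazards model with the Python lifelines library. The data are synthetic and the column names (current_hormone_use, bmi, smoker, follow_up_years, uc_diagnosis) are hypothetical, not taken from the Nurses' Health Study.

# Minimal sketch: adjusted hazard ratio (HR) and 95% CI from a Cox model.
# Synthetic data; column names are hypothetical, not from the study above.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "current_hormone_use": rng.integers(0, 2, n),  # exposure of interest
    "bmi": rng.normal(26, 4, n),                   # example covariate
    "smoker": rng.integers(0, 2, n),               # example covariate
})
# Simulate follow-up time and UC diagnosis with a modest effect of the exposure.
hazard = 0.01 * np.exp(0.5 * df["current_hormone_use"])
df["follow_up_years"] = rng.exponential(1 / hazard)
df["uc_diagnosis"] = (df["follow_up_years"] < 30).astype(int)
df["follow_up_years"] = df["follow_up_years"].clip(upper=30)  # administrative censoring

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="uc_diagnosis")
# exp(coef) is the covariate-adjusted HR; the summary also holds its 95% CI.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])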

Effect of azithromycin on acid reflux, hiatus hernia and proximal acid pocket in the postprandial period

W.O. Rohof, R.J. Bennink, A.A. de Ruigh, D.P. Hirsch, A.H. Zwinderman, G.E. Boeckxstaens

Background: The risk for acidic reflux is mainly determined by the position of the gastric acid pocket. It was hypothesised that compounds affecting proximal stomach tone might reduce gastro-oesophageal reflux by changing the acid pocket position.

Objective: To study the effect of azithromycin (Azi) on acid pocket position and acid exposure in patients with gastro-oesophageal reflux disease (GORD).

Methods: Nineteen patients with GORD were included, of whom seven had a large hiatal hernia (≥3 cm) (L-HH) and 12 had a small or no hiatal hernia (S-HH). Patients were randomised to Azi 250 mg/day or placebo for 3 days in a crossover manner. On each study day, reflux episodes were detected using concurrent high-resolution manometry and pH-impedance monitoring after a standardised meal. The acid pocket was visualised using scintigraphy, and its position was determined relative to the diaphragm.

Results: Azi reduced the number of acid reflux events (placebo 8.0±2.2 vs. Azi 5.6±1.8, P<0.01) and postprandial acid exposure (placebo 10.5±3.8% vs. Azi 5.9±2.5%, P<0.05) in all patients without affecting the total number of reflux episodes. Acid reflux occurred mainly when the acid pocket was located above, or at the level of, the diaphragm, rather than below the diaphragm. Treatment with Azi reduced hiatal hernia size and resulted in a more distal position of the acid pocket compared with placebo (below the diaphragm 39% vs. 29%, P=0.03). Azi reduced the rate of acid reflux episodes in patients with S-HH (38% to 17%) to a greater extent than in patients with L-HH (69% to 62%, P=0.04).

Conclusion: Azi reduces acid reflux episodes and oesophageal acid exposure. This effect was associated with a smaller hiatal hernia size and a more distal position of the acid pocket, further indicating the importance of the acid pocket in the pathogenesis of GORD.

Gut 2012; 61:1670–1677. DOI: 10.1136/gutjnl-2011-300926.

Diagnostic yield of small-bowel capsule endoscopy in patients with iron-deficiency anemia: a systematic review

Anastasios Koulaouzidis, Emanuele Rondonotti, Andry Giannakou, John N. Plevris

Background: Iron-deficiency anemia (IDA) is the most common cause of anemia worldwide. Current guidelines recommend the use of small-bowel capsule endoscopy (SBCE) in IDA. Evidence of the validity of SBCE in patients with IDA alone is still limited.

Objective: To assess the diagnostic yield (DY) of SBCE in IDA by pooling data from relevant studies.

Design: Systematic review and meta-analysis. Fixed-effects or random-effects models were used as appropriate.

Setting: Studies that estimated the DY of SBCE in IDA were identified. Two investigators independently conducted the search and data extraction.

Patients: A total of 24 studies enrolling 1960 patients with IDA who underwent SBCE were included.

Main outcome measurements: Per-patient DY, with 95% confidence intervals. Subgroup analysis was also performed.

Results: The pooled DY of SBCE in IDA, evaluated by a random-effects model, was 47% (95% CI, 42–52%), but there was statistically significant heterogeneity among the included studies (inconsistency index [I²]=78.8%, P<0.0001). The pooled DY of SBCE in studies focused solely on patients with IDA (subset 1, 4 studies) was 66.6% (95% CI, 61.0–72.3%; I²=44.3%); conversely, that of studies not focusing only on IDA patients (subset 2, 20 studies) was 44% (95% CI, 39–48%; I²=64.9%). In particular, more vascular (31% vs. 22.6%, P=0.007), inflammatory (17.8% vs. 11.3%, P=0.009), and mass/tumor (7.95% vs. 2.25%, P<0.0001) lesions were detected with SBCE in patients participating in the studies in subset 1.

Limitations: Heterogeneity of studies, retrospective design, and selection bias.

Conclusion: This analysis demonstrates the validity of SBCE in the investigation of patients with IDA and negative findings on a previous diagnostic workup, although certain factors such as heterogeneity and quality of the included studies should be taken into account.

Gastrointestinal Endoscopy 2012; 76(5):983–992.
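As an illustration of how a pooled diagnostic yield and the inconsistency index (I²) reported above are derived, the sketch below performs a simple DerSimonian-Laird random-effects pooling of per-study proportions in plain Python. The (events, patients) pairs are invented for the example and do not correspond to the 24 studies in the review.

# Illustrative DerSimonian-Laird random-effects pooling of proportions with I^2.
# The (events, patients) pairs below are invented, not the studies in the review.
import numpy as np

studies = [(40, 90), (55, 130), (30, 50), (70, 160), (25, 60)]  # (positive SBCE, n)
events = np.array([e for e, n in studies], dtype=float)
sizes = np.array([n for e, n in studies], dtype=float)

p = events / sizes                       # per-study diagnostic yield
var = p * (1 - p) / sizes                # within-study variance of a proportion
w = 1 / var                              # fixed-effect (inverse-variance) weights

p_fixed = np.sum(w * p) / np.sum(w)
q = np.sum(w * (p - p_fixed) ** 2)       # Cochran's Q
dof = len(studies) - 1
i_squared = max(0.0, (q - dof) / q) * 100

tau2 = max(0.0, (q - dof) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))  # between-study variance
w_star = 1 / (var + tau2)                # random-effects weights
p_random = np.sum(w_star * p) / np.sum(w_star)
se = np.sqrt(1 / np.sum(w_star))
ci = (p_random - 1.96 * se, p_random + 1.96 * se)

print(f"pooled DY = {p_random:.1%} (95% CI {ci[0]:.1%} to {ci[1]:.1%}), I^2 = {i_squared:.1f}%")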

Colorectal polypectomy during insertion and withdrawal or only during withdrawal? A randomized controlled trial

S.M. Wildi, A.M. Schoepfer, S.R. Vavricka, H. Fruehauf, E. Safroneeva, N. Wiegand, P. Bauerfeind, M. Fried

Background and study aims: Removal of colorectal polyps is routinely performed during withdrawal of the endoscope. However, polyps detected during insertion of the colonoscope may be missed at withdrawal. We aimed to evaluate whether polypectomy during both insertion and withdrawal increases polyp detection and removal rates compared with polypectomy at withdrawal only, and to assess the duration of both approaches.

Patients and methods: Patients were included in the study when the first polyp was detected and randomized into two groups: in group A, polyps ≤10 mm in diameter were removed during insertion and withdrawal of the colonoscope, while in group B, these polyps were removed at withdrawal only. Main outcome measures were duration of colonoscopy, number of polyps detected during insertion but not recovered during withdrawal, technical ease, patient discomfort, and complications.

Results: 150 patients were randomized to group A and 151 to group B. Mean (±standard deviation [SD]) duration of colonoscopy did not differ between the groups (30.8±15.6 min [A] vs. 28.5±13.8 min [B], P=0.176). In group A, 387 polyps (mean 2.58 per colonoscopy) were detected and removed, compared with 389 polyps detected (mean 2.58 per colonoscopy) in group B, of which 376 were removed; 13 polyps (mean size [SD] 3.2 [1.3] mm) were missed, affecting 7.3% of patients. Patient tolerance was similar in the two groups.

Conclusion: Removal of polyps ≤10 mm during withdrawal only is associated with a considerable polyp miss rate. We therefore recommend that these polyps be removed during both insertion and withdrawal.

Endoscopy 2012; 44(11):1019–1023. DOI: 10.1055/s-0032-1310237.

A novel and validated prognostic index in hepatocellular carcinoma: the inflammation based index (IBI)

David J. Pinato, Justin Stebbing, Mitsuru Ishizuka, Shahid A. Khan, Harpreet S. Wasan, Bernard V. North, Keiichi Kubota, Rohini Sharma

Background and aims: Outcome prediction is uniquely different in hepatocellular carcinoma (HCC) as the progressive functional impairment of the liver impacts patient survival independently of tumour stage. As chronic inflammation is associated with the pathogenesis of HCC, we explored the prognostic impact of a panel of inflammation-based scores, including the modified Glasgow Prognostic Score (mGPS), neutrophil to lymphocyte ratio (NLR) and platelet to lymphocyte ratio (PLR), in independent cohorts.

Methods: Inflammatory markers, Barcelona Clinic Liver Cancer (BCLC) and Cancer of the Liver Italian Program (CLIP) scores were studied in a training set of 112 patients with predominantly unresectable HCC (75%). Independent predictors of survival identified in multivariate analysis were validated in an independent cohort of 466 patients with an overall lower tumour burden (BCLC-A, 56%).

Results: In both training and validation sets, mGPS and CLIP scores emerged as independent predictors of overall survival. The predictive accuracy of the combined mGPS and CLIP score (c score 0.7, 95% CI, 0.6–0.8) appeared superior to that of the CLIP score alone (c score 0.6, 95% CI, 0.5–0.7).

Conclusion: Systemic inflammation as measured by the mGPS, independently predicts overall survival in HCC. We have validated a novel, easy to use inflammatory score that can be used to stratify individuals. These data enable formulation of a new prognostic system, the inflammation based index in HCC (IBI). Further validation of the IBI considering treatment allocation and survival is warranted in an independent patient cohort.

Journal of Hepatology 2012; 57(5):1013–1020. DOI: 10.1002/hep.25848.
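The c score quoted above is a concordance index, which measures how often a higher prognostic score goes with shorter survival. The sketch below shows one way such a statistic can be computed with the Python lifelines utility function; the survival times, censoring flags and score values are invented, not the study data.

# Minimal sketch: concordance index (c score) for a prognostic score.
# Survival times, censoring flags and index values below are invented.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(2)
n = 200
score = rng.normal(size=n)                                # e.g. a combined mGPS + CLIP index
surv_months = rng.exponential(24 * np.exp(-0.5 * score))  # higher score -> shorter survival
observed = rng.random(n) < 0.7                            # ~70% deaths observed, rest censored

# concordance_index expects durations, predicted scores and an event indicator;
# predictions should be higher for longer survival, so negate the risk score.
c = concordance_index(surv_months, -score, observed)
print(f"c score = {c:.2f}")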

Vitamin D binding protein gene polymorphisms and baseline vitamin D levels as predictors of antiviral response in chronic hepatitis C

Edmondo Falleti, Davide Bitetto, Carlo Fabris, Giovanna Fattovich, Annarosa Cussigh, Sara Cmet, Elisa Ceriani, Ezio Fornasiere, Michela Pasino, Donatella Ieluzzi, Mario Pirisi, Pierluigi Toniutto

Vitamin D deficiency seems to predict the unsuccessful achievement of sustained viral response (SVR) after antiviral treatment in hepatitis C virus (HCV) difficult-to-treat genotypes. Vitamin D binding protein (GC) gene polymorphisms are known to influence vitamin D levels. This study was performed to assess whether the interaction between basal circulating vitamin D and the GC polymorphism plays a role in influencing the rate of antiviral responses in patients affected by chronic hepatitis C. In all, 206 HCV patients treated with a combination therapy of pegylated (PEG)-interferon plus ribavirin were retrospectively evaluated. GC rs7041 G>T, GC rs4588 C>A, and IL-28B rs12979860 C>T polymorphisms were genotyped. Frequencies of GC rs7041 G>T and rs4588 C>A polymorphisms were: G/G=64 (31.1%), G/T=100 (48.5%), T/T=42 (20.4%) and C/C=108 (52.4%), C/A=84 (40.8%), A/A=14 (6.8%). Patients were divided into those carrying ≥3 major alleles (wildtype [WT]+: G-C/G-C, G-C/T-C, G-C/G-A, N=100) and the remainder (WT−: G-C/T-A, T-A/T-C, T-A/T-A, T-C/T-C, N=106). Four groups were identified: vitamin D ≤20 ng/ml and WT−, vitamin D ≤20 ng/ml and WT+, vitamin D >20 ng/ml and WT−, and vitamin D >20 ng/ml and WT+. In difficult-to-treat HCV genotypes, the proportion of patients achieving SVR significantly increased with a linear trend from the first to the last group: 6/25 (24.0%), 9/24 (37.5%), 12/29 (41.4%), 19/29 (65.5%) (P=0.003). At multivariate analysis, basal vitamin D >20 ng/ml together with carriage of GC WT+ was found to be an independent predictor of SVR (odds ratio 4.52, P=0.015).

Conclusion: In difficult-to-treat HCV genotypes, the combination of normal pretreatment serum vitamin D levels and carriage of the GC-globulin WT isoform strongly predicts the achievement of SVR after PEG-interferon plus ribavirin antiviral therapy.

Hepatology 2012; 56(5):1641–1650.

Development of an accurate index for predicting outcomes of patients with acute liver failure

Anna Rutherford, Lindsay Y. King, Linda S. Hynan, Chetan Vedvyas, Wenyu Lin, William M. Lee, Raymond T. Chung

Background and aims: Patients with acute liver failure (ALF) have high mortality and frequently require liver transplantation (LT); few reliable prognostic markers are available. Levels of M30, a caspase-generated cleavage product of cytokeratin-18, are significantly increased in serum samples from patients with ALF who die or undergo LT. We developed a prognostic index for ALF based on level of M30 and commonly measured clinical variables (called the Acute Liver Failure Study Group [ALFSG] index) and compared its accuracy with that of the King's College criteria (KCC) and Model for End-Stage Liver Disease (MELD). We also validated our model in an independent group of patients with ALF.

Methods: Serum levels of M30 and M65 antigen (the total cytokeratin-18 fragment, a marker of apoptosis and necrosis) were measured on 3 of the first 4 days following admission of 250 patients with ALF. Logistic regression was used to determine whether the following factors, measured on day 1, were associated with LT or death: age; etiology; coma grade; international normalized ratio (INR); serum pH; body mass index; levels of creatinine, bilirubin, phosphorus, arterial ammonia, and lactate; and log10 M30 and log10 M65. The area under the receiver operating characteristic curve (AUROC) was calculated for the ALFSG index and other indices.

Results: Coma grade, INR, levels of bilirubin and phosphorus, and log10 M30 value at study entry most accurately identified patients who would require LT or die. The ALFSG index identified these patients with 85.6% sensitivity and 64.7% specificity. Based on comparison of AUROC values, the ALFSG Index (AUROC, 0.822) better identified patients most likely to require LT or die than the KCC (AUROC, 0.654) or MELD (AUROC, 0.704) (P=0.0002 and P=0.0010, respectively). We validated these findings in a separate group of 250 patients with ALF.

Conclusion: The ALFSG index, a combination of clinical markers and measurements of the apoptosis biomarker M30, better predicts outcomes of patients with ALF than the KCC or MELD; ClinicalTrials.gov, number NCT00518440.

Gastroenterology 2012; 143(5):1237–1243.
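The ALFSG index is, in outline, a logistic regression model whose discrimination is summarised by the AUROC and by sensitivity and specificity at a chosen cut-off. The sketch below shows that general workflow with scikit-learn on synthetic data; the predictors echo the abstract, but the data, coefficients and cut-off are invented and are not the published ALFSG index.

# General workflow for a logistic-regression prognostic index with AUROC,
# sensitivity and specificity. Synthetic data; not the ALFSG coefficients.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.integers(1, 5, n),        # coma grade (1-4)
    rng.lognormal(0.5, 0.5, n),   # INR
    rng.lognormal(1.5, 0.8, n),   # bilirubin
    rng.normal(3.0, 1.0, n),      # phosphorus
    rng.normal(2.5, 0.4, n),      # log10 M30
])
# Simulate outcome (death or transplant) loosely driven by the predictors.
logit = -6 + 0.8 * X[:, 0] + 1.0 * X[:, 1] + 0.05 * X[:, 2] + 0.2 * X[:, 3] + 0.8 * X[:, 4]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]

auroc = roc_auc_score(y, risk)
pred = (risk >= 0.5).astype(int)                     # example cut-off
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"AUROC {auroc:.3f}, sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")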

Patients transplanted for nonalcoholic steatohepatitis are at increased risk for postoperative cardiovascular events

Lisa B. VanWagner, Manali Bhave, Helen S. Te, Joe Feinglass, Lisa Alvarez, Mary E. Rinella

Nonalcoholic steatohepatitis (NASH) is an independent predictor of coronary artery disease (CAD). Our aim was to compare the incidence of cardiovascular (CV) events between patients transplanted for NASH and alcohol (ETOH)-induced cirrhosis. This is a retrospective cohort study (August 1993 to March 2010) of 242 patients (115 NASH and 127 ETOH) with ≥12 months of follow-up after liver transplantation (LT). Those with hepatocellular carcinoma or coexisting liver diseases were excluded. Kaplan-Meier and Cox proportional hazards analyses were conducted to compare survival. Logistic regression was used to calculate the likelihood of CV events, defined as death from any cardiac cause, myocardial infarction, acute heart failure, cardiac arrest, arrhythmia, complete heart block, and/or stroke requiring hospitalization <1 year after LT. Patients in the NASH group were older (58.4 vs. 53.3 years) and were more likely to be female (45% vs. 18%; P<0.001). They were more likely to be morbidly obese (32% vs. 9%), have dyslipidemia (25% vs. 6%), or have hypertension (53% vs. 38%; P<0.01). On multivariate analysis, NASH patients were more likely to have a CV event <1 year after LT, compared with ETOH patients, even after controlling for recipient age, sex, smoking status, pretransplant diabetes, CV disease, and the presence of metabolic syndrome (26% vs. 8%; odds ratio=4.12; 95% confidence interval=1.91–8.90). The majority (70%) of events occurred in the perioperative period, and the occurrence of a CV event was associated with 50% overall mortality. However, there were no differences in patient, graft, or CV mortality between groups.

Conclusion: CV complications are common after LT, and NASH patients are at increased risk independent of traditional cardiac risk factors, though this did not affect overall mortality.

Hepatology 2012; 56(5):1741–1750. DOI: 10.1002/hep.25855.

© 2013 Lippincott Williams & Wilkins, Inc.
