Yael Shapira-Galitz MD, Galia Karp MD, Oded Cohen MD, Doron Halperin MD MHA, Yonatan Lahav MD and Nimrod Adi MD
Background: Nasal device-related pressure ulcers are rarely addressed in the literature.
Objective: To assess the prevalence and severity of cutaneous and mucosal nasogastric tube (NGT)-associated pressure ulcers (PU) in critically ill patients and to define predictors for their formation.
Methods: A single-center observational study of intensive care unit patients with an NGT in place for more than 48 hours was conducted. Nasal skin was evaluated for PU. Ulcers were graded according to their depth. Consenting patients underwent a nasoendoscopic examination to evaluate intranasal mucosal injury.
Results: The study comprised 50 patients, 17 of whom underwent nasoendoscopic examination. Mean time of NGT presence in the nose was 11.3 ± 6.17 days. All patients had some degree of extranasal PU, 46% were low grade and 54% were high grade. Predictors for high grade extranasal PU compared to low grade PU were higher peak Sepsis-related Organ Failure Assessment (SOFA) scores (11.52 vs. 8.87, P = 0.009), higher peak C-reactive protein (CRP) levels (265.3 vs. 207.58 mg/L, P = 0.008), and bacteremia (33.3% vs. 8.7%, P = 0.037). The columella was the anatomical site most commonly involved and the most severely affected. The number of intranasal findings and their severity were significantly higher in the nasal cavity containing the NGT compared to its contralateral counterpart (P = 0.039 for both).
Conclusions: NGTs cause injury to nasal skin and mucosa in critically ill patients. Patients with bacteremia, high CRP, and high SOFA scores are at risk for severe ulcers, warranting special monitoring and preventive measures.
Ori Samuel Duek MD BSBME, Yeela Ben Naftali MD, Yaron Bar-Lavie MD, Hany Bahouth MD and Yehuda Ullmann MD
Background: Pneumonia is a major cause of morbidity and mortality in burn patients with inhalation injuries. An increased risk of pneumonia has been demonstrated in trauma and burn patients urgently intubated in the field compared with those intubated in emergency departments (EDs).
Objectives: To compare intubation setting (field vs. ED) and subsequent development of pneumonia in burn patients and to evaluate the indication for urgent intubation outside the hospital setting.
Methods: A retrospective medical records review was conducted on all intubated patients presenting with thermal (study group, 118 patients) or trauma (control group A, 74 patients) injuries and admitted to the intensive care unit of a level I trauma and burn center at a single institution during a 15-year period. Control group B (50 patients) included non-intubated facial burn patients hospitalized in the plastic surgery department.
Results: Field intubation was less frequent overall (37% field vs. 63% ED), although it was more frequent in larger burns (total body surface area > 50%; 43% field vs. 27% ED). More field-intubated patients developed pneumonia during hospitalization (65% field vs. 36% ED [burns]; 81% field vs. 45% ED [multi-trauma]; 2% non-intubated, P < 0.05), with significantly higher all-cause mortality (49% field vs. 24% ED, P < 0.05) and dramatically lower rates of extubation within 3 days (7% field vs. 27% ED, P < 0.05).
Conclusions: Field intubation is associated with a higher risk of subsequent development of pneumonia in burn and multi-trauma patients and should be applied with caution, only when airway patency is at immediate risk.
Daphna Katz-Talmor B Med Sc, Shaye Kivity MD, Miri Blank PhD, Itai Katz B Med Sc, Ori Perry BS, Alexander Volkov MD, Iris Barshack MD, Howard Amital MD MHA and Yehuda Shoenfeld MD FRCP MACR
Anca Leibovici MD, Rivka Sharon MSc and David Azoulay PhD
Background: Brain-derived neurotrophic factor (BDNF) is a neuronal growth factor that is important for the development, maintenance, and repair of the peripheral nervous system. The BDNF gene commonly carries a single nucleotide polymorphism (Val66Met-SNP), which affects the cellular distribution and activity-dependent secretion of BDNF in neuronal cells.
Objectives: To examine the association between the BDNF Val66Met-SNP and a predisposition to the development of chemotherapy-induced peripheral neuropathy in an Israeli cohort of patients with breast cancer treated with paclitaxel.
Methods: Peripheral neuropathy symptoms were assessed and graded at baseline (before beginning treatment) and during the treatment protocol in 35 patients, using the reduced version of the Total Neuropathy Score (TNSr). Allelic discrimination of the BDNF polymorphism was determined in the patients' peripheral blood using established polymerase chain reaction and Sanger sequencing methods.
Results: We found Val/Val in 20 patients (57.14%), Val/Met in 15 patients (42.86%), and Met/Met in none of the patients (0%). Baseline TNSr scores were higher in Met-BDNF patients compared to Val-BDNF patients. The maximal TNSr scores that developed during the follow-up in Met-BDNF patients were higher than in Val-BDNF patients. However, exclusion of patients with pre-existing peripheral neuropathy from the analysis resulted in equivalent maximal TNSr scores in Met-BDNF and Val-BDNF patients.
Conclusions: These observations suggest that the BDNF Val66Met-SNP has no detectable effect on the peripheral neuropathy induced by paclitaxel. The significance of BDNF Val66Met-SNP in pre-existing peripheral neuropathy-related conditions, such as diabetes, should be further investigated.
Sorel Goland MD, Irena Fugenfirov MD, Igor Volodarsky MD, Hadass Aronson MD, Liaz Zilberman MD, Sara Shimoni MD and Jacob George MD
Background: Early identification of patients with a likelihood of cardiac improvement has important implications for management strategies.
Objectives: To evaluate whether tissue Doppler imaging (TDI) and two-dimensional (2D) strain measures may predict left ventricular (LV) improvement in patients with recent onset dilated cardiomyopathy (ROCM).
Methods: Clinical evaluation and comprehensive echocardiography were performed at baseline and at 6 months. Patients who achieved an increase of ≥ 10 LV ejection fraction (LVEF) units and LV reverse remodeling (LVRR) (group 1), and those who improved beyond the device threshold, achieving LVEF ≥ 0.40 (group 2), were compared to patients who did not improve to this level.
Results: Among 37 patients with ROCM (mean age 56.3 ± 12.9 years and LVEF 29.1 ± 7.0%), 48% achieved LVEF ≥ 0.40 and 37.8% demonstrated LVRR. Patients with LVEF improvement to ≥ 0.40 presented at diagnosis with higher LVEF (P = 0.006), smaller LV end-diastolic diameter (LVEDd) (P = 0.04), higher E’ septal (P = 0.02), lower E/E’ ratio (P = 0.02), increased circumferential strain (P = 0.04), and apical rotation (P = 0.009). Apical rotation and LVEDd were found to be independent predictors of LVRR. End-systolic LV volume was a significant predictor of LVEF improvement (≥ 0.40).
Conclusions: Nearly half of the patients with ROCM demonstrated cardiac function improvement beyond the device threshold by 6 months. Apical rotation, introduced in our study as a 2D strain prognostic parameter, was found to be an independent predictor of LVRR. LV size and volume were predictors of LV improvement.
Dvir Shalem, Asaf Shemer, Ora Shovman MD, Yehuda Shoenfeld MD FRCP MACR and Shaye Kivity MD
Background: Guillain-Barré syndrome (GBS) is an autoimmune disease of the peripheral nervous system with a typical presentation of acute paralysis and hyporeflexia. Intravenous immunoglobulin (IVIG) and plasma exchange (PLEX) are treatments that have proven to expedite recuperation and recovery of motor function.
Objectives: To describe our experience at one tertiary medical center treating GBS with IVIG and to compare the efficacy of IVIG as the sole treatment versus combined therapy of IVIG and plasma exchange.
Methods: We reviewed the records of all patients diagnosed with GBS and treated with IVIG at the Sheba Medical Center from 2007 to 2015 and collected data on patient demographics, disease onset and presentation, and treatments delivered. The motor disability grading scale (MDGS) was used to evaluate the motor function of each patient through the various stages of the disease and following therapy.
Results: MDGS improvement from admission until discharge was statistically significant (P < 0.001), as was the recovery of motor function at the 3- and 12-month follow-ups compared to the status at the nadir of the disease. The effectiveness of second-line treatment with IVIG following PLEX failure, and vice versa, was not statistically significant (P > 0.15).
Conclusions: The majority of patients included in this study experienced significant and rapid improvement of GBS following treatment with IVIG. Combined therapy of PLEX and IVIG was not proven effective in patients in whom the first-line treatment failed.
Hadas Ganer Herman MD, Zviya Kogan MD, Amran Dabas MD, Ram Kerner MD, Hagit Feit MD, Shimon Ginath MD, Jacob Bar MD MSc and Ron Sagiv MD
Background: Different clinical and sonographic parameters have been suggested to identify patients with retained products of conception. In suspected cases, the main treatment is hysteroscopic removal.
Objectives: To compare clinical, sonographic, and intraoperative findings in cases of hysteroscopy for retained products of conception, according to histology.
Methods: The results of operative hysteroscopies that were conducted between 2011 and 2016 for suspected retained products of conception were evaluated. Material was obtained and evaluated histologically. The positive histology group (n=178) included cases with confirmed trophoblastic material. The negative histology group (n=26) included cases with non-trophoblastic material.
Results: Patient demographics were similar between the groups, and both underwent operative hysteroscopy an average of 7 to 8 weeks after delivery/abortion. A history of vaginal delivery was more common among the positive histology group. The main presenting symptom in all study patients was vaginal bleeding, and the majority of cases were diagnosed at the routine postpartum/abortion follow-up visit. Sonographic parameters were similar between the groups. Intraoperatively, the performing surgeon was significantly more likely to identify true trophoblastic tissue as such than to correctly identify non-trophoblastic tissue (P < 0.001).
Conclusions: Suspected retained trophoblastic material cannot be accurately differentiated from non-trophoblastic material according to clinical, sonographic, and intraprocedural criteria. Thus, hysteroscopy seems warranted in suspected cases.
Eviatar Nesher MD, Marius Braun MD, Sigal Eizner MD, Assaf Issachar MD, Michal Cohen MD, Amir Shlomai MD PhD, Michael Gurevich MD, Ran Tur-Kaspa MD and Eytan Mor MD
Background: The lack of organs for liver transplantation has prompted transplant professionals to study potential solutions, such as the use of livers from donors older than 70 years. This strategy is not widely accepted because of the potential risks of vascular and biliary complications and of hepatitis C recurrence.
Objectives: To examine the efficacy and safety of liver grafts from older donors for transplantation.
Methods: A retrospective analysis of data on 310 adults who underwent deceased donor liver transplantation between 2005 and 2015 was conducted. We compared graft and recipient survival, as well as major complications, of transplants performed with grafts from donors younger than 70 years (n=265, control group) and those older than 70 years (n=45, older-donor group), followed by multivariate analysis to identify risk factors.
Results: There was no significant difference between the control and older-donor groups in recipient survival at 1, 5, and 10 years (79.5% vs. 73.3%, 68.3% vs. 73.3%, and 59.2% vs. 66.7%, respectively) or in graft survival (74.0% vs. 71.0%, 62.7% vs. 71.0%, and 54.8% vs. 64.5%, respectively). The rate of biliary and vascular complications was similar in both groups. Significant risk factors for graft failure were hepatitis C (hazard ratio [HR] = 1.92, 95% confidence interval [95%CI] 1.16–2.63), older donor age (HR = 1.02, 95%CI 1.007–1.031), and male gender of the recipient (HR = 1.65, 95%CI 1.06–2.55).
Conclusions: Donor age affects liver graft survival. However, grafts from donors older than 70 years may be equally safe if cold ischemia time is maintained below 8 hours.
Tzvika Porges MD, Tali Shafat MD, Iftach Sagy MD, Lior Zeller MD, Carmi Bartal MD, Tamara Khutarniuk MD, Alan Jotkowitz MD and Leonid Barski MD
Background: Erythema nodosum (EN) is the most common type of panniculitis and is frequently secondary to infectious diseases.
Objectives: To elucidate the causative factors and the clinical presentation of patients with EN (2004–2014) and to compare their data to those reported in a previous study.
Methods: A retrospective study was conducted of all patients diagnosed with EN who were hospitalized at Soroka University Medical Center (2004–2014). The clinical, demographic, and laboratory characteristics of the patients were compared to those in a cohort of patients diagnosed with EN from 1973–1982.
Results: The study comprised 45 patients with a diagnosis of EN. The most common symptom of patients hospitalized with EN was arthritis or arthralgia (27% of patients). Patients with EN, compared to those reported in 1987, had significantly lower rates of fever (18% vs. 62%, P < 0.001), streptococcal infection (16% vs. 44%, P = 0.003), and joint involvement (27% vs. 66%, P < 0.001). In addition, fewer patients had idiopathic causes of EN (9% vs. 32%, P = 0.006).
Conclusions: In recent decades, clinical, epidemiological, and etiological changes have occurred in EN patients. The lower rates of fever, streptococcal infection, and joint involvement in patients with EN are probably explained by improvements in socioeconomic conditions. The significantly lower rate of idiopathic causes of EN is possibly due to the greater diagnostic accuracy of modern medicine. The results of the present study demonstrate the impact of improvements in socioeconomic conditions and access to healthcare on disease presentation.
Raviv Allon BSc, Yahav Levy MD, Idit Lavi MA, Aviv Kramer MD, Menashe Barzilai MD and Ronit Wollstein MD
Because fragility fractures have an enormous impact on the practice of medicine and global health systems, effective screening is imperative. Currently, dual-energy X-ray absorptiometry (DXA), which has limited ability to predict fractures, is being used. We evaluated the current literature for a method that may constitute a better screening method to predict fragility fractures. A systematic review of the literature was conducted on computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound to evaluate screening methods to predict fragility fractures. We found that ultrasound had sufficient data on fracture prediction to perform meta-analysis; therefore, we analyzed prospective ultrasound cohort studies. Six study populations, consisting of 29,299 individuals (87,296 person-years of observation) and including 992 fractures, were analyzed. MRI was found to be sensitive and specific for osteoporosis, but its use for screening has not been sufficiently evaluated, and more research is needed on cost, accessibility, technical challenges, and sensitivity and specificity. CT could predict fracture occurrence; however, it may be problematic for screening due to cost, exposure to radiation, and availability. Ultrasound was found to predict fracture occurrence, with an increased fracture risk of 1.45 (95% confidence interval 1.21–1.73). Ultrasound has not replaced DXA as a screening tool for osteoporosis, perhaps due to operator-dependency and difficulty in standardization of testing.
Kassem Sharif MD, Louis Coplan MD, Benjamin Lichtbroun MD and Howard Amital MD MHA