Daphna Katz-Talmor B Med Sc, Shaye Kivity MD, Miri Blank PhD, Itai Katz B Med Sc, Ori Perry BS, Alexander Volkov MD, Iris Barshack MD, Howard Amital MD MHA, Yehuda Shoenfeld MD FRCP MaACR
Sorel Goland MD, Irena Fugenfirov MD, Igor Volodarsky MD, Hadass Aronson MD, Liaz Zilberman MD, Sara Shimoni MD and Jacob George MD
Background: Early identification of patients with a likelihood of cardiac improvement has important implications for management strategies.
Objectives: To evaluate whether tissue Doppler imaging (TDI) and two-dimensional (2D) strain measures may predict left ventricular (LV) improvement in patients with recent onset dilated cardiomyopathy (ROCM).
Methods: Clinical evaluation and comprehensive echocardiography were performed at baseline and at 6 months. Patients who achieved an increase of ≥ 10 LV ejection fraction (LVEF) units with LV reverse remodeling (LVRR) (group 1), and those who improved beyond the device threshold, achieving LVEF ≥ 0.40 (group 2), were compared to patients who did not improve to these levels.
Results: Among 37 patients with ROCM (mean age 56.3 ± 12.9 years, LVEF 29.1 ± 7.0%), 48% achieved LVEF ≥ 0.40 and 37.8% demonstrated LVRR. Patients whose LVEF improved to ≥ 40% presented at diagnosis with higher LVEF (P = 0.006), smaller LV end-diastolic diameter (LVEDd) (P = 0.04), higher septal E’ (P = 0.02), lower E/E’ ratio (P = 0.02), increased circumferential strain (P = 0.04), and greater apical rotation (P = 0.009). Apical rotation and LVEDd were found to be independent predictors of LVRR. End-systolic LV volume was a significant predictor of LVEF improvement to ≥ 40%.
Conclusions: Nearly half of the patients with ROCM demonstrated improvement of cardiac function beyond the device threshold by 6 months. Apical rotation, introduced in our study as a 2D strain prognostic parameter, was found to be an independent predictor of LVRR. LV size and volume were predictors of LV improvement.
Eviatar Nesher MD, Marius Braun MD, Sigal Eizner MD, Assaf Issachar MD, Michal Cohen MD, Amir Shlomai MD PhD, Michael Gurevich MD, Ran Tur-Kaspa MD and Eytan Mor MD
Background: The lack of organs for liver transplantation has prompted transplant professionals to study potential solutions, such as the use of livers from donors older than 70 years. This strategy is not widely accepted because of the potential risks of vascular and biliary complications and of hepatitis C recurrence.
Objectives: To examine the efficacy and safety of liver grafts from older donors for transplantation.
Methods: A retrospective analysis of data on 310 adults who underwent deceased donor liver transplantation between 2005 and 2015 was conducted. We compared graft and recipient survival, as well as major complications, of transplants performed with grafts from donors younger than 70 years (n=265, control group) and those older than 70 years (n=45, older-donor group), followed by multivariate analysis to identify risk factors.
Results: There was no significant difference between the control and older-donor groups in recipient survival at 1, 5, and 10 years (79.5% vs. 73.3%, 68.3% vs. 73.3%, and 59.2% vs. 66.7%, respectively) or in graft survival (74.0% vs. 71.0%, 62.7% vs. 71.0%, and 54.8% vs. 64.5%, respectively). The rates of biliary and vascular complications were similar in both groups. Significant risk factors for graft failure were hepatitis C (hazard ratio [HR] = 1.92, 95% confidence interval [95%CI] 1.16–2.63), older donor age (HR = 1.02, 95%CI 1.007–1.031), and male gender of the recipient (HR = 1.65, 95%CI 1.06–2.55).
Conclusion: Donor age affects liver graft survival. However, grafts from donors older than 70 years may be equally safe if cold ischemia time is maintained below 8 hours.
Tzvika Porges MD, Tali Shafat MD, Iftach Sagy MD, Lior Zeller MD, Carmi Bartal MD, Tamara Khutarniuk MD, Alan Jotkowitz MD and Leonid Barski MD
Background: Erythema nodosum (EN) is the most common type of panniculitis, commonly secondary to infectious diseases.
Objectives: To elucidate the causative factors and the clinical presentation of patients with EN (2004–2014) and to compare their data to those reported in a previous study.
Methods: A retrospective study was conducted of all patients diagnosed with EN who were hospitalized at Soroka University Medical Center (2004–2014). The clinical, demographic, and laboratory characteristics of the patients were compared to those in a cohort of patients diagnosed with EN from 1973–1982.
Results: The study comprised 45 patients with a diagnosis of EN. The most common symptoms of patients hospitalized with EN were arthritis or arthralgia (27% of patients). Compared to the cohort reported in 1987, patients with EN had significantly lower rates of fever (18% vs. 62%, P < 0.001), streptococcal infection (16% vs. 44%, P = 0.003), and joint involvement (27% vs. 66%, P < 0.001). In addition, fewer patients had idiopathic causes of EN (9% vs. 32%, P = 0.006).
Conclusions: In the past decades, clinical, epidemiological, and etiological changes have occurred in EN patients. The lower rates of fever, streptococcal infection, and joint involvement in patients with EN are probably explained by improvements in socioeconomic conditions. The significantly lower rate of idiopathic causes of EN is possibly due to the greater diagnostic accuracy of modern medicine. The results of the present study demonstrate the impact of improvements in socioeconomic conditions and access to healthcare on disease presentation.
Raviv Allon BSc, Yahav Levy MD, Idit Lavi MA, Aviv Kramer MD, Menashe Barzilai MD and Ronit Wollstein MD
Because fragility fractures have an enormous impact on the practice of medicine and global health systems, effective screening is imperative. Currently, dual-energy X-ray absorptiometry (DXA), which has limited ability to predict fractures, is being used. We evaluated the current literature for a method that may constitute a better screening method to predict fragility fractures. A systematic review of the literature was conducted on computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound to evaluate screening methods to predict fragility fractures. We found that ultrasound had sufficient data on fracture prediction to perform meta-analysis; therefore, we analyzed prospective ultrasound cohort studies. Six study populations, consisting of 29,299 individuals (87,296 person-years of observation) and including 992 fractures, were analyzed. MRI was found to be sensitive and specific for osteoporosis, but its use for screening has not been sufficiently evaluated, and more research is needed on cost, accessibility, technical challenges, and sensitivity and specificity. CT could predict fracture occurrence; however, it may be problematic for screening due to cost, exposure to radiation, and availability. Ultrasound was found to predict fracture occurrence, with an increased risk of fracture of 1.45 (95% confidence interval 1.21–1.73). Ultrasound has not replaced DXA as a screening tool for osteoporosis, perhaps due to operator dependency and difficulty in standardization of testing.
Yuval Raveh MD, Tawfik Khoury MD, Moshe Lachish MD, Rifaat Safadi MD and Yoav Kohn MD