
Search results


September 2018
Nir Lubezky MD, Ido Nachmany MD, Yaacov Goykhman MD, Yanai Ben-Gal MD, Yoram Menachem MD, Ravit Geva MD, Joseph M Klausner MD and Richard Nakache MD
Anna Gurevich-Shapiro MD MPhil, Yotam Pasternak MD and Jacob N. Ablin MD
August 2018
Einat Slonimsky, Osnat Konen, Elio Di Segni, Eliyahu Konen and Orly Goitein

Background: Correct diagnosis of cardiac masses is a challenge in clinical practice. Accurate identification and differentiation between cardiac thrombi and tumors is crucial because prognosis and appropriate clinical management vary substantially.

Objectives: To evaluate the diagnostic performances of cardiac magnetic resonance imaging (CMR) in differentiating between cardiac thrombi and tumors.

Methods: A retrospective review of a prospectively maintained database of all CMR scans was performed to distinguish between cardiac thrombi and tumors during a 10 year period in a single academic referral center (2004–2013). Cases with an available standard of reference for a definite diagnosis were included. Correlation of CMR differentiation between thrombi and tumors with an available standard of reference was performed. Sensitivity, specificity, negative predictive value (NPV), positive predictive value (PPV), and accuracy were reported.

Results: In this study, 101 consecutive patients underwent CMR for suspicious cardiac masses documented on transthoracic or transesophageal echocardiography. CMR did not detect any cardiac pathology in 17% (17/101); among the remaining 84 patients, anatomical variants and benign findings were detected in 18% (15/84). Of the remaining 69 patients, CMR diagnosis was correlated with histopathologic results in 74% (51/69), imaging follow-up in 22% (15/69), and a definite CMR diagnosis (lipoma) in 4% (3/69). For tumors, diagnostic accuracy, sensitivity, specificity, PPV, and NPV were 96.6%, 98%, 86.6%, 96.2%, and 96.6%, respectively. For thrombi, the results were 93.6%, 86.7%, 98.04%, 92.9%, and 97%, respectively.

Conclusions: CMR is highly accurate in differentiating cardiac thrombi from tumors and should be included in the routine evaluation of cardiac masses.
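The performance figures reported above (sensitivity, specificity, PPV, NPV, and accuracy) all derive from a standard 2×2 confusion matrix. A minimal sketch of the formulas, using hypothetical counts for illustration (the abstract does not report the underlying case breakdown):

```python
# Diagnostic performance metrics from a 2x2 confusion matrix.
# The counts passed below are hypothetical, chosen only to illustrate the formulas.
def diagnostic_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),                # true positive rate
        "specificity": tn / (tn + fp),                # true negative rate
        "ppv": tp / (tp + fp),                        # positive predictive value
        "npv": tn / (tn + fn),                        # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),  # overall agreement
    }

m = diagnostic_metrics(tp=45, fp=10, tn=40, fn=5)
print(m["sensitivity"], m["specificity"])  # 0.9 0.8
```

Each metric conditions on a different margin of the table: sensitivity and specificity on the true disease status, PPV and NPV on the test result.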

Salim Halabi MD, Awny Elias MD, Michael Goldberg MD, Hilal Hurani MD, Husein Darawsha MD, Sharon Shachar MA and Miti Ashkenazi RN MPH

Background: Door-to-balloon time (DTBT) ≤ 90 minutes has become an important quality indicator in the management of ST-elevation myocardial infarction (STEMI). We identified three specific problems in the course from arrival of STEMI patients at our emergency department (ED) to initiation of balloon inflation and devised an intervention comprising specific administrative and professional steps. The intervention focused on triage within the ED and on increasing the efficiency and accuracy of electrocardiography interpretation.

Objectives: To examine whether our intervention reduced the proportion of patients with DTBT > 90 minutes.

Methods: We compared DTBT of patients admitted to the ED with STEMI during the year preceding and the year following implementation of the intervention.

Results: Demographic and clinical characteristics at presentation to the ED were similar for patients admitted in the year preceding and the year following the intervention. In the year preceding the intervention, DTBT was > 90 minutes for 19/78 patients (24%); in the year after the intervention, DTBT was > 90 minutes for 17/102 patients (17%). For both years, the median DTBT was 1 hour. Patients with DTBT > 90 minutes tended to be older and more often female. Diagnoses in the ED were similar between those with DTBT ≤ 90 minutes and > 90 minutes. In-hospital mortality was 17% (13/78) and 14% (14/102) for the respective periods.

Conclusions: An intervention specifically designed to address problems identified at one medical center was shown to decrease the proportion of patients with DTBT > 90 minutes.

Avi Porath MD MPH, Jonathan Eli Arbelle MD MHA, Naama Fund, Asaf Cohen and Morris Mosseri MD FESC

Background: The salutary effects of statin therapy in patients with cardiovascular disease (CVD) are well established. Although generally considered safe, statin therapy has been reported to contribute to induction of diabetes mellitus (DM).

Objectives: To assess the risk-benefit of statin therapy, prescribed for the prevention of CVD, in the development of DM.

Methods: In a population-based real-life study, the incidence of DM and CVD were assessed retrospectively among 265,414 subjects aged 40–70 years, 17.9% of whom were treated with statins. Outcomes were evaluated according to retrospectively determined baseline 10 year cardiovascular (CV) mortality risks as defined by the European Systematic COronary Risk Evaluation, statin dose-intensity regimen, and level of drug adherence.

Results: From 2010 to 2014, 5157 (1.9%) new cases of CVD and 11,637 (4.4%) of DM were observed. Low-intensity statin therapy with over 50% adherence was associated with increased DM incidence in patients at low or intermediate baseline CV risk, but not in patients at high CV risk. In patients at low CV risk, no CV protective benefit was obtained. The number needed to harm (NNH; incident DM) for low-intensity dose regimens with above 50% adherence was 40. In patients at intermediate and high CV risk, the number needed to treat was 125 and 29; NNH was 50 and 200, respectively.

Conclusions: Prescribing low-dose statins for primary prevention of CVD is beneficial in patients at high risk and may be detrimental in patients at low CV risk. In patients with intermediate CV risk, our data support current recommendations of individualizing treatment decisions.
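The NNT and NNH figures quoted in the results are reciprocals of the absolute risk differences between treated and untreated groups. A sketch of the arithmetic, using hypothetical event rates (the abstract does not report the underlying rates behind each figure):

```python
# Number needed to treat (NNT) and number needed to harm (NNH).
# Event rates below are hypothetical, for illustration only.
def nnt(control_event_rate, treated_event_rate):
    """Reciprocal of the absolute risk reduction (benefit of treatment)."""
    return 1.0 / (control_event_rate - treated_event_rate)

def nnh(treated_harm_rate, control_harm_rate):
    """Reciprocal of the absolute risk increase (harm of treatment)."""
    return 1.0 / (treated_harm_rate - control_harm_rate)

# e.g., if CVD events fall from 10% to 6% with treatment -> NNT = 25
print(round(nnt(0.10, 0.06)))   # 25
# e.g., if incident DM rises from 4% to 6.5% -> NNH = 40
print(round(nnh(0.065, 0.04)))  # 40
```

An NNH of 40 for low-intensity regimens, as reported above, thus corresponds to one additional case of incident DM per 40 patients treated.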

Anan Younis MD, Dov Freimark MD, Robert Klempfner MD, Yael Peled MD, Yafim Brodov MD, Ilan Goldenberg MD and Michael Arad MD

Background: Cardiac damage caused by oncological therapy may manifest early or many years after the exposure.

Objectives: To determine the differences between sub-acute and late-onset cardiotoxicity in terms of left ventricular ejection fraction (LVEF) recovery and long-term prognosis.

Methods: We studied 91 patients diagnosed with impaired systolic function and previous exposure to oncological therapy. The study population was divided according to sub-acute (from 2 weeks to ≤ 1 year) and late-onset (> 1 year) presentation of cardiotoxicity. Recovery of LVEF to at least 50% was defined as the primary end point, and total mortality was the secondary end point.

Results: Fifty-three (58%) patients were classified as sub-acute, while 38 (42%) were defined as late-onset cardiotoxicity. Baseline clinical characteristics were similar in the two groups. The mean LVEF at presentation was significantly lower among patients in the late-onset vs. sub-acute group (28% vs. 37%, respectively, P < 0.001). Independent predictors of LVEF recovery were trastuzumab therapy and a higher baseline LVEF. Although long-term mortality rates were similar in the groups with sub-acute and late-onset cardiotoxicity, improvement of LVEF was independently associated with reduced mortality.

Conclusions: Our findings suggest that early detection and treatment of oncological cardiotoxicity play an important role in LVEF recovery and long-term prognosis.

Yael Shachor-Meyouhas MD, Orna Eluk RN, Yuval Geffen PhD, Irena Ulanovsky MD, Tatiana Smolkin MD, Shraga Blazer MD, Iris Stein RN and Imad Kassis MD

Background: Methicillin-resistant Staphylococcus aureus (MRSA) has emerged as a challenging nosocomial pathogen in the last 50 years.

Objectives: To describe an investigation and containment of an MRSA outbreak in a neonatal intensive care unit (NICU).

Methods: Our NICU is a 25-bed level III unit. Approximately 540 neonates are admitted yearly. The index case was an 8 day old term baby. MRSA was isolated from his conjunctiva. Immediate infection control measures were instituted, including separation of MRSA+ carriers, strict isolation, separate nursing teams, and screening of all infants for MRSA. Healthcare workers and parents of positive cases were screened and re-educated in infection control measures. New admissions were accepted to a clean room and visiting was restricted. MRSA isolates were collected for molecular testing.

Results: MRSA was isolated from five infants by nasal and rectal swabs, including the index case. Screening of healthcare workers and families was negative. Two MRSA+ patients already known in the pediatric intensive care unit (PICU) located near the NICU were suspected of being the source. All NICU isolates were identical by pulsed-field gel electrophoresis but were different from the two PICU isolates. The NICU and one of the PICU isolates were defined as ST-5 strain by multilocus sequence typing. One PICU isolate was ST-627. All NICU isolates were Panton–Valentine leukocidin negative and SCCmec type IV. No further cases were detected, and no active infections occurred.

Conclusions: A strict infection control policy and active screening are essential in aborting outbreaks of MRSA in the NICU.

Amihai Rottenstreich MD, Adi Schwartz, Yosef Kalish MD, Ela Shai PhD, Liat Appelbaum MD, Tali Bdolah-Abram and Itamar Sagiv MD

Background: Risk factors for bleeding complications after percutaneous kidney biopsy (PKB) and the role of primary hemostasis screening are not well established.

Objectives: To determine the role of primary hemostasis screening and complication outcomes among individuals who underwent PKB.

Methods: We reviewed data of 456 patients who underwent PKB from 2010 to 2016 in a large university hospital. In 2015, bleeding time (BT) testing was replaced by light transmission aggregometry (LTA) as a pre-PKB screening test.

Results: Of the 370 patients who underwent pre-PKB hemostasis screening by BT testing, prolonged BT was observed in 42 (11.3%). Of the 86 who underwent LTA, an abnormal response was observed in 14 (16.3%). Overall, 155 (34.0%) patients experienced bleeding: 145 (31.8%) had minor events (hemoglobin fall of 1–2 g/dl, macroscopic hematuria, perinephric hematoma without the need for transfusion or intervention) and 17 (3.7%) had major events (hemoglobin fall > 2 g/dl, blood transfusion or further intervention). Abnormal LTA response did not correlate with bleeding (P = 0.80). In multivariate analysis, only prolonged BT (P = 0.0001) and larger needle size (P = 0.005) were identified as independent predictors of bleeding.

Conclusions: Bleeding complications following PKB were common and mostly minor, and the risk of major bleeding was low. Larger needle size and prolonged BT were associated with a higher bleeding risk. Due to the relatively low risk of major bleeding and lack of benefit of prophylactic intervention, the use of pre-PKB hemostasis screening remains unestablished.

Ohad Gluck MD, Liliya Tamayev MD, Maya Torem MD, Jacob Bar MD, Arieh Raziel MD and Ron Sagiv MD

Background: Laparoscopic salpingectomy is strongly related to successful in vitro fertilization (IVF) treatments.

Objectives: To compare the ovarian reserve, including anti-Müllerian hormone (AMH) levels, in patients who underwent salpingectomy before IVF to IVF patients who had not undergone salpingectomy.

Methods: In this retrospective study, medical records of women who were treated by the IVF unit at our institute were reviewed. We retrieved demographic data, surgical details, and data regarding the ovarian reserve. Details of 35 patients who were treated by IVF after salpingectomy were compared to 70 IVF patients with no history of salpingectomy treatment. Nine women underwent IVF treatment before and after having salpingectomy, and their details were included in both groups.

Results: The levels of AMH, follicular stimulating hormone (FSH), estradiol, and progesterone were not significantly different in the groups. The antral follicular count (AFC), number of oocytes retrieved, amount of gonadotropin administered for ovarian stimulation, and number of embryos transferred (ET) were also not significantly different.

Conclusions: Salpingectomy does not seem to affect ovarian reserve in IVF patients.

Amichai Perlman MD, Samuel N Heyman MD, Joshua Stokar MD, David Darmon MD, Mordechai Muszkat MD and Auryan Szalat MD

Background: Sodium-glucose cotransporter 2 inhibitors (SGLT2i) (such as canagliflozin, empagliflozin, and dapagliflozin) are widely used to treat patients with type 2 diabetes mellitus (T2DM) to improve glycemic, cardiovascular and renal outcomes. However, based on post-marketing data, a warning label was added regarding possible occurrence of acute kidney injury (AKI).

Objectives: To describe the clinical presentation of T2DM patients treated with SGLT2i who were evaluated for AKI at our institution and to discuss the potential pathophysiologic mechanisms.

Methods: A retrospective study of a computerized database was conducted of patients with T2DM who were hospitalized or evaluated for AKI while receiving SGLT2i, including descriptions of clinical and laboratory characteristics, at our institution.

Results: We identified seven patients in whom AKI occurred 7–365 days after initiation of SGLT2i. In all cases, renin-angiotensin-aldosterone system blockers had also been prescribed. In five patients, another concomitant nephrotoxic agent (injection of contrast-product, use of nonsteroidal anti-inflammatory drugs or cox-2 inhibitors) or occurrence of an acute medical event potentially associated with AKI (diarrhea, sepsis) was identified. In two patients, only the initiation of SGLT2i was evident. The mechanisms by which AKI occurs under SGLT2i are discussed with regard to the associated potential triggers: altered trans-glomerular filtration or, alternatively, kidney medullary hypoxia.

Conclusions: SGLT2i are usually safe and provide multiple benefits for patients with T2DM. However, during particular medical circumstances, and in association with usual co-medications, particularly if baseline glomerular filtration rate is decreased, patients treated with SGLT2i may be at risk of AKI, thus warranting caution when prescribed.

Jurgen Sota MD, Antonio Vitale MD, Donato Rigante MD PhD, Ida Orlando MD, Orso Maria Lucherini PhD, Antonella Simpatico MD, Giuseppe Lopalco MD, Rossella Franceschini MD PhD, Mauro Galeazzi MD PhD, Bruno Frediani MD PhD, Claudia Fabiani MD PhD, Gian Marco Tosi MD PhD and Luca Cantarini MD PhD

Background: Behçet’s disease (BD) is an inflammatory disorder potentially leading to life- and sight-threatening complications. No laboratory marker correlating with disease activity or predicting the occurrence of disease manifestations is currently available.

Objectives: To determine an association between serum amyloid-A (SAA) levels and disease activity via the BD Current Activity Form (BDCAF), to evaluate disease activity in relation to different SAA thresholds, to examine the association between single organ involvement and the overall major organ involvement with different SAA thresholds, and to assess the influence of biologic therapy on SAA levels.

Methods: We collected 95 serum samples from 64 BD patients. Related demographic, clinical, and therapeutic data were retrospectively gathered.

Results: No association was identified between SAA levels and BD disease activity (Spearman's rho = 0.085, P = 0.411). A significant difference was found in the mean BDCAF score between patients presenting with SAA levels < 200 mg/L and those with SAA levels > 200 mg/L (P = 0.027). SAA levels > 200 mg/L were associated with major organ involvement (P = 0.008). A significant association was found between SAA levels > 150 mg/dl and ocular (P = 0.008), skin (P = 0.002), and mucosal (P = 0.012) manifestations. Patients undergoing biologic therapies displayed more frequently SAA levels < 200 mg/L vs. patients who were not undergoing biologic therapies (P = 0.012).

Conclusions: Although SAA level does not represent a biomarker for disease activity, it might be a predictor of major organ involvement and ocular disease relapse at certain thresholds in patients with BD.

Gilad Allon MD, Nir Seider MD, Itzchak Beiran MD and Eytan Z. Blumenthal MD
July 2018
Tima Davidson, Michal M. Ravid, Ella Nissan, Mirriam Sklair-Levy, Johnatan Nissan and Bar Chikman

Background: When a breast lesion is suspected based on a physical exam, mammography, or ultrasound, a stereotactic core needle biopsy (CNB) is usually performed to help establish a definitive diagnosis. CNBs are far less invasive than excisional biopsies, with no need for general anesthetics or hospitalization, and no recovery period. However, since only samples of the mass are removed in a CNB and not the whole mass, sampling errors can occur.

Objectives: To compare the degree of agreement between the pathological data from CNBs and excisional biopsies from a single tertiary referral hospital.

Methods: The concordance of pathological data was compared in patients who underwent CNBs and had their surgical procedures at the same medical center.

Results: Of the 894 patients who underwent CNBs, 254 (28.4%) underwent subsequent excisional biopsies at our medical center. Of the total 894 patients, 227 (25.3%) were diagnosed with a malignancy by CNB; the remaining CNBs were diagnosed as benign pathologies. The pathological findings in the CNBs and in the excisional biopsies concurred in 232/254 (91.3%) of the cases.

Conclusions: A CNB to confirm mammographic or clinical findings of breast lesions is an accurate method to establish a pathological diagnosis of breast lesions. The accuracy is higher for invasive carcinomas than for non-invasive cancers. Excisional biopsies are necessary for lesions with anticipated sampling errors or when the core needle biopsy findings are discordant with clinical or mammographic findings.

Yael Einbinder MD, Timna Agur MD, Kirill Davidov, Tali Zitman-Gal PhD, Eliezer Golan MD and Sydney Benchetrit MD

Background: Anemia management strategies among chronic hemodialysis patients with high ferritin levels remain challenging for nephrologists.

Objectives: To compare anemia management in stable hemodialysis patients with high (≥ 500 ng/ml) vs. low (< 500 ng/ml) ferritin levels.

Methods: In a single center, record review, cohort study of stable hemodialysis patients who were followed for 24 months, an anemia management policy was amended to discontinue intravenous (IV) iron therapy for stable hemodialysis patients with hemoglobin > 10 g/dl and ferritin ≥ 500 ng/ml. Erythropoiesis-stimulating-agents (ESA), IV iron doses, and laboratory parameters were compared among patients with high vs. low baseline ferritin levels before and after IV iron cessation.

Results: Among 87 patients, 73.6% had baseline ferritin ≥ 500 ng/ml. Weekly ESA dose was greater among patients with high vs. low ferritin (6788.8 ± 4727.8 IU/week vs. 3305.0 ± 2953.9 IU/week, P = 0.001); whereas, cumulative and monthly IV iron doses were significantly lower (1628.2 ± 1491.1 mg vs. 2557.4 ± 1398.9 mg, P = 0.011, and 82.9 ± 85 vs. 140.7 ± 63.9 mg, P = 0.004). Among patients with high ferritin, IV iron was discontinued for more than 3 months in 41 patients (64%) and completely avoided in 6 (9.5%). ESA dose and hemoglobin levels did not change significantly during this period.

Conclusions: Iron cessation in chronic hemodialysis patients with high ferritin levels did not affect hemoglobin level or ESA dose and can be considered as a safe policy for attenuating the risk of chronic iron overload.

© All rights to information on this site are reserved and are the property of the Israeli Medical Association.