Search results


July 2016
Marina Leitman MD, Eli Peleg MD, Ruthie Shmueli and Zvi Vered MD FACC FESC

Background: The search for vegetations in patients with suspected infective endocarditis is a major indication for trans-esophageal echocardiographic (TEE) examination. Advances in harmonic imaging and the ongoing improvement of modern echocardiographic systems allow diagnostic images of adequate quality in most patients.

Objectives: To investigate whether TEE examinations are always necessary for the assessment of patients with suspected infective endocarditis. 

Methods: During 2012–2014, 230 trans-thoracic echo (TTE) exams were performed at our center in patients with suspected infective endocarditis. Demographic, epidemiological, clinical and echocardiographic data were collected and analyzed, and the final clinical diagnosis and outcome were determined.

Results: Of 230 patients, 24 had definite infective endocarditis by clinical assessment. TEE examination was undertaken in 76 of the 230 patients based on the clinical decision of the attending physician. All TTE exams were classified as: (i) positive, i.e., vegetations present; (ii) clearly negative; or (iii) inconclusive. Of the 92 patients with clearly negative TTE exams, 20 underwent TEE and all were negative. All clearly negative patients had native valves and adequate quality images, and in all 92 the final diagnosis was not infective endocarditis. Thus, the negative predictive value of a clearly negative TTE examination was 100%.

Conclusions: In patients with native cardiac valves referred for evaluation for infective endocarditis, an adequate quality TTE with clearly negative examination may be sufficient for the diagnosis.
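The 100% negative predictive value reported above follows directly from the counts stated in the abstract; a minimal arithmetic sketch (NPV = TN / (TN + FN), with 92 clearly negative exams and no missed diagnoses):

```python
# Negative predictive value (NPV) from the study's reported counts.
# NPV = TN / (TN + FN)
true_negatives = 92   # clearly negative TTE; final diagnosis not endocarditis
false_negatives = 0   # no clearly negative TTE was later diagnosed as endocarditis

npv = true_negatives / (true_negatives + false_negatives)
print(f"NPV = {npv:.0%}")  # → NPV = 100%
```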

 

Noa Lavi MD, Gali Shapira MD, Ariel Zilberlicht MD, Noam Benyamini MD, Dan Farbstein MD, Eldad J. Dann MD, Rachel Bar-Shalom MD and Irit Avivi MD

Background: Despite the lack of clinical studies supporting the use of routine surveillance FDG-positron emission tomography (PET) in patients with diffuse large B cell lymphoma (DLBCL) who achieved remission, many centers still use this strategy, especially in high risk patients. Surveillance FDG-PET computed tomography (CT) is associated with a high false positive (FP) rate in DLBCL patients. 

Objectives: To investigate whether use of specific CT measurements could improve the positive predictive value (PPV) of surveillance FDG-PET/CT. 

Methods: This retrospective study included DLBCL patients treated with CHOP or R-CHOP who achieved complete remission and had at least one positive surveillance PET. CT-derived features of PET-positive sites, including long and short diameters and presence of calcification and fatty hilum within lymph nodes, were assessed. Relapse was confirmed by biopsy or consecutive imaging. The FP rate and PPV of surveillance PET evaluated with/without CT-derived measurements were compared. 

Results: Seventy surveillance FDG-PET/CT scans performed in 53 patients were interpreted as positive for relapse. Of these studies, 25 (36%) were defined as true-positive (TP) and 45 (64%) as FP. Multivariate analysis found a long or short axis measuring ≥ 1.5 cm and ≥ 1.0 cm, respectively, in PET-positive sites, International Prognostic Index (IPI) ≥ 2, lack of prior rituximab therapy, and FDG uptake in a previously involved site to be independent predictors of a true-positive surveillance PET (odds ratios 5.4, 6.89, 6.6, and 4.9, respectively; P < 0.05 for all).

Conclusions: The PPV of surveillance PET/CT may be improved by restricting its use to selected high risk DLBCL patients and by combined assessment of PET and CT findings.
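The false positive burden described above can be expressed as a positive predictive value; a minimal arithmetic sketch from the counts stated in the abstract (25 true-positive of 70 positive scans):

```python
# Positive predictive value (PPV) from the study's reported counts.
# PPV = TP / (TP + FP)
true_positives = 25   # relapse confirmed by biopsy or consecutive imaging
false_positives = 45  # positive surveillance PET without confirmed relapse

ppv = true_positives / (true_positives + false_positives)
print(f"PPV = {ppv:.0%}")  # → PPV = 36%
```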

 

Orit Erman MD, Arie Erman PhD, Alina Vodonos MPH, Uzi Gafter MD PhD and David J. van Dijk MD

Background: Proteinuria and albuminuria are markers of kidney injury and function, serving as a screening test as well as a means of assessing the degree of kidney injury and risk for cardiovascular disease and death in both the diabetic and the non-diabetic general population.

Objectives: To evaluate the association between proteinuria below 300 mg/24 hours and albuminuria, as well as a possible association with kidney function in patients with diabetes mellitus (DM).

Methods: The medical files of patients with type 1 and type 2 DM with proteinuria below 300 mg/24 hours at three different visits to the Diabetic Nephropathy Clinic were screened. This involved 245 patient files and 723 visits. The data collected included demographics; protein, albumin and creatinine levels in urine collections; blood biochemistry; and clinical and treatment data. 

Results: The association between proteinuria and albuminuria was non-linear. However, proteinuria in the range of 162–300 mg/24 hr was linearly and significantly correlated with albuminuria (r = 0.58, P < 0.001). The proteinuria cutoff corresponding to an albuminuria cutoff of 30 mg/24 hr was 160.5 mg/24 hr. Body mass index (BMI) was the sole independent predictor of proteinuria above 160.5 mg/24 hr. Changes in albuminuria, but not proteinuria, were associated with changes in creatinine clearance.

Conclusions: A new cutoff value of 160.5 mg/24 hr was set empirically, for the first time, for abnormal proteinuria in diabetic patients. It appears that proteinuria below 300 mg/24 hr is not sufficient as a sole prognostic factor for kidney failure.

 

Nour E. Yaghmour MD PhD, Zvi Israel MD, Hagai Bergman MD PhD, Renana Eitan MD and David Arkadir MD PhD
Yishay Wasserstrum MD, Pia Raanani MD, Ran Kornowski MD and Zaza Iakobishvili MD PhD
David Yardeni MD, Ori Galante MD, Lior Fuchs MD, Daniela Munteanu MD, Wilmosh Mermershtain MD, Ruthy Shaco-Levy MD and Yaniv Almog MD
Javier Marco-Hernández MD PhD, Sergio Prieto-González MD, Miquel Blasco MD, Pedro Castro MD PhD, Joan Cid MD PhD and Gerard Espinosa MD PhD
Hussein Sliman MD, Keren Zissman MD, Jacob Goldstein MD, Moshe Y. Flugelman MD and Yaron Hellman MD
June 2016
Doron Goldberg MD MHA, Avi Tsafrir MD, Naama Srebnik MD, Michael Gal MD PhD, Ehud J. Margalioth MD, Pnina Mor CNM PHD, Rivka Farkash MPH, Arnon Samueloff MD and Talia Eldar-Geva MD PhD

Background: Fertility treatments are responsible for the rise in high order pregnancies, and their associated complications, in recent decades. Reducing the number of embryos returned to the uterus will reduce the rate of high order pregnancies.

Objectives: To explore whether obstetric history and parity have a role in the clinician’s decision making regarding the number of embryos transferred to the uterus during in vitro fertilization (IVF).

Methods: In a retrospective study for the period August 2005 to March 2012, data were collected from twin deliveries > 24 weeks, including parity, mode of conception (IVF vs. spontaneous), gestational age at delivery, preeclampsia, birth weight, admission to the neonatal intensive care unit (NICU), and Apgar scores. 

Results: A total of 1651 twin deliveries > 24 weeks were recorded, of which 959 (58%) were at term (> 37 weeks). The early preterm delivery (PTD) rate (< 32 weeks) was significantly lower with increased parity (12.6%, 8.5%, and 5.6%, in women with 0, 1, and ≥ 2 previous term deliveries, respectively). Risks for PTD (< 37 weeks), preeclampsia and NICU admission were significantly higher in primiparous women compared to those who had one or more previous term deliveries. Primiparity and preeclampsia, but not IVF, were significant risk factors for PTD. 

Conclusions: The risk for PTD in twin pregnancies is significantly lower in women who had a previous term delivery and decreases further after two or more previous term deliveries. This finding should be considered when deciding on the number of embryos to be transferred in IVF.  

 

Gustavo Goldenberg MD, Tamir Bental MD, Udi Kadmon MD, Ronit Zabarsky MD, Jairo Kusnick MD, Alon Barsheshet MD, Gregory Golovchiner MD and Boris Strasberg MD

Background: Syncope is a common clinical presentation whose causes span from benign to life-threatening diseases. There is sparse information on the outcomes of patients with syncope who received an implantable cardiac defibrillator (ICD) for primary prevention of sudden cardiac death (SCD).

Objectives: To assess the outcomes and prognosis of patients who underwent ICD implantation for primary prevention of SCD, comparing patients who presented with prior syncope to those without.

Methods: We compared the medical records of 75 patients who underwent ICD implantation for primary prevention of SCD and history of syncope to those of a similar group of 80 patients without prior syncope. We assessed the episodes of ventricular tachycardia (VT), ventricular fibrillation (VF), shock, anti-tachycardia pacing (ATP) and mortality in each group during follow-up.

Results: Mean follow-up was 893 days (95% CI 810–976), with no difference between groups. There was no significant difference in gender or age. Patients with prior syncope had a higher ejection fraction (35.5 ± 12.6 vs. 31.4 ± 8.76, P = 0.02), experienced more episodes of VT (21.3% vs. 3.8%, P = 0.001) and VF (8% vs. 0%, P = 0.01), and received more electric shocks (18.7% vs. 3.8%, P = 0.004) and ATP (17.3% vs. 6.2%, P = 0.031). There were no differences in inappropriate shocks (6.7% vs. 5%, P = 0.74), cardiovascular mortality (cumulative 5 year estimate 29.9% vs. 32.2%, P = 0.97) or death from any cause (cumulative 5 year estimate 38.1% vs. 48.9%, P = 0.18).

Conclusions: Patients presenting with syncope before ICD implantation seemed to have more episodes of VT/VF and more shocks or ATP. No differences in mortality were observed.

 

Muhammad Mahajnah MD PhD, Rajech Sharkia PhD, Nadeem Shorbaji MSc and Nathanel Zelnik MD

Background: Despite the increased worldwide recognition of attention deficit/hyperactivity disorder (ADHD), there is variability in the diagnostic rates of both ADHD and its co-morbidities. These differences are probably related to the methodology and instruments used for the diagnosis of ADHD and to awareness and cultural interpretation of its existence.

Objectives: To identify consistent differences in the clinical profile of Arab and Jewish children with ADHD in Israel who differ in their cultural, ethnic and socioeconomic background. 

Methods: We analyzed the data of 823 children and adolescents with ADHD (516 Jews and 307 Arabs) and compared the clinical characteristics between these two ethnic groups. All patients were evaluated in two neuropediatric and child development centers in northern Israel: one in Haifa and one in Hadera. Children with autism and intellectual disabilities were excluded. 

Results: The distribution of ADHD subtypes was similar in both populations. However, learning disorders and psychiatric co-morbidities (behavioral difficulties and anxiety) were reported more frequently in the Jewish population. The most commonly reported adverse effects of psychostimulants were mood changes, anorexia, headache, insomnia and rebound effect; these too were reported more frequently in the Jewish population (42.0% vs. 18.0%, P < 0.05).

Conclusions: We assume that these differences are related to cultural and socioeconomic factors. We suggest that the physician take cultural background into consideration when treating patients with ADHD.

 

Einat Hertzberg-Bigelman MSc, Rami Barashi MD, Ran Levy PhD, Lena Cohen MSc, Jeremy Ben-Shoshan MD PhD, Gad Keren MD and Michal Entin-Meer PhD

Background: Chronic kidney disease (CKD) is often accompanied by impairment of cardiac function that may lead to major cardiac events. Erythropoietin (EPO), a kidney-produced protein, was shown to be beneficial to heart function. It was suggested that reduced EPO secretion in CKD may play a role in the initiation of heart damage. 

Objectives: To investigate molecular changes in the EPO/erythropoietin receptor (EPO-R) axis in rat cardiomyocytes using a rat model for CKD.

Methods: We established a rat model for CKD by kidney resection. Cardiac tissue sections were stained with Masson’s trichrome to assess interstitial fibrosis indicating cardiac damage. To evaluate changes in the EPO/EPO-R signaling cascade in the myocardium we measured cardiac EPO and EPO-R as well as the phosphorylation levels of STAT-5, a downstream element in this cascade.

Results: At 11 weeks after resection, the animals presented with severe renal failure, reflected by reduced creatinine clearance, elevated blood urea nitrogen and anemia. Histological analysis revealed enhanced fibrosis in cardiac sections of CKD animals compared to the sham controls. In parallel with these changes, we found that although cardiac EPO levels were similar in both groups, the expression of EPO-R and the activated form of its downstream protein STAT-5 were significantly lower in CKD animals.

Conclusions: CKD results in molecular changes in the EPO/EPO-R axis. These changes may play a role in early cardiac damage observed in the cardiorenal syndrome.

 

Tzippora Shalem MD, Akiva Fradkin MD, Marguerite Dunitz-Scheer MD, Tal Sadeh-Kon DSc RD, Tali Goz-Gulik MD, Yael Fishler MD and Batia Weiss MD

Background: Children dependent on gastrostomy tube feeding and those with extremely selective eating comprise the most challenging groups of early childhood eating disorders. We established, for the first time in Israel, a 3 week intensive weaning and treatment program for these patients based on the "Graz model."

Objectives: To investigate the Graz model for tube weaning and for treating severe selective eating disorders in one center in Israel. 

Methods: Pre-program assessment of patients’ suitability to participate was performed 3 months prior to the study, and a treatment goal was set for each patient. The program included a multidisciplinary outpatient or inpatient 3 week treatment course. The major outcome measures were achievement of the target goal of complete or partial tube weaning for those with tube dependency, and expansion of the child's nutritional diversity for those with selective eating. 

Results: Thirty-four children, 28 with tube dependency and 6 with selective eating, participated in four programs conducted over 24 months. Their mean age was 4.3 ± 0.37 years. Of all patients, 29 (85%) achieved the target goal (24 who were tube-dependent and 5 selective eaters). One patient was excluded due to aspiration pneumonia. After 6 months follow-up, 24 of 26 available patients (92%) maintained their target or improved. 

Conclusions: This intensive 3 week program was highly effective in weaning children with gastrostomy tube dependency and ameliorating severe selective eating. Preliminary evaluation of the family is necessary for completion of the program and achieving the child’s personal goal, as are an experienced multidisciplinary team and the appropriate hospital setup, i.e., inpatient or outpatient. 

 

© All rights to information on this site are reserved and are the property of the Israeli Medical Association.