Search results


September 2014
August 2014
Reuben Baumal MD, Jochanan Benbassat MD and Julie A.D. Van
"Clinician-scientists" is an all-inclusive term for board-certified specialists who engage in patient care and laboratory-based (biomedical) research, patient-based (clinical) research, or population-based (epidemiological) research. In recent years, the number of medical graduates who choose to combine patient care and research has declined, generating concerns about the future of medical research. This paper reviews: a) the various current categories of clinician-scientists, b) the reasons proposed for the declining number of medical graduates who opt for a career as clinician-scientists, c) the various interventions aimed at reversing this trend, and d) the projections for the future role of clinician-scientists. Efforts to encourage students to combine patient care and research include providing financial and institutional support, and reducing the duration of the training of clinician-scientists. However, recent advances in clinical and biomedical knowledge have increased the difficulties in maintaining the dual role of care-providers and scientists. It was therefore suggested that rather than expecting clinician-scientists to compete with full-time clinicians in providing patient care, and with full-time investigators in performing research, clinician-scientists will increasingly assume the role of leading/coordinating interdisciplinary teams. Such teams would focus either on patient-based research or on the clinical, biomedical and epidemiological aspects of specific clinical disorders, such as hypertension and diabetes.
Noa Berar Yanay MD MHA, Lubov Scherbakov MD, David Sachs MD, Nana Peleg MD, Yakov Slovodkin MD and Regina Gershkovich MD

Background: Late nephrology referral, before initiation of dialysis treatment, is associated with adverse outcome.

Objectives: To investigate the implications of late nephrology referral on mortality among dialysis patients in Israel.

Methods: We retrospectively analyzed 200 incident dialysis patients. Patients were defined as late referrals if they started dialysis less than 3 months after their first nephrology consultation. Survival rates and risk factors for mortality were analyzed.

Results: The early referral (ER) group comprised 118 patients (59%) and the late referral (LR) group 82 patients (41%). The mortality rate was 44.5% (n=53) in the ER group and 68% (n=56) in the LR group. The 4-year survival rate was 41.1% in the ER group and 18.7% in the LR group (P < 0.0001). Mortality increased with late nephrology referral (HR 1.873, 95%CI 1.133–3.094), with age (HR 1.043 for each year, 95%CI 1.018–1.068) and with diabetes (HR 2.399, 95%CI 1.369–4.202), and decreased with higher serum albumin (HR 0.359 for each 1 g/dl increase, 95%CI 0.242–0.533). The median survival time was higher in the ER group among women, patients younger than 70 and diabetic patients, and a trend toward longer survival was found in non-diabetic patients. Survival time was not increased in early-referred patients older than 70 or in male patients.

Conclusions: Late nephrology referral is associated with an overall higher mortality rate in dialysis patients. The survival advantage of early referral may have a different significance in specific subgroups. The timing of nephrology referral should be considered as a modifiable risk factor for mortality in patients with end-stage renal disease. 

Daniel Elbirt MD*, Ilan Asher MD*, Keren Mahlab-Guri MD, Shira Bezalel-Rosenberg MD, Victor Edelstein MD and Zev Sthoeger MD

Background: Systemic lupus erythematosus (SLE) is an autoimmune disease characterized by disturbance of the innate and adaptive immune systems with the production of autoantibodies by stimulated B lymphocytes. The BLyS protein (B lymphocyte stimulator) is secreted mainly by monocytes and activated T cells and is responsible for the proliferation, maturation and survival of B cells.

Objectives: To study serum BLyS levels and their clinical significance in Israeli lupus patients over time.

Methods: The study population included 41 lupus patients (8 males, 33 females; mean age 35.56 ± 15.35 years) and 50 healthy controls. The patients were followed for 5.02 ± 1.95 years. We tested 221 lupus sera (mean 5.4 samples/patient) and 50 normal sera for BLyS levels by a capture ELISA. Disease activity was determined by the SLEDAI score.

Results: Serum BLyS levels were significantly higher in SLE patients than in controls (3.37 ± 3.73 vs. 0.32 ± 0.96 ng/ml, P < 0.05). BLyS levels were high in at least one serum sample in 80.5% of the patients but were normal in all samples from the control group. There was no correlation between serum BLyS and anti-dsDNA autoantibody levels. BLyS levels fluctuated over time in the sera of lupus patients, with no significant correlation to disease activity.

Conclusions: Most of our lupus patients had high serum BLyS levels, suggesting a role for BLyS in the pathogenesis and course of SLE. Our results support the current novel approach of targeting BLyS (neutralization by antibodies or soluble receptors) in the treatment of patients with active lupus.

Matti Eskelinen MD PhD, Tuomas Selander MSc, Pertti Lipponen MD PhD and Petri Juvonen MD PhD

Background: The primary diagnosis of functional dyspepsia (FD) is made on the basis of typical symptoms and by excluding organic gastrointestinal diseases that cause dyspeptic symptoms. However, reaching a diagnosis of FD remains difficult.

Objectives: To assess the efficiency of the Usefulness Index (UI) test and history-taking in diagnosing FD.

Methods: A study on acute abdominal pain conducted by the World Organization of Gastroenterology Research Committee (OMGE) included 1333 patients presenting with acute abdominal pain. The clinical history-taking variables (n=23) for each patient were recorded in detail using a predefined structured data collection sheet, and the collected data were compared with the final diagnoses.

Results: The most significant clinical history-taking variables for FD in univariate analysis (risk ratio, RR) were: location of pain at diagnosis (RR = 5.7), location of initial pain (RR = 6.5), previous similar pain (RR = 4.0), duration of pain (RR = 2.9), previous abdominal surgery (RR = 4.1), previous abdominal diseases (RR = 4.0) and previous indigestion (RR = 3.1). The sensitivity of the physicians' initial decision in detecting FD was 0.44, specificity 0.99 and efficiency 0.98; UI was 0.19 and RR 195.3. In the stepwise multivariate logistic regression analysis, the independent predictors of FD were the physicians' initial decision (RR = 266.4), location of initial pain (RR = 3.4), duration of pain (RR = 3.1), previous abdominal surgery (RR = 3.7), previous indigestion (RR = 2.2) and vomiting (RR = 2.0).
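For readers less familiar with these diagnostic indices, the minimal sketch below (not taken from the paper) shows how sensitivity, specificity and efficiency (overall accuracy) are derived from a 2×2 table of the initial test decision against the final diagnosis. The counts and function name are hypothetical, and the specific Usefulness Index formula used in the OMGE studies is not reproduced here.

```python
# Minimal sketch: sensitivity, specificity and efficiency (overall accuracy)
# from a 2x2 table of test decisions vs. final diagnoses.
# The counts below are hypothetical placeholders, not the OMGE data.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Compute basic diagnostic indices from a 2x2 contingency table."""
    sensitivity = tp / (tp + fn)                   # detected among diseased
    specificity = tn / (tn + fp)                   # correctly excluded among non-diseased
    efficiency = (tp + tn) / (tp + fp + fn + tn)   # overall proportion correct
    return {"sensitivity": sensitivity,
            "specificity": specificity,
            "efficiency": efficiency}

# Hypothetical example (illustration only)
print(diagnostic_metrics(tp=12, fp=10, fn=15, tn=1296))
```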

Conclusions: Patients with initial upper abdominal pain and a previous history of abdominal surgery and indigestion tended to be at risk for FD. In these patients the UI test could help the clinician differentiate FD from other causes of acute abdominal pain.

Moshe D. Fejgin MD, Tal Y. Shvit MD, Yael Gershtansky MSc and Tal Biron-Shental MD

Background: Removal of retained placental tissue postpartum, or of retained products of conception (RPOC) after abortion, is performed by uterine curettage or hysteroscopy. Trauma to the endometrium from surgical procedures, primarily curettage, can cause intrauterine adhesions (Asherman's syndrome) and subsequent infertility. The incidence of malpractice claims relating to intrauterine adhesions is rising, justifying reevaluation of the optimal way of handling these complications.

Objectives: To review malpractice claims regarding intrauterine adhesions, and to explore the clinical approach that might reduce those claims or improve their medical and legal outcomes.

Methods: We examined 42 Asherman's syndrome claims handled by MCI, the largest professional liability insurer in Israel. The clinical chart of each case was reviewed and analyzed according to the event preceding adhesion formation, the timing and mode of diagnosis, and the outcome. We also assessed whether the adverse outcome was caused by substandard care and whether it could have been avoided by different clinical practice. The legal outcome was also evaluated.

Results: Forty-seven percent of the cases occurred following vaginal delivery, 19% followed cesarean section, 28% were RPOC following a first-trimester pregnancy termination, and 2% followed a second-trimester pregnancy termination.

Conclusions: It is apparent that, in the absence of an accepted management protocol for cases of RPOC, it is difficult to defend these cases legally when the complication of Asherman's syndrome develops.

July 2014
Michael Arad MD, Tamar Nussbaum MD, Ido Blechman BA, Micha S. Feinberg MD, Nira Koren-Morag PhD, Yael Peled MD and Dov Freimark MD

Background: Contemporary therapies improve prognosis and may restore left ventricular (LV) size and function.

Objectives: To examine the prevalence, clinical features and therapies associated with reverse remodeling (RR) in dilated cardiomyopathy (DCM).

Methods: The study group comprised 188 DCM patients who had undergone two echo examinations at least 6 months apart. RR was defined as an increase in LV ejection fraction (LVEF) of ≥ 10% concomitant with a ≥ 10% decrease in LV end-diastolic dimension.
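As an illustration of this criterion, here is a minimal sketch; the variable names are invented, and the ≥ 10% LVEF increase is interpreted as absolute percentage points, which the abstract does not specify.

```python
# Minimal sketch of the stated RR criterion: LVEF increase >= 10
# (assumed: absolute percentage points) together with a >= 10% relative
# decrease in LV end-diastolic dimension (LVEDD) between two echo exams.
# Names and example values are hypothetical, not from the study data.

def has_reverse_remodeling(lvef_before: float, lvef_after: float,
                           lvedd_before: float, lvedd_after: float) -> bool:
    """Return True if both parts of the RR definition are met."""
    lvef_gain = lvef_after - lvef_before                       # percentage points
    lvedd_drop = (lvedd_before - lvedd_after) / lvedd_before   # relative change
    return lvef_gain >= 10 and lvedd_drop >= 0.10

# Hypothetical example: LVEF 25% -> 38%, LVEDD 68 mm -> 59 mm
print(has_reverse_remodeling(25, 38, 68, 59))  # True
```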

Results: RR occurred in 50 patients (26%) and was associated with significantly reduced end-systolic dimension, left atrial size, grade of mitral regurgitation, and pulmonary artery pressure. NYHA class improved in the RR group. RR was less common in familial DCM and in long-standing disease, and was more prevalent in patients with prior exposure to chemotherapy. Recent-onset disease, lower initial LVEF and a normal electrocardiogram were identified as independent predictors of RR. Beta-blocker dose was related to improved LVEF but not to RR. Over a mean follow-up of 23 months, 16 patients (12%) from the 'no-RR' group died or underwent heart transplantation compared to none from the RR group (P < 0.01).

Conclusions: Contemporary therapies led to an improvement in the condition of a considerable number of DCM patients. A period of close observation while optimizing medical therapy should be considered before deciding on invasive procedures.

Arie Soroksky MD, Sergey Nagornov MD, Eliezer Klinowski MD, Yuval Leonov MD, Eduard Ilgiyaev MD, Orit Yossepowitch MD and Galina Goltsman MD

Background: The role of routine active surveillance cultures (ASCs) in predicting subsequent bloodstream infections is unclear.

Objectives: To determine prospectively whether routine screening ASCs obtained on admission to the intensive care unit (ICU) can predict the causative agent of subsequent bloodstream infections.

Methods: We prospectively studied a cohort of 100 mechanically ventilated patients admitted consecutively to a 16-bed ICU. On admission, ASCs were obtained from four sites: skin cultures (swabs) from the axillary region, rectal swabs, nasal swabs, and deep tracheal aspirates. Thereafter, cultures were obtained from all four sites daily for the next 5 days of the ICU stay.

Results: Of the 100 recruited patients, 31 (31%) had culture-proven bacteremia; the median time to development of bacteremia was 5 days (range 1–18). Patients with bacteremia had a longer median ICU stay than patients without bacteremia: 14 days (range 2–45) vs. 5 days (range 1–41) (P < 0.001). ICU and 28-day mortality were similar in patients with and without bacteremia. Most ASCs grew multiple organisms; however, there was no association between pathogens growing on ASCs and the eventual development of bacteremia.

Conclusions: ASCs obtained on ICU admission did not identify the causative agents of most subsequent bacteremia events. Therefore, bloodstream infections could not be related to ASCs.

Boaz Amichai MD, Marcelo H. Grunwald MD, Batya Davidovici MD and Avner Shemer MD

Background: Tinea pedis is a common chronic skin disease; the role of contaminated clothing as a possible source of infection or re-infection is not fully understood. The ability of ultraviolet (UV) light to inactivate microorganisms has long been known, and UV is used in many applications.

Objectives: To evaluate the effectiveness of sun exposure in reducing fungal contamination in used clothes.

Methods: Fifty-two socks from patients with tinea pedis, with contamination proven by fungal culture, were studied. The samples were divided into two groups: group A underwent sun exposure for 3 consecutive days, while group B remained indoors. At the end of each day, fungal cultures of the samples were performed.

Results: Overall, there was an increase in the percentage of negative cultures with time. The change was significantly higher in socks that were left in the sun (chi-square for linear trend = 37.449, P < 0.0001).


Conclusions: Sun exposure of contaminated clothes was effective in lowering the contamination rate. This finding supports current trends toward energy saving and environmental protection, which favor low-temperature laundry.

Natalya Bilenko MD PhD MPH, Drora Fraser PhD, Hillel Vardy BA and Ilana Belmaker MD MPH

Background: A high prevalence of iron deficiency anemia persists in the Bedouin Arab and Jewish pediatric populations of southern Israel.

Objectives: To compare the effect on iron deficiency at 12 months of age of daily use of multiple micronutrient supplementation (MMS) with "Sprinkles", a powdered formulation of iron, vitamins A and C, folic acid and zinc, versus liquid iron and vitamins A and D.

Methods: The 621 eligible Bedouin and Jewish infants in the study were assigned to the MMS and control arms and received their supplementation from age 6 to 12 months. We examined the change in hemoglobin, hematocrit, mean cell volume, red blood cell distribution width, serum ferritin and transferrin saturation. In addition, an infant was classified as having a high Iron Deficiency Index (IDI) if two or more of the above six parameters showed abnormal levels.

Results: Rates of anemia decreased significantly over the 6 month period, from 58.8% to 40.6% among Bedouin infants (P = 0.037) and from 40.6% to 15.8% among Jewish infants (P = 0.017). In Bedouin infants the prevalence of a high IDI decreased significantly from 79.2% to 67.4% (P = 0.010) in the MMS group, but there was no change in the controls. Among Jewish infants, the prevalence of a high IDI decreased from 67% to 55.6%, with no statistically significant difference between the two study arms. In the multivariate analysis of Bedouin infants, MMS use was associated with a 67% reduction in the risk of a high IDI at age 12 months compared to controls (P = 0.001). Fewer side effects were reported in the intervention groups in both ethnic populations.

Conclusions: MMS fortification of home food can be recommended as an effective and safe method for preventing iron deficiency anemia at 12 months of age. 