
Search results


February 2003
Y. Turgeman, S. Atar, K. Suleiman, A. Feldman, L. Bloch, N. A. Freedberg, D. Antonelli, M. Jabaren and T. Rosenfeld

Background: Current clinical guidelines restrict catheterization laboratory activity without on-site surgical backup. Recent improvements in technical equipment and pharmacologic adjunctive therapy increase the safety margins of diagnostic and therapeutic cardiac catheterization.

Objective: To analyze the reasons for urgent cardiac surgery and mortality in the different phases of our laboratory’s activity in the last 11 years, and examine the impact of the new interventional and therapeutic modalities on the current need for on-site cardiac surgical backup.

Methods: We retrospectively reviewed the mortality and need for urgent cardiac surgery (up to 12 hours post-catheterization) through five phases of our laboratory’s activity: a) diagnostic (years 1989–2000), b) valvuloplasties and other non-coronary interventions (1990–2000), c) percutaneous-only balloon angioplasty (1992–1994), d) coronary stenting (1994–2000), and e) use of IIb/IIIa antagonists and thienopyridine drugs (1996–2000).

Results: Forty-eight patients (0.45%) required urgent cardiac surgery during phase 1, of whom 40 (83%) had acute coronary syndromes with left main coronary artery stenosis or the equivalent, and 8 (17%) had mechanical complications of acute myocardial infarction. Two patients died (0.02%) during diagnostic procedures. In phase 2, eight patients (2.9%) were referred for urgent cardiac surgery due to either cardiac tamponade or severe mitral regurgitation, and two patients (0.7%) died. The combined rate of urgent surgery and mortality was significantly lower in phases 4 and 5 together than in phase 3 (0.85% vs. 3%, P = 0.006).

Conclusion: In the current era using coronary stents and potent antithrombotic drugs, after gaining experience and crossing the learning curve limits, complex cardiac therapeutic interventions can safely be performed without on-site surgical backup.
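The phase comparison reported above (3% in phase 3 vs. 0.85% in phases 4–5, P = 0.006) is a comparison of two proportions. A minimal sketch of a pooled two-proportion z-test follows; the case counts and denominators below are hypothetical round numbers chosen only to reproduce the quoted rates, since the abstract does not give the per-phase denominators:

```python
from math import sqrt, erfc

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided P value of a pooled two-proportion z-test."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return erfc(z / sqrt(2))  # P(|Z| >= z) for a standard normal Z

# Hypothetical counts chosen only to match the quoted rates
# (3.0% vs. 0.85%); the study's actual denominators are not given.
p = two_proportion_p(12, 400, 17, 2000)
print(f"P = {p:.4f}")  # well below 0.05, consistent with the reported significance
```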
 

January 2003
December 2002
Yehonatan Sharabi MD, Idit Reshef-Haran MS, Moshe Burstein MD and Arieh Eldad MD

Background: Some studies have indicated a possible link between cigarette smoking and hearing loss.

Objectives: To analyze the association between smoking and hearing loss (HL) other than noise-induced loss, and to characterize the type of hearing impairment found in smokers.

Methods: We conducted a retrospective cross-sectional study in 13,308 men aged 20–68 (median 34.6 years) who underwent a hearing test as part of a routine periodic examination. For each subject, age, smoking status (current, past or non-smokers) and number of cigarettes per day were noted and a hearing test was performed. The test was performed in a sealed, soundproof room by an experienced audiologist and included pure tone audiometry of 250–8,000 Hz. The audiograms were analyzed and subjects were accordingly divided into two groups: those with HL and at least one of the following impairments in at least one ear: sensorineural, conductive or mixed; and those with no hearing loss (control). Audiograms showing HL typical of noise exposure were excluded.

Results: The prevalence of any type of HL among subjects <35 years was 4.5%, compared to 10.5% among those >35 years (P < 0.0001). A significantly higher incidence of any type of HL was found in current (11.8%) and past smokers (11.7%) than in non-smokers (8.1%) (P < 0.0001). Smoking increased the risk of developing HL by 43% among subjects under age 35 and by 17% among those above 35. Both mild, flat, sensorineural impairment and conductive impairment were found to be associated particularly with smoking (odds ratio 2.2 and 1.9, respectively).

Conclusions: The incidence of HL unrelated to noise exposure is higher in smokers than in non-smokers, and in young adults the effect is greater.
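The odds ratios quoted in the Results above (2.2 for mild flat sensorineural impairment, 1.9 for conductive impairment) are cross-product ratios from 2×2 exposure-outcome tables. A minimal sketch of the computation; the cell counts below are hypothetical, since the abstract reports only the ratios themselves:

```python
def odds_ratio(a, b, c, d):
    """Cross-product odds ratio for a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

def relative_risk(a, b, c, d):
    """Risk in the exposed divided by risk in the unexposed."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts for illustration only (not from the study)
print(odds_ratio(20, 80, 10, 90))     # 2.25
print(relative_risk(20, 80, 10, 90))  # 2.0
```

For a common outcome like HL (~10% here), the odds ratio overstates the relative risk slightly, which is why the two measures diverge in the example.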
 

November 2002
Shifra Sela, PhD, Revital Shurtz-Swirski, PhD, Jamal Awad, MD, Galina Shapiro, MSc, Lubna Nasser, MSc, Shaul M. Shasha, MD and Batya Kristal, MD

Background: Cigarette smoking is a well-known risk factor for the development of endothelial dysfunction and the progression of atherosclerosis. Oxidative stress and inflammation have recently been implicated in endothelial dysfunction.

Objectives: To assess the concomitant contribution of polymorphonuclear leukocytes to systemic oxidative stress and inflammation in cigarette smokers.

Methods: The study group comprised 41 chronic cigarette-smoking, otherwise healthy males aged 45.0 ± 11.5 (range 31–67 years) and 41 male non-smokers aged 42.6 ± 11.3 (range 31–65) who served as the control group. The potential generation of oxidative stress was assessed by measuring the rate of superoxide release from separated, phorbol 12-myristate 13-acetate-stimulated PMNL[1] and by plasma levels of reduced (GSH) and oxidized (GSSG) glutathione. Inflammation was estimated indirectly by: a) determining the in vitro survival of PMNL, reflecting cell necrosis; b) in vivo peripheral PMNL counts, reflecting cell recruitment; and c) plasma alkaline phosphatase levels, indicating PMNL activation and degranulation.

Results: PMA[2]-stimulated PMNL from cigarette smokers released superoxide at a faster rate than PMNL from the controls. Smokers had decreased plasma GSH[3] and elevated GSSG[4] levels. In vitro incubation of control and smokers' PMNL in sera of smokers caused necrosis, while control sera improved smoker PMNL survival. Smokers' PMNL counts, although in the normal range, were significantly higher than those of controls. Plasma ALP[5] levels in smokers were significantly higher than in controls and correlated positively with superoxide release and PMNL counts.

Conclusions: Our study shows that PMNL in smokers are primed in vivo, contributing concomitantly to systemic oxidative stress and inflammation that predispose smokers to endothelial dysfunction, and explains in part the accelerated atherosclerosis found in smokers.

_______________________________________

[1] PMNL = polymorphonuclear leukocytes

[2] PMA = phorbol 12-myristate 13-acetate

[3] GSH = reduced glutathione

[4] GSSG = oxidized glutathione

[5] ALP = alkaline phosphatase

Gabriel S. Breuer, MD, David Raveh, MD, Bernard Rudensky, PhD, Raina Rosenberg, MD, Rose Ruchlemer, MD and Jonathan Halevy, MD
October 2002
Abraham Benshushan, MD, Avi Tsafrir, MD, Revital Arbel, MD, Galia Rahav, MD, Ilana Ariel, MD and Nathan Rojansky, MD

Background: Although Listeria monocytogenes is widely distributed in nature, it rarely causes clinical infection in previously healthy people. This microorganism, however, may cause severe invasive disease in pregnant women and newborns.

Objectives: To investigate – in our pregnant population – the impact, severity and outcome of listeriosis on both mother and fetus.

Methods: The study was carried out at a level III, university two-hospital complex. In a retrospective chart review of 65,022 parturients during a 10 year period (1990–1999), we identified and evaluated 11 pregnant patients and their offspring with Listeria infection.

Results: Chorioamnionitis with multiple placental abscesses was observed in all five placentae examined. Clinically, 4 of 11 parturients (36.3%) had a cesarean section for fetal distress, as compared to the 14% mean CS rate in our general population. Two of 11 (18.1%) had a late abortion, as compared with the 4% rate in our hospital. Four of 11 (36%) had premature labor, about four times the rate in our population. Finally, although no intrauterine fetal death was recorded in our series, there was one neonatal death of a term infant (1/11, 9%), about 10 times higher than our corrected perinatal mortality rate.

Conclusions: If not promptly and adequately treated, listeriosis in pregnancy may present serious hazards to the fetus and newborn through direct infection of the placenta and chorioamnionitis.
 

Arie Bitterman, MD, Richard I. Bleicher, MD, Myles C. Cabot, PhD, Yong Y. Liu, MD, PhD and Armando E. Giuliano, MD
September 2002
Ronen Durst, MD, Deborah Rund, MD, Daniel Schurr, MD, Osnat Eliav, MSc, Dina Ben-Yehuda, MD, Shoshi Shpizen, BSc, Liat Ben-Avi, BSc, Tova Schaap, MSc, Inna Pelz, BSc and Eran Leitersdorf, MD

Background: Low density lipoprotein apheresis is used as a complementary method for treating hypercholesterolemic patients who cannot reach target LDL[1]-cholesterol levels on conventional dietary and drug treatment. The DALI system (direct adsorption of lipoproteins) is the only extracorporeal LDL-removing system compatible with whole blood.

Objective: To describe our one year experience using the DALI[2] system.

Methods: LDL apheresis was used in 13 patients due to inability to reach target LDL-C levels on conventional treatment. They included seven patients with familial hypercholesterolemia, three who had adverse reactions to statins, and three patients with ischemic heart disease who did not reach LDL-C target level on medical treatment.

Results: The average triglyceride, total cholesterol, high density lipoprotein-C and LDL-C levels before and after treatment in all patients were: 170 ± 113 vs. 124 ± 91, 269 ± 74 vs. 132 ± 48, 42 ± 8 vs. 37 ± 7.9, and 196 ± 77 vs. 80 ± 52 mg/dl, respectively. In a subgroup of seven patients previously treated with plasma exchange, the reductions in triglyceride, total cholesterol and LDL-C were comparable between the two methods, but the fall in HDL[3]-C concentration was smaller with LDL apheresis: from 39.7 ± 8.7 to 23 ± 5.7 mg/dl with plasma exchange, versus from 43.9 ± 8.1 to 38.4 ± 7 mg/dl with LDL apheresis. Five patients developed treatment-related adverse events: three experienced allergic reactions manifested as shortness of breath, urticaria and facial flushing; one patient developed rhabdomyolysis, an adverse reaction not previously reported with LDL apheresis; and one patient had myopathy with back pain. All untoward effects occurred during the first few treatment sessions.

Conclusions: LDL apheresis using the DALI system is highly efficacious for the treatment of hypercholesterolemia. It is associated with a significant number of side effects occurring during the first treatment sessions. In patients not experiencing adverse effects in the early treatment period, it is well tolerated, and can provide remarkable clinical benefit even after short-term therapy.
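The group means quoted in the Results translate directly into per-fraction percent reductions. A quick sketch of that arithmetic (computed on the reported group means, not on paired per-patient changes):

```python
# Mean levels (mg/dl) before and after DALI apheresis, as reported above
before = {"TG": 170, "TC": 269, "HDL-C": 42, "LDL-C": 196}
after  = {"TG": 124, "TC": 132, "HDL-C": 37, "LDL-C": 80}

# Percent reduction per lipid fraction, rounded to one decimal
reduction_pct = {
    k: round(100 * (before[k] - after[k]) / before[k], 1) for k in before
}
print(reduction_pct)
# {'TG': 27.1, 'TC': 50.9, 'HDL-C': 11.9, 'LDL-C': 59.2}
```

The ~59% mean LDL-C reduction is what underlies the efficacy claim in the conclusion, while HDL-C falls only ~12%.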

________________


[1] LDL = low density lipoprotein

[2] DALI = direct adsorption of lipoproteins

[3] HDL = high density lipoprotein

July 2002
Manfred S. Green, MD, PhD and Zalman Kaufman, MSc

The appearance of “new” infectious diseases, the reemergence of “old” infectious diseases, and the deliberate introduction of infectious diseases through bioterrorism have highlighted the need for improved and innovative infectious disease surveillance systems. A review of the literature reveals that current traditional surveillance systems are generally based on the recognition of a clear increase in diagnosed cases before an outbreak can be identified. For early detection of bioterrorist-initiated outbreaks, the sensitivity and timeliness of the systems need to be improved. Systems based on syndromic surveillance are being developed using technologies such as electronic reporting and the internet. The reporting sources include community physicians, public health laboratories, emergency rooms, intensive care units, district health offices, and hospital admission and discharge systems. The acid test of any system will be the ability to provide analyses and interpretations of the data that will serve the goals of the system. Such analytical methods are still in the early stages of development.

Amir Vardi, MD, Inbal Levin, RN, Haim Berkenstadt, MD, Ariel Hourvitz, MD, Arik Eisenkraft, MD, Amir Cohen, MD and Amital Ziv, MD

With chemical warfare becoming an imminent threat, medical systems need to be prepared to treat the resultant mass casualties. Medical preparedness should not be limited to the triage and logistics of mass casualties and first-line treatment, but should include knowledge and training covering the whole medical spectrum. In view of the unique characteristics of chemical warfare casualties, the use of simulation-assisted medical training is highly appropriate. Our objective was to explore the potential of simulator-based teaching to train medical teams in the treatment of chemical warfare casualties. The training concept integrates several types of skill-training simulators, including high-tech and low-tech simulators as well as standardized simulated patients in a specialized simulated setting. The combined use of multiple simulation modalities makes this maverick program an excellent solution to the challenge of multidisciplinary training in the face of the looming chemical warfare threat.

Ronen Rubinshtein, MD, Eyal Robenshtok, MD, Arik Eisenkraft, MD, Aviv Vidan, MD and Ariel Hourvitz, MD

Recent events have significantly increased concern about the use of biologic and chemical weapons by terrorists and other countries. Since weapons of mass destruction could result in a huge number of casualties, optimizing our diagnostic and therapeutic skills may help to minimize the morbidity and mortality. The national demands for training in medical aspects of nuclear, biologic and chemical warfare have increased dramatically. While Israeli medical preparedness for non-conventional warfare has improved substantially in recent years especially due to extensive training programs, a standardized course and course materials were not available until recently. We have developed a core curriculum and teaching materials for a 1 or 2 day modular course, including printed materials.

Jacob T. Cohen, MD, Gil Ziv, MD, PhD, Joseph Bloom, MD, Daniel Zikk, MD, Yoram Rapoport, MD and Mordechai Z. Himmelfarb, MD

Background: The ear is the most frequent organ affected during an explosion. Recognition of possible damage to its auditory and vestibular components, and particularly the recovery time of the incurred damage, may help in planning the optimal treatment strategies for the otologic manifestations of blast injury and preventing deleterious consequences.   

Objective: To report the results of the initial oto-vestibular evaluation and follow-up of 17 survivors of a suicide terrorist attack on a municipal bus.

Methods: These 17 patients underwent periodic ear inspections and pure tone audiometry for 6 months. Balance studies, consisting of electronystagmography (ENG) and computerized dynamic posturography (CDP), were performed at the earliest opportunity.

Results: Complaints of earache, aural fullness and tinnitus resolved, whereas dizziness persisted in most of the patients. By the end of the follow-up, 15 (55.6%) of the eardrum perforations had healed spontaneously. Hearing impairment was detected in 33 of the 34 tested ears. Recovery of hearing was complete in 6 ears and partial in another 11. ENG and CDP were performed in 13 patients: 5 had abnormal results on CDP, while the ENG was normal in all the patients. Of the seven patients with vertigo, it resolved in only one, who was free of symptoms 1 month after the explosion.

Conclusion:  Exposure to a high powered explosion in a confined space may result in severe auditory and vestibular damage. Awareness of these possible ear injuries may prevent many of the deleterious consequences of such injuries.
 

June 2002
Eyal Leibovitz, MD, Dror Harats, MD and Dov Gavish, MD

Background: Hyperlipidemia is a major risk factor for coronary heart disease. Reducing low density lipoprotein-cholesterol can significantly reduce the risk of CHD[1], but many patients fail to reach the target LDL-C[2] goals due to low doses of statins or low compliance.

Objectives: To treat high risk patients with atorvastatin in order to reach LDL-C goals (either primary or secondary prevention) of the Israel Atherosclerosis Society.

Methods: In this open-label study of 3,276 patients (1,698 of whom were males, 52%), atorvastatin 10 mg was given as a first dose, with follow-up and adjustment of the dose every 6 weeks. While 1,670 patients did not receive prior hypolipidemic treatment, 1,606 were treated with other statins, fibrates or the combination of both.

Results: After 6 weeks of treatment, 70% of the patients who did not receive prior hypolipidemic medications and who needed primary prevention reached target LDL-C levels. Interestingly, a similar number of patients on prior hypolipidemic treatment reached the LDL-C goals for primary prevention. The patients treated with other statins, fibrates or both did not reach the LDL-C treatment goals. Only 34% of all patients who needed secondary prevention reached the ISA[3] LDL-C target of 100 mg/dl. Atorvastatin proved to be completely safe; only two patients had creatine kinase elevation above 500 U/L, and another six had mild CK[4] elevation (<500 U/L). None of the patients had clinical myopathy, and only one had to be withdrawn from the study.

Conclusion: Atorvastatin is a safe and effective drug that enables most patients requiring primary prevention to reach LDL-C goal levels, even with a low dose of 10 mg. Patients in need of secondary prevention usually require higher doses of statins.

__________________________________


[1] CHD = coronary heart disease


[2] LDL-C = low density lipoprotein-cholesterol


[3] ISA = Israel Atherosclerosis Society


[4] CK = creatine kinase




© All rights to information on this site are reserved and are the property of the Israeli Medical Association.