
Search results


May 2007
A. Ornoy

Seroconversion to cytomegalovirus occurs in 1%-4% of pregnant women; most pregnant women, however, are already seropositive before pregnancy. In 0.2%-2.5% of their newborn infants there is evidence of intrauterine infection; most are born without any clinical findings. The typical clinical symptoms of symptomatic congenital CMV[1] are observed in 10-20% of infected neonates. They include intrauterine growth restriction, microcephaly, hepatosplenomegaly, petechiae, jaundice, thrombocytopenia, anemia, chorioretinitis, hearing loss and/or other findings. Long-term neurodevelopmental sequelae include mental retardation, motor impairment, sensorineural hearing loss and/or visual impairment. These may occur even in infants who are free of symptoms at birth. Most infants born with severe neonatal symptoms of congenital CMV are born to mothers with primary infection during pregnancy. However, since about half of the infants infected with CMV in utero, including those with severe neonatal symptoms, are born to mothers with preconceptional immunity, we must conclude that congenital CMV may be a significant problem even in children born to mothers with pre-pregnancy immunity. This may justify the use of invasive methods for the detection of possible fetal infection even in cases of non-primary CMV infection. It should also be a consideration when deciding on population screening or immunization against CMV.






[1] CMV = cytomegalovirus


April 2007
A. Keren, M. Poteckin, B. Mazouz, A. Medina, S. Banai, A. Chenzbraun, Z. Khoury and G. Levin

Background: Left ventricular outflow gradient is associated with increased morbidity and mortality in hypertrophic cardiomyopathy. Alcohol septal ablation is the alternative to surgery in cases refractory to drug therapy. The implication of LVOG[1] measured 1 week post-ASA[2] for prediction of outcome is unknown.

Objective: To observe the course of LVOG over time and its value in predicting the long-term clinical and hemodynamic outcome of ASA.

Methods: Clinical and echocardiographic parameters were prospectively recorded in 14 consecutive patients undergoing a first ASA at baseline (the time of ASA), 3 and 7 days after ASA (in-hospital), and 3 and 12 months after ASA (last follow-up).

Results: There was improvement in NYHA class, exercise parameters and LVOG in 11 of 14 patients (P < 0.005 for all). Maximal creatine kinase level was lower than 500 U/L in those without such improvement and 850 U/L or higher in successful cases. LVOG dropped from 79 ± 30 to 19 ± 6 mmHg after the ASA. LVOG was 50 ± 21 mmHg on day 3, 39 ± 26 on day 7, 32 ± 26 at 3 months and 24 ± 20 mmHg at last follow-up. LVOG identified 27% of sustained procedural successes on day 3 and 73% on day 7. The overall predictive accuracy of the test for sustained success and failure was 36% on day 3 and 71% on day 7. Combination of maximal CK[3] and LVOG on day 7 showed four distinct outcome patterns: "early success" with low LVOG and high CK (73% of successful cases), "late success" with high LVOG and high CK, and "early failure" and "late failure", both with low CK and with high or low LVOG, respectively.
Conclusion: LVOG measurement 7 days post-ASA, combined with maximal CK levels, predicts late procedural outcome in the majority of patients.
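
To make the day-7 decision rule above concrete, here is a minimal Python sketch of the four-pattern classification. The CK boundaries (below 500 U/L in unsuccessful cases, 850 U/L or higher in successful ones) follow the abstract; the 30 mmHg LVOG cutoff is a hypothetical placeholder, since the abstract does not state the gradient threshold used.

# Hedged sketch of the four day-7 outcome patterns described in the Results.
# The LVOG cutoff below is hypothetical; the CK boundaries come from the abstract.
def classify_day7_pattern(max_ck_u_per_l: float, lvog_day7_mmhg: float,
                          lvog_cutoff_mmhg: float = 30.0) -> str:
    """Return one of the four outcome patterns, or 'indeterminate' when CK
    falls between the two boundaries reported in the abstract."""
    high_ck = max_ck_u_per_l >= 850      # CK level seen in successful cases
    low_ck = max_ck_u_per_l < 500        # CK level seen in unsuccessful cases
    low_lvog = lvog_day7_mmhg < lvog_cutoff_mmhg

    if high_ck and low_lvog:
        return "early success"
    if high_ck and not low_lvog:
        return "late success"
    if low_ck and not low_lvog:
        return "early failure"
    if low_ck and low_lvog:
        return "late failure"
    return "indeterminate"

# Example with hypothetical values:
print(classify_day7_pattern(max_ck_u_per_l=900, lvog_day7_mmhg=18))  # early success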







[1] LVOG = left ventricular outflow gradient



[2] ASA = alcohol septal ablation



[3] CK = creatine kinase


January 2007
Z. Kaufman, W-K. Wong, T. Peled-Leviatan, E. Cohen, C. Lavy, G. Aharonowitz, R. Dichtiar, M. Bromberg, O. Havkin, E. Kokia and M.S. Green

Background: Syndromic surveillance systems have been developed for early detection of bioterrorist attacks, but few validation studies exist for these systems and their efficacy has been questioned.

Objectives: To assess the capabilities of a syndromic surveillance system based on community clinics in conjunction with the WSARE[1] algorithm in identifying early signals of a localized unusual influenza outbreak.

Methods: This retrospective study used data on a documented influenza B outbreak in an elementary school in central Israel. The WSARE algorithm for anomalous pattern detection was applied to individual records of daily patient visits to clinics of one of the four health management organizations in the country.
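
As a rough illustration only (this is not the WSARE algorithm itself, which searches over many attribute rules and applies exact tests with randomization-based correction), the sketch below shows the kind of question such a system asks: is the share of today's clinic visits falling into a given syndrome/attribute subgroup unusually high compared with recent baseline days? A simple two-proportion z-test stands in for WSARE's rule search, and all counts are hypothetical.

# Simplified anomaly check in the spirit of syndromic surveillance (not WSARE itself).
import math

def subgroup_anomaly_p(today_subgroup: int, today_total: int,
                       baseline_subgroup: int, baseline_total: int) -> float:
    """Two-sided p-value for 'today's subgroup share differs from the baseline share'."""
    p1 = today_subgroup / today_total
    p2 = baseline_subgroup / baseline_total
    pooled = (today_subgroup + baseline_subgroup) / (today_total + baseline_total)
    se = math.sqrt(pooled * (1 - pooled) * (1 / today_total + 1 / baseline_total))
    if se == 0:
        return 1.0
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided normal-approximation p-value

# Hypothetical counts: 12 of 80 visits today are school-age children with febrile
# respiratory complaints from one town, versus 25 of 600 visits over the baseline week.
print(subgroup_anomaly_p(12, 80, 25, 600))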

Results: Two successive significant anomalies were detected in the HMO’s[2] data set that could signal the influenza outbreak. If data were available for analysis in real time, the first anomaly could be detected on day 3 of the outbreak, 1 day after the school principal reported the outbreak to the public health authorities.

Conclusions: Early detection is difficult in this type of fast-developing institutionalized outbreak. However, the information derived from WSARE could help define the outbreak in terms of time, place and the population at risk.






[1] WSARE = What’s Strange About Recent Events



[2] HMO = health management organization


I. Hekselman, N.R. Kahan, M. Ellis, E. Kahan

Background: Ethnicity has been associated with variance in warfarin treatment regimens in various settings.

Objectives: To determine whether ethnicity is associated with variance in patient management in Israel.

Methods: Data were extracted from the electronic patient records of Clalit Health Services clinics in the Sharon Shomron region. The study group comprised all patients treated with warfarin who underwent international normalized ratio tests for at least 6 months in 2003. The proportion of each patient's tests within the target range was calculated, as were the crude average rates and 95% confidence intervals for Jewish and Arab patients. The data were then stratified by patient gender and age, the specialty of the attending physician, and the country where the physician studied medicine.
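
A minimal Python sketch of the two calculations described above: each patient's proportion of INR tests inside the target range, and the crude average rate with a normal-approximation 95% confidence interval across patients. The 2.0-3.0 target range and the sample values are assumptions for illustration; the abstract does not state the range used.

# Hedged sketch: per-patient in-range fraction and a crude average rate with 95% CI.
import math

def in_range_fraction(inr_values, low=2.0, high=3.0):
    """Fraction of a patient's INR tests that fall inside [low, high] (assumed range)."""
    return sum(low <= v <= high for v in inr_values) / len(inr_values)

def crude_rate_with_ci(per_patient_fractions):
    """Mean of the per-patient fractions with a normal-approximation 95% CI."""
    n = len(per_patient_fractions)
    mean = sum(per_patient_fractions) / n
    var = sum((f - mean) ** 2 for f in per_patient_fractions) / (n - 1)
    half_width = 1.96 * math.sqrt(var / n)
    return mean, (mean - half_width, mean + half_width)

# Hypothetical data for three patients:
patients = [[2.4, 3.1, 2.7], [1.8, 2.2, 2.6, 2.9], [3.4, 3.0, 2.5]]
fractions = [in_range_fraction(p) for p in patients]
print(crude_rate_with_ci(fractions))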

Results: We identified 2749 Jews and 293 Arabs who met the inclusion criteria of the study. The crude average rate of patients’ INR[1] tests within the target range was 62.3% among Jews (95% CI[2] 61.5–63.1) and 52.7% (95% CI 49.9–55.5) among Arabs. When stratified by gender, age, and the treating physician's specialty and country of education, the stratum-specific rates among Jewish patients were consistently higher than among Arabs.

Conclusions: These results suggest that cultural differences in adherence to drug therapy recommendations, in addition to genetic factors, may be associated with this variance.






[1] INR = international normalized ratio



[2] CI = confidence interval


E. Segal, C. Zinman, B. Raz and S. Ish-Shalom

Background: Hip fracture rates are increasing worldwide, and the risk for a second hip fracture is high. The decision to administer antiresorptive treatment is based mainly on bone mineral density and/or a history of previous osteoporotic fractures.

Objectives: To evaluate the contribution of BMD[1], previous fractures, clinical and laboratory parameters to hip fracture risk assessment.

Methods: The study population included 113 consecutive hip fracture patients (87 women and 26 men) aged 50-90 years (mean 72.5 ± 9.4 years) discharged from the Department of Orthopedic Surgery. BMD was assessed at the lumbar spine, femoral neck and total hip. The results were expressed as standard deviation scores: T-scores, compared to young adults, and Z-scores, compared to age-matched controls. Plasma or serum levels of parathyroid hormone and 25-hydroxyvitamin D3, and urinary deoxypyridinoline (DPD) cross-links, were evaluated.
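
For readers unfamiliar with the standard deviation scores mentioned above, the sketch below shows the conventional definitions: the T-score expresses a patient's BMD in standard deviations relative to a young-adult reference, and the Z-score relative to age-matched controls. The reference means and SDs in the example are hypothetical, not values from this study.

# Conventional T-score and Z-score definitions; example reference values are hypothetical.
def t_score(bmd: float, young_adult_mean: float, young_adult_sd: float) -> float:
    """T-score: patient BMD in SDs relative to the young-adult reference population."""
    return (bmd - young_adult_mean) / young_adult_sd

def z_score(bmd: float, age_matched_mean: float, age_matched_sd: float) -> float:
    """Z-score: patient BMD in SDs relative to age-matched controls."""
    return (bmd - age_matched_mean) / age_matched_sd

# Example with hypothetical reference values (g/cm^2):
print(t_score(bmd=0.78, young_adult_mean=1.05, young_adult_sd=0.11))   # about -2.5
print(z_score(bmd=0.78, age_matched_mean=0.90, age_matched_sd=0.12))   # about -1.0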

Results: We observed T-scores ≤ -2.5 in 43 patients (45.3%) at the lumbar spine, in 47 (52.2%) at the femoral neck and in 33 (38%) at the total hip. Twenty-eight patients (29.5%) had neither low BMD nor previous osteoporotic fractures. Using a T-score cutoff point of -1.5 at any measurement site would put 25 (89%) of these patients into the high fracture risk group. Mean DPD level was 15.9 ± 5.8 ng/mg (normal 4–7.3 ng/mg creatinine). Vitamin D inadequacy was observed in 99% of patients.

Conclusions: Using current criteria, about one-third of elderly hip fracture patients might not have been diagnosed as being at risk. Lowering the BMD cutoff point for patients with additional risk factors may improve risk prediction yield.






[1] BMD = bone mineral density



 
December 2006
A. Kolomansky, R. Hoffman, G. Sarig, B. Brenner and N. Haim
 Background: Little is known about the epidemiology of venous thromboembolism in hospitalized patients in Israel. Also, a direct comparison of the clinical and laboratory features between cancer and non-cancer patients has not yet been reported.

Objectives: To investigate and compare the epidemiologic, clinical and laboratory characteristics of cancer and non-cancer patients hospitalized with venous thromboembolism in a large referral medical center in Israel.

Methods: Between February 2002 and February 2003, patients diagnosed at the Rambam Medical Center as suffering from VTE[1] (deep vein thrombosis and/or pulmonary embolism), based on diagnostic findings on Doppler ultrasonography, spiral computed tomography scan or lung scan showing high probability for pulmonary embolism, were prospectively identified and evaluated. In addition, at the conclusion of the study period, the reports of spiral chest CT scans, performed during the aforementioned period in this hospital, were retrospectively reviewed to minimize the number of unidentified cases. Blood samples were drawn for evaluation of the coagulation profile.

Results: Altogether, 147 patients were identified and 153 VTE events diagnosed, accounting for 0.25% of all hospitalizations during the study period. The cancer group included 63 patients (43%), most of whom had advanced disease (63%). The most common malignancies were cancer of the lung (16%), breast (14%), colon (11%) and pancreas (10%). Of 121 venous thromboembolic events (with or without pulmonary embolism) there were 14 upper extremity thromboses (12%). The most common risk factors for VTE, except malignancy, were immobilization (33%), surgery/trauma (20%) and congestive heart failure (17%). There was no difference in prevalence of various risk factors between cancer and non-cancer patients. During an acute VTE event, D-dimer levels were higher in cancer patients than non-cancer patients (4.04 ± 4.27 vs. 2.58 ± 1.83 mg/L respectively, P = 0.0550). Relatively low values of activated protein C sensitivity ratio and normalized protein C activation time were observed in both cancer and non-cancer groups (2.05 ± 0.23 vs. 2.01 ± 0.33 and 0.75 ± 0.17 vs. 0.71 ± 0.22, respectively). These values did not differ significantly between the groups.

Conclusion: The proportion of cancer patients among patients suffering from VTE was high. Their demographic, clinical and laboratory characteristics (during an acute event) were not different from those of non-cancer patients, except for higher D-dimer levels.


 





[1] VTE = venous thromboembolism


E. Jaul, P. Singer and R. Calderon-Margalit
 Background: Despite the ongoing debate on tube feeding among severely demented patients, the current approach in western countries is to avoid feeding by tube.

Objectives: To assess the clinical course and outcome of demented elderly patients with severe disabilities by feeding mode.

Methods: The study was conducted in a skilled nursing department of a major psychogeriatric hospital in Israel. Eighty-eight patients aged 79 ± 9 years were followed for 17 months: 62 were fed by nasogastric tube and 26 were orally fed. The groups were compared for background characteristics, underlying medical condition, functional impairment, clinical and nutritional outcomes, and survival.

Results: Tube feeding had no beneficial effect on clinical and nutritional outcomes or on healing preexisting pressure ulcers, compared with oral feeding. Very few patients on tube feeding showed signs of discomfort, partly because of low cognitive function. Survival was significantly higher in the tube-fed patients (P < 0.001), which could be partly explained by the different case mix (i.e., the underlying diseases).

Conclusions: Tube feeding seems to have no nutritional advantage in severely demented elderly patients. Median survival was longer in tube-fed individuals who had no acute co-morbidity. However, since tube feeding does not add to patient pain and discomfort, it should not be contraindicated when it complies with the values and wishes of patients and their families.

November 2006
R. Segal, A. Furmanov and F. Umansky
 Background: The recent occurrence of a spontaneous intracerebral hemorrhage in Israel’s Prime Minister placed the scrutiny of local and international media on neurosurgeons as they made therapeutic decisions. In the ensuing public debate, it was suggested that extraordinary measures (surgical treatment) were undertaken only because of the celebrity of the patient.

Objectives: To evaluate the criteria used to select surgical versus medical management of SICH[1].

Methods: We retrospectively reviewed the files of 149 consecutive patients admitted with SICH from January 2004 through January 2006 to our medical center. Their mean age was 66 years (range 3–92), and 62% were male. SICH localization was lobar in 50% of patients, thalamic in 23%, basal ganglia in 15%, cerebellar in 13%, intraventricular in 6%, and pontine in 1%. Mean admission Glasgow Coma Scale score was 9 (range 3–15). Risk factors included hypertension (74%), diabetes mellitus (34%), smoking (14%) and amyloid angiopathy (4%). Fifty percent of patients were on anticoagulant/antiplatelet therapy, including enoxaparin (3%), warfarin (7%), warfarin and aspirin (9%), or aspirin alone (34%).

Results: Craniotomy was performed in 30% of patients, and ventriculostomy alone in 3%. Rebleed occurred in 9% of patients. Six months after the treatment, 36% of operated patients were independent, 42% dependent, and 13% had died. At 6 months, 37% of non-operated patients were independent, 15% dependent, and 47% had died.

Conclusions: One-third of the SICH patients, notably those who were experiencing ongoing neurologic deterioration and had accessible hemorrhage, underwent craniotomy. The results are good, considering the inherent mortality and morbidity of SICH.


 





[1] SICH = spontaneous intracerebral hemorrhage


B-Z. Krimchansky, T. Galperin and Z. Groswasser
 This review briefly describes the clinical nature of the vegetative state, the commonly used clinical tests, the pathophysiology of the brain damage that is at the basis of the clinical picture, the customary pharmacological treatment of VS[1], the medical complications that are characteristic of this group of patients and the life expectancy of patients in a vegetative state.







[1] VS = vegetative state


October 2006
R. Segal, E. Lubart, A. Leibovitz, A. Iaina and D. Caspi
 Background: Aspirin is commonly used by elderly patients. In previous studies we found transient changes in renal function induced by low doses of aspirin.

Objectives: To investigate the mechanisms of these effects.

Methods: The study group included 106 stable long-term care geriatric inpatients. Diet and drugs were kept stable. The study lasted 5 weeks; during the first 2 weeks 100 mg aspirin was administered once a day. Clinical and laboratory follow-up was performed at baseline and weekly for the next 3 weeks. The glomerular filtration rate was estimated by creatinine clearance measured from 24-hour urine collection and serum creatinine, and by the Cockcroft-Gault (C-G) equation. Uric acid clearance (Cu acid) was determined from serum concentrations and 24-hour urinary excretion of uric acid. Patients with serum creatinine > 1.5 mg/dl were not included.
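
For orientation, here is a brief Python sketch of the two estimates named above: measured creatinine clearance from a 24-hour urine collection and the Cockcroft-Gault (C-G) estimate. The formulas are the standard ones; the example inputs are hypothetical.

# Standard creatinine clearance formulas; example inputs are hypothetical.
def measured_ccr_ml_min(urine_cr_mg_dl: float, urine_volume_ml: float,
                        serum_cr_mg_dl: float, collection_minutes: float = 1440) -> float:
    """Measured creatinine clearance (ml/min) = (U_cr x V) / (S_cr x collection time)."""
    return (urine_cr_mg_dl * urine_volume_ml) / (serum_cr_mg_dl * collection_minutes)

def cockcroft_gault_ml_min(age_years: float, weight_kg: float,
                           serum_cr_mg_dl: float, female: bool) -> float:
    """C-G estimate: ((140 - age) x weight) / (72 x S_cr), multiplied by 0.85 for women."""
    ccr = ((140 - age_years) * weight_kg) / (72 * serum_cr_mg_dl)
    return ccr * 0.85 if female else ccr

# Hypothetical 80-year-old woman, 60 kg, serum creatinine 1.1 mg/dl:
print(cockcroft_gault_ml_min(80, 60, 1.1, female=True))   # about 39 ml/min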

Results: After 2 weeks on low dose aspirin, measured creatinine and uric acid clearances decreased significantly compared with the initial values in 70% and 62% of the patients, respectively, with mean decreases of 19% and 17%, respectively (P < 0.001). Blood urea nitrogen increased by 17% while serum creatinine and uric acid concentrations increased by 4% (P < 0.05 for all). The C-G[1] values decreased by 3% (P < 0.05). After withdrawal of aspirin all parameters improved. However, 67% of the patients remained with some impairment in their measured Ccr[2], compared to baseline. Patients who reacted adversely to low dose aspirin had significantly better pre-study renal function (Ccr), lower hemoglobin and lower levels of serum albumin.

Conclusions: Short-term low dose aspirin affected renal tubular creatinine and uric acid transport in the elderly, which may result in prolonged or permanent deterioration of renal function. It is suggested that renal function be monitored in elderly patients even with the use of low dose aspirin.


 





[1] C-G = Cockcroft-Gault formula

[2] Ccr = creatinine clearance

