July 2013
D. Merims, H. Nahari, G. Ben-Ari, S. Jamal, C. Vigder and Y. Ben-Israel
 Background: Wandering is a common phenomenon among patients with dementia. While traditionally considered to be a behavioral problem, it also includes fundamental aspects of motor performance (e.g., gait and falls).

Objectives: To examine the difference in motor function and behavioral symptoms between patients with severe dementia who wander and those who do not.

Methods: We conducted a retrospective study reviewing the medical records of 72 patients with severe dementia, all residents of a dementia special care unit. Motor and behavioral aspects were compared between "wanderers" and "non-wanderers."

Results: No difference was found in motor performance including the occurrence of falls between the wanderers and non-wanderers. A significant difference was found in aggressiveness and sleep disturbances, which were more frequent among the wanderers. There was no preference to wandering at a certain period of the day among the patients with sleep disturbances who wander.

Conclusions: In a protected environment wandering is not a risk factor for falls. Sleep disturbances and wandering co-occur, but there is no circumstantial association between the two symptoms.

June 2013
G. Barkai, A. Barzilai, E. Mendelson, M. Tepperberg-Oikawa, D. Ari-Even Roth and J. Kuint
 Background: Congenital cytomegalovirus (C-CMV) infection affects 0.4–2% of newborn infants in Israel, most of whom are asymptomatic. Of these, 10–20% will subsequently develop hearing impairment and might have benefitted from early detection by neonatal screening.

Objectives: To retrospectively analyze the results of a screening program for C-CMV performed at the Sheba Medical Center, Tel Hashomer, during a 1 year period, using real-time polymerase chain reaction (rt-PCR) from umbilical cord blood.

Methods: CMV DNA was detected by rt-PCR performed on infants’ cord blood. C-CMV was confirmed by urine culture (Shell-vial). All confirmed cases were further investigated for C-CMV manifestations by head ultrasound, complete blood count, liver enzyme measurement, ophthalmology examination and hearing investigation.

Results: During the period 1 June 2009 to 31 May 2010, 11,022 infants were born at the Sheba Medical Center, of whom 8105 (74%) were screened. Twenty-three (0.28%) were positive for CMV and 22 of them (96%) were confirmed by urine culture. Two additional infants, who had not been screened, were detected after clinical suspicion. All 24 infants were further investigated, and 3 (12.5%) had central nervous system involvement (including hearing impairment) and were offered intravenous ganciclovir for 6 weeks. Eighteen (82%) infants would not otherwise have been diagnosed.

Conclusions: The relatively low incidence of C-CMV detected in our screening program probably reflects the low sensitivity of cord blood screening. Nevertheless, this screening program reliably detected a non-negligible number of infants who could benefit from early detection. Other screening methods using saliva should be investigated further.

 

E.D. Amster, S.S. Fertig, U. Baharal, S. Linn, M.S. Green, Z. Lencovsky and R.S. Carel
 Background: From 2 to 5 December 2010, Israel experienced the most severe forest fire in its history, resulting in the deaths of 44 rescue workers. Little research exists on the health risks to emergency responders during forest fires, and there is no published research to date on occupational health among firefighters in Israel.

Objectives: To describe the exposures experienced by emergency responders to smoke, fire retardants and stress; the utilization of protective equipment; and the frequency of corresponding symptoms during and following the Carmel Forest fire.

Methods: A cohort of 204 firefighters and 68 police officers who took part in rescue and fire suppression activities during the Carmel Forest fire was recruited from a representative sample of participating stations throughout the country and interviewed about their activities during the fire and their coinciding symptoms. Unpaired two-sample t-tests were used to compare mean exposures and symptom frequency between firefighters and police. Chi-square estimates of odds ratios (OR) with 95% confidence intervals (CI) are provided for the odds of reporting symptoms, incurring injury or being hospitalized, for various risk factors.
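The odds-ratio estimate with its 95% confidence interval, as used in the Methods above, can be sketched as follows. This is a minimal illustration of the standard (Woolf) log method for a 2×2 table, not the authors' code, and the counts shown are hypothetical:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% confidence interval for a 2x2 table.

    a = exposed with outcome,    b = exposed without outcome,
    c = unexposed with outcome,  d = unexposed without outcome.
    Uses the Woolf (log) method: CI = exp(ln(OR) +/- z * SE).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts (not from the study): 40/100 exposed and
# 20/100 unexposed workers reporting a symptom.
or_, lower, upper = odds_ratio_ci(40, 60, 20, 80)
```

An OR whose confidence interval excludes 1 would be reported as a significant association; with these hypothetical counts the interval is roughly 1.4 to 5.0.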

Results: Of the study participants, 87% reported having at least one symptom during rescue work at the Carmel Forest fire, with eye irritation (77%) and fatigue (71%) being the most common. Occupational stress was extremely high during the fire; the average length of time working without rest was 18.4 hours among firefighters.

Conclusions: Firefighters and police were exposed to smoke and occupational stress for prolonged periods during the fire. Further research is needed on the residual health effects from exposure to forest fires among emergency responders, and to identify areas for improvement in health preparedness.  

March 2013
A.M. Madsen, R. Pope, A. Samuels and C.Z. Margolis
Background: Due to the 2009 war in Gaza, Ben-Gurion University's Medical School for International Health, with a student body of 165 international, multicultural students, canceled a week of classes. Third-year students continued clerkships voluntarily, and fourth-year students returned to Israel before departing for clerkships in developing countries. A debriefing session was held for the entire school.

Objectives: To assess the academic and psychological effects of political conflict on students.

Methods: We asked all students to fill out an anonymous Google electronic survey describing their experience during the war and evaluating the debriefing. A team of students and administrators reviewed the responses.

Results: Sixty-six students (40% of the school) responded (first year 26%, second year 39%, third year 24%, fourth year 8%, taking time off 3%, age 23–40 years old). Eighty-three percent were in Israel for some portion of the war and 34% attended the debriefing. Factors that influenced individuals’ decision to return/stay in the war zone were primarily of an academic and financial nature. Other factors included family pressure, information from peers and information from the administration. Many reported psychological difficulties during the war rather than physical danger, describing it as “draining” and that it was difficult to concentrate while studying. As foreigners, many felt their role was undefined. Although there is wide variation in the war’s effect on daily activities and emotional well-being during that time, the majority (73%) reported minimal residual effects.

Conclusions: This study lends insight into how students cope during conflict and highlights academic issues that arise during a war. Open and frequent communication, and an emphasis on the school as a community, mattered most to students.

 

December 2012
Y. Shacham, E.Y. Birati, O. Rogovski, Y. Cogan, G. Keren and A. Roth

Background: The rate of concomitant left ventricular thrombus (LVT) formation in patients with acute anterior myocardial infarction (AAMI), historically 20–60%, dropped to 10–20% after thrombolysis and primary percutaneous coronary intervention (PPCI) were introduced.

Objective: To test our hypothesis that prolonged anticoagulation post-PPCI will lower the LVT incidence even further.

Methods: Included in this study were all 296 inpatients with ST elevation AAMI who were treated with PPCI (from January 2006 to December 2009). Treatment included heparin anticoagulation (48 hours) followed by adjusted doses of low molecular weight heparin (3 more days). All patients underwent cardiac echocardiography on admission and at discharge. LVT and bleeding complications were reviewed and compared.

Results: LVT formation was present on the first echocardiogram in 6/296 patients. Another 8/289 patients displayed LVT only on their second echocardiogram (4.7%, 14/296). LVT patients had significantly lower LV ejection fractions than non-LVT patients at admission (P < 0.003) and at discharge (P < 0.001), and longer time to reperfusion (P = 0.168). All patients were epidemiologically and clinically similar. There were 6 bleeding episodes that required blood transfusion and 11 episodes of minor bleeding.

Conclusions: Five days of continuous anticoagulation therapy post-PPCI in inpatients with AAMI is associated with a low incidence of LVT without a marked increase in bleeding events.
 

O. Dolkart, W. Khoury, S. Avital, R. Flaishon and A.A. Weinbroum

Background: Carbon dioxide is the most widely used gas to establish pneumoperitoneum during laparoscopic surgery. Gastrointestinal trauma may occur during the peritoneal insufflation or during the operative phase itself. Early diagnosis of these injuries is critical.

Objectives: To assess changes in end-tidal carbon dioxide (ETCO2) following gastric perforation during pneumoperitoneum in the rat.

Methods: Wistar rats were anesthetized, tracheotomized and mechanically ventilated with fixed minute volume. Each animal underwent a 1 cm abdominal longitudinal incision. A 0.3 x 0.3 cm cross-incision of the stomach was performed in the perforation group but not in the controls (n=10/group), and the abdomen was closed in both groups. After stabilization, CO2-induced pneumoperitoneum was established at 0, 5, 8 and 12 mmHg for 20 min periods consecutively, each followed by complete pressure relief for 5 minutes.

Results: Ventilatory pressure increased in both groups when pneumoperitoneal pressure ≥ 5 mmHg was applied, but more so in the perforated stomach group (P = 0.003). ETCO2 increased in both groups during the experiment, but less so in the perforated group (P = 0.04). It then returned to near baseline values during pressure annulation in all perforated animals but only in the 0 and 5 mmHg periods in the controls.

Conclusions: When subjected to pneumoperitoneum, ETCO2 was lower in rats with a perforated stomach than in those with an intact stomach. An abrupt decrease in ETCO2 during laparoscopy may signal gastric perforation.
 

E. Ben-Chetrit, C. Chen-Shuali, E. Zimran, G. Munter and G. Nesher

Background: Frequent readmissions significantly contribute to health care costs as well as work load in internal medicine wards.

Objective: To develop a simple scoring method, based on basic demographic and medical characteristics of elderly patients in internal medicine wards, that would allow prediction of readmission within 3 months of discharge.

Methods: We conducted a retrospective observational study of 496 hospitalized patients using data collected from discharge letters in the computerized archives. Univariate and multivariate logistic regression analyses were performed and factors that were significantly associated with readmission were selected to construct a scoring tool. Validity was assessed in a cohort of 200 patients.

Results: During a 2 year follow-up 292 patients were readmitted at least once within 3 months of discharge. Age 80 or older, any degree of impaired cognition, nursing home residence, congestive heart failure, and creatinine level > 1.5 mg/dl were found to be strong predictors of readmission. The presence of each variable was scored as 1. A score of 3 or higher in the derivation and validation cohorts corresponded with a positive predictive value of 80% and 67%, respectively, when evaluating the risk of rehospitalization.

Conclusions: We propose a practical, readily available five-item scoring tool that allows prediction of most unplanned readmissions within 3 months. The strength of this scoring tool, as compared with previously published scores, is its simplicity and straightforwardness.
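The five-item score described above is simple enough to sketch directly. The variable names are my own, and the cut-off of 3 follows the abstract:

```python
def readmission_score(age, impaired_cognition, nursing_home_resident,
                      heart_failure, creatinine_mg_dl):
    """Five-item readmission score: one point each for age >= 80,
    any impaired cognition, nursing home residence, congestive
    heart failure, and creatinine > 1.5 mg/dl."""
    score = sum([
        age >= 80,
        bool(impaired_cognition),
        bool(nursing_home_resident),
        bool(heart_failure),
        creatinine_mg_dl > 1.5,
    ])
    return score, score >= 3  # a score of 3 or more flags high risk

# Hypothetical patient: 84 years old, impaired cognition, CHF.
score, high_risk = readmission_score(84, True, False, True, 1.2)  # (3, True)
```

The appeal of such a tool is that every item is available from the discharge letter, so the score can be computed without additional testing.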
 

August 2012
T. Tohami, A. Nagler and N. Amariglio

Chronic myeloid leukemia (CML) is a clonal hematological disease that represents 15–20% of all adult leukemia cases. The study and treatment of CML have contributed pivotal advances to translational medicine and cancer therapy. The discovery that a single chromosomal abnormality, the Philadelphia (Ph) chromosome, is responsible for the etiology of this disease was a milestone in treating and understanding CML. Subsequently, CML became the first disease for which allogeneic bone marrow transplantation was the treatment of choice. Currently, CML is one of the few diseases in which therapy targeted against the chromosomal abnormality is the sole frontline treatment for newly diagnosed patients. The use of targeted therapy challenged disease monitoring during treatment and led to the development of definitions that document response and predict relapse sooner than the former routine methods, which evolved from classical cytogenetics through molecular cytogenetics (FISH) to molecular monitoring assays. This review discusses the laboratory tools used for diagnosing CML, for monitoring during treatment, and for assessing remission or relapse. The advantages and disadvantages of each test, the common definitions of response levels, and the efforts to standardize molecular monitoring for CML patient management are discussed.

July 2012
S. Giryes, E. Leibovitz, Z. Matas, S. Fridman, D. Gavish, B. Shalev, Z. Ziv-Nir, Y. Berlovitz and M. Boaz
Background: Depending on the definition used, malnutrition is prevalent among 20–50% of hospitalized patients. Routine nutritional screening is necessary to identify patients with or at increased risk for malnutrition. The Nutrition Risk Screening (NRS 2002) has been recommended as an efficient tool to identify the risk of malnutrition in adult inpatients.

Objectives: To utilize the NRS 2002 to estimate the prevalence of malnutrition among newly hospitalized adult patients, and to identify risk factors for malnutrition.

Methods: During a 5 week period, all adult patients newly admitted to all inpatient departments (except Maternity and Emergency) at Wolfson Medical Center, Holon, were screened using the NRS 2002. An answer of yes recorded for any of the Step 1 questions triggered the Step 2 screen, on which an age-adjusted total score ≥ 3 indicated high malnutrition risk.

Results: Data were obtained from 504 newly hospitalized adult patients, of whom 159 (31.5%) were identified as high risk for malnutrition. Malnutrition was more prevalent in internal medicine than surgical departments: 38.6% vs. 19.1% (P < 0.001). Body mass index was within the normal range among subjects at high risk for malnutrition: 23.9 ± 5.6 kg/m2 but significantly lower than in subjects at low malnutrition risk: 27.9 ± 5.3 kg/m2 (P < 0.001). Malnutrition risk did not differ by gender or smoking status, but subjects at high malnutrition risk were significantly older (73.3 ± 16.2 vs. 63.4 ± 18.4 years, P < 0.001). Total protein, albumin, total cholesterol, low density lipoprotein-cholesterol, hemoglobin and %lymphocytes were all significantly lower, whereas urea, creatinine and %neutrophils were significantly higher in patients at high malnutrition risk.

Conclusions: Use of the NRS 2002 identified a large proportion of newly hospitalized adults as being at high risk for malnutrition. These findings indicate the need to intervene on a system-wide level during hospitalization.
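The two-step logic of the NRS 2002 described in the Methods can be sketched as below. The component scores (impaired nutritional status 0–3, disease severity 0–3, plus 1 point for age ≥ 70) come from the published NRS 2002 tool rather than from this abstract, so treat them as an assumption:

```python
def nrs2002_high_risk(step1_any_yes, nutrition_score, severity_score, age):
    """Step 1: any 'yes' answer triggers the full Step 2 screen.
    Step 2: impaired-nutrition score (0-3) + disease-severity score (0-3)
    + 1 point if age >= 70; a total of 3 or more flags high risk."""
    if not step1_any_yes:
        return False
    total = nutrition_score + severity_score + (1 if age >= 70 else 0)
    return total >= 3

# Hypothetical 75-year-old with moderately impaired nutritional status.
print(nrs2002_high_risk(True, nutrition_score=2, severity_score=0, age=75))  # True
```

Note how the age adjustment alone can push a borderline patient over the threshold, which is one reason the tool flags so many elderly inpatients.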
G. Yahalom, A. Yagoda, C. Hoffmann, O. Dollberg and N. Gadoth