Search results


December 2007
H.N. Baris, I. Kedar, G.J. Halpern, T. Shohat, N. Magal, M.D. Ludman and M. Shohat

Background: Fanconi anemia complementation group C and Bloom syndrome, rare autosomal recessive disorders marked by chromosome instability, are especially prevalent in the Ashkenazi* Jewish community. A single predominant mutation for each has been reported in Ashkenazi Jews: c.711+4A→T (IVS4 +4 A→T) in FACC[1] and BLMAsh in Bloom syndrome. Individuals affected by either syndrome are characterized by susceptibility to developing malignancies, and we questioned whether heterozygote carriers have a similarly increased risk.

Objectives: To estimate the cancer rate among FACC and BLMAsh carriers and their families over three previous generations in unselected Ashkenazi Jewish individuals.

Methods: We studied 42 FACC carriers, 28 BLMAsh carriers and 43 controls. The control subjects were Ashkenazi Jews participating in our prenatal genetic screening program who tested negative for FACC and BLMAsh. All subjects filled out a questionnaire regarding their own and a three-generation family history of cancer. The prevalence rates of cancer among relatives of FACC, BLMAsh and controls were computed and compared using the chi-square test.

Results: In 463 relatives of FACC carriers, 45 malignancies were reported (9.7%) including 10 breast (2.2%) and 13 colon cancers (2.8%). Among 326 relatives of BLMAsh carriers there were 30 malignancies (9.2%) including 7 breast (2.1%) and 4 colon cancers (1.2%). Controls consisted of 503 family members with 63 reported malignancies (12.5%) including 11 breast (2.2%) and 11 colon cancers (2.2%).
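The group comparison described in the Methods can be illustrated with a small sketch. Below is a minimal pure-Python chi-square test of the overall malignancy counts reported above (45/463, 30/326, 63/503). The counts come from this abstract, but the implementation is a generic textbook chi-square test of independence, not the authors' actual analysis code.

```python
# Chi-square test of independence on the reported malignancy counts
# (a generic sketch, not the authors' original analysis).

def chi_square(observed):
    """observed: list of (cases, total) per group; returns the chi-square statistic."""
    total_cases = sum(c for c, _ in observed)
    total_n = sum(n for _, n in observed)
    stat = 0.0
    for cases, n in observed:
        exp_cases = n * total_cases / total_n   # expected malignancies in this group
        exp_non = n - exp_cases                 # expected unaffected relatives
        stat += (cases - exp_cases) ** 2 / exp_cases
        stat += ((n - cases) - exp_non) ** 2 / exp_non
    return stat

# Relatives of FACC carriers, BLMAsh carriers, and controls (from the Results)
groups = [(45, 463), (30, 326), (63, 503)]
stat = chi_square(groups)
# With 2 degrees of freedom the 5% critical value is 5.99, so a statistic
# of about 3.0 is consistent with the authors' null finding.
print(round(stat, 2))
```

Note that the statistic falls well below the 5% critical value, matching the Conclusions reported for these data.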

Conclusions: We found no significantly increased prevalence of malignancies among carriers over at least three generations compared with controls.






* Jews of East European origin



[1] FACC = Fanconi anemia complementation group C


October 2007
H. Ring, O. Keren, M. Zwecker and A. Dynia

Background: With the development of computer technology and the high-tech electronics industry over the past 30 years, new medical technologies are continually being introduced; however, questions regarding their economic viability should be addressed.

Objectives: To identify the medical technologies that are currently in use in different rehabilitation medicine settings in Israel.

Methods: The TECHNO-R 2005 survey was conducted in two phases. Beginning in 2004, the first phase used a questionnaire with open questions on the different technologies in clinical use, including their purpose, who operates each device (technician, physiotherapist, occupational therapist, physician, etc.), and a description of the treated patients. This questionnaire was sent to 31 rehabilitation medicine facilities in Israel. Because the term “technology” proved difficult to interpret, a revised standardized questionnaire with closed-ended questions specifying diverse technologies was introduced in 2005. Respondents marked which of 15 listed medical technologies were in use in their facility, as well as each technology's purpose, who operates the device, and a description of the treated patients.

Results: Transcutaneous electrical nerve stimulation, the TILT bed, continuous passive movement, and therapeutic ultrasound were the most widely used technologies in rehabilitation medicine facilities. Monitoring of the sitting position in the wheelchair was the least used, reported by only 15.4% of facilities. Most technologies are used primarily for treatment and to a lesser degree for diagnosis and research.

Conclusions: Our study poses a fundamental semantic and conceptual question regarding what kind of technologies are or should be part of the standard equipment of any accredited rehabilitation medicine facility for assessment, treatment and/or research. For this purpose, additional data are needed.
 

R. Small, N. Lubezky and M. Ben-Haim

Surgical resection offers the best opportunity for cure in patients with colorectal cancer metastasis to the liver, with 5-year survival rates of up to 58% following resection. However, only a small percentage of patients are eligible for resection at the time of diagnosis, and the average recurrence rate remains high. Consequently, research endeavors have focused on methods to increase the number of patients eligible for surgical resection, refine the selection criteria for surgery, and improve disease-free and overall survival in these patients. Improvements in imaging techniques and the increasing use of FDG-PET allow more accurate preoperative staging and superior identification of patients likely to benefit from surgical resection. Advances in neoadjuvant chemotherapy allow up to 38% of patients previously considered unresectable to be significantly downstaged and become eligible for hepatic resection. Many reports have critically evaluated the surgical techniques applied to liver resection, the concurrent or alternative use of local ablative therapies such as radiofrequency ablation, and the subsequent use of adjuvant chemotherapy in patients undergoing surgical resection for hepatic metastases.

D. Ergas, A. Abdul-Hai, Z. Sthoeger, B-H. Menahem and R. Miller
D.I. Nassie, A. Volkov, J. Kronenberg and Y.P. Talmi
September 2007
J. Haik, A. Liran, A. Tessone, A. Givon, A. Orenstein and K. Peleg

Background: Burns are a major public health problem, with long hospitalization stay in both intensive care units and general wards. In Israel about 5% of all hospitalized injuries are burn injuries. There are no long-term epidemiological studies on burn injuries in adults in Israel.

Objectives: To identify risk factors for burn injuries and provide a starting point for the establishment of an effective prevention plan.

Methods: We analyzed the demographic, etiologic and clinical data of 5000 burn patients admitted to the five major hospitals with burn units in Israel during a 7-year period (1997–2003). Data were obtained from the records of the Israeli National Trauma Registry. Differences between groups were evaluated using the chi-square test.

Results: Males outnumbered females roughly two to one among burn patients (68.0% vs. 31.9%), and Jewish ethnicity was more common than non-Jewish (62.3% vs. 36.8%). Second- and third-degree burns covering less than 10% of the body surface area constituted the largest group (around 50%). The largest age group was 0–1 years, constituting 22.2% of the cases. Inhalation injury was uncommon (1.9%). The most common etiologies were hot liquids (45.8%) and open fire (27.5%). Children less than 10 years old were burnt mainly by hot liquids, while the main cause of burns in adults over 20 years old was an open flame. The majority of burns occurred at home (58%); around 15% were work related. The mean duration of hospitalization was 13.7 days (SD 17.7); 15.5% of patients were treated in an intensive care unit, with a mean stay of 12.1 days (SD 17.1). Surgical procedures became more common over the study period (from 13.4% in 1998 to 26.59% in 2002, average 19.8%). The overall mortality rate was 4.4%. We found a strong association between mortality and both burn degree and total body surface area (0.25% mortality for 2nd- to 3rd-degree burns with less than 10% TBSA[1], 5.4% for 2nd- to 3rd-degree burns with 20–39% TBSA, and 96.6% for burns > 90% TBSA). The worst prognosis was for those over the age of 70 (mortality rate 35.3%) and the best prognosis was for the 0–1 year group (survival rate 99.6%).

Conclusions: The groups at highest risk were children 0–1 years old, males and non-Jews (the incidence rate among non-Jews was 1.5 times higher than their share in the general population). Those with the highest mortality rate were victims of burns > 90% TBSA and patients older than 70. Most burns occurred at home.






[1] TBSA = total body surface area


July 2007
O. Tavor, M. Shohat and S. Lipitz

Background: The measurement of maternal serum human chorionic gonadotropin as a predictor of fetuses with Down syndrome has been in use since 1987.

Objectives: To determine the correlation between extremely high levels of hCG[1] at mid-gestation and maternal and fetal complications.

Methods: The study group consisted of 75 pregnant women with isolated high levels of hCG (> 4 MOM, multiples of the median) at mid-gestation, and the control group comprised 75 randomly selected women with normal hCG levels (as well as normal alpha-fetoprotein and unconjugated estriol levels). The data collected included demographic details, fetal anomalies, chromosomal aberrations, pregnancy complications, and results of neonatal tests.
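As background for the > 4 MOM inclusion threshold, a MoM value is simply the measured analyte level divided by the median level for the same gestational age. The sketch below illustrates the computation; the cutoff of 4 comes from this abstract, but the reference medians and function names are hypothetical placeholders, not published values.

```python
# Illustrative MoM (multiples of the median) computation.
# The gestational-age medians below are hypothetical placeholders,
# not published reference values.

HCG_MEDIAN_BY_WEEK = {16: 25000.0, 17: 22000.0, 18: 20000.0}  # IU/L, hypothetical

def hcg_mom(level_iu_l, gestational_week):
    """Return the hCG level expressed as multiples of the gestational-age median."""
    return level_iu_l / HCG_MEDIAN_BY_WEEK[gestational_week]

def isolated_high_hcg(level_iu_l, gestational_week, cutoff=4.0):
    """Flag values above the study's > 4 MoM inclusion threshold."""
    return hcg_mom(level_iu_l, gestational_week) > cutoff

print(hcg_mom(100000.0, 18))            # 5.0 MoM
print(isolated_high_hcg(100000.0, 18))  # True
print(isolated_high_hcg(60000.0, 18))   # 3.0 MoM -> False
```

Expressing levels as MoM normalizes for the steep change in median hCG across gestation, which is why screening cutoffs are stated in MoM rather than absolute units.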

Results: There was a significant increase in the frequency of fetal anomalies (detected by ultrasound), low birth weight and neonatal complications in the study group. We also found an increased rate of fetal/neonatal loss proportional to the increasing levels of hCG (up to 30% in levels exceeding 7 MOM).

Conclusion: Our study demonstrated an increased frequency of obstetric complications closely associated with raised hCG levels. It also raises questions about the accuracy of the Down syndrome probability equation in the presence of extremely high hCG levels, where data on the frequency of Down syndrome are severely limited.






[1] hCG = human chorionic gonadotropin


D. Lotan, G. Yoskovitz, L. Bisceglia, L. Gerad, H. Reznik-Wolf and E. Pras

Background: Cystinuria is an autosomal recessive disease that is manifested by kidney stones and is caused by mutations in two genes: SLC3A1 on chromosome 2p and SLC7A9 on chromosome 19q. Urinary cystine levels in obligate carriers are often, but not always, helpful in identifying the causative gene.

Objectives: To characterize the clinical features and analyze the genetic basis of cystinuria in an inbred Moslem Arab Israeli family.

Methods: Family members were evaluated for urinary cystine and amino acid levels. DNA was initially analyzed with polymorphic markers close to the two genes, and SLC7A9 was then fully sequenced.

Results: Full segregation was found with the marker close to SLC7A9. Sequencing of this gene revealed a missense mutation, P482L, in the homozygous state in all three affected sibs.

Conclusions: A combination of urinary cystine levels in obligate carriers, segregation analysis with polymorphic markers, and sequencing can save time and resources in the search for cystinuria mutations.
 

O. Scheuerman, L. de Beaucoudrey, V. Hoffer, J. Feinberg, J.L. Casanova and B.Z. Garty
June 2007
T. Handzel, V. Barak, Y. Altman, H. Bibi, M. Lidgi, M. Iancovici-Kidon, D. Yassky, M. Raz

Background: The global spread of tuberculosis necessitates the development of an effective vaccine and new treatment modalities. This requires a better understanding of the differences in regulation of the immune responses to Mycobacterium tuberculosis between individuals who are susceptible or resistant to the infection. Previous immune studies in young Ethiopian immigrants to Israel did not demonstrate anergy to purified protein derivative or a Th2-like cytokine profile.

Objectives: To evaluate the profile of Th1 and Th2 cytokine production in immigrant TB patients, in comparison with asymptomatic control subjects.

Methods: Part 1 of the present study included 39 patients with acute TB[1] (group 1), 34 patients with chronic relapsing TB (group 2), 39 Mantoux-positive asymptomatic TB contacts (group 3), and 21 Mantoux-negative asymptomatic controls (group 4). Patients were mainly immigrants from Eastern Europe and Ethiopia. Levels of interferon-gamma, interleukin 2 receptor, IL-6[2] and IL-10 were measured in serum and in non-stimulated and PPD[3]-stimulated peripheral blood mononuclear cell culture supernatants, using commercial ELISA kits. In part 2, levels of IFNγ[4] and IL-12p40 were evaluated in 31 immigrant Ethiopian patients and 58 contact family members.

Results: Patients with acute disease tended to secrete more cytokines than contacts, and contacts more than chronic patients and controls, without a specific bias. None of the patients showed in vitro anergy. Discriminant probability analysis showed that, of the 12 available parameters, a cluster of 6 (IFNγ-SER[5], IFNγ-PPD, IL-2R[6]-SER, IL-10-SER, IL-10-NS[7] and IL-6-PPD) predicted an 84% probability of becoming a TB contact upon exposure, 71% of becoming a chronic TB patient, and 61% of becoming an acute TB patient. Family-specific patterns of IFNγ were demonstrated in the second part of the study.

Conclusions: Firstly, no deficiency in cytokine production was demonstrated in TB patients. Secondly, acute TB patients secreted more cytokines than contacts, and contacts more than unexposed controls; thus, neither anergy nor cytokine dysregulation explains susceptibility to acute TB disease in our cohort, although chronic TB patients produced fewer cytokines than acute patients and than asymptomatic contacts. Thirdly, a certain cytokine configuration may predict a trend of susceptibility to acquire, or not acquire, clinical TB; it is presently unclear whether this finding may explain disease spread in large populations. Finally, the familial association of IFNγ secretion levels probably points towards genetic regulation of the immune response to Mycobacterium tuberculosis.

 






[1] TB = tuberculosis

[2] IL = interleukin

[3] PPD = purified protein derivative

[4] IFNγ = interferon-gamma

[5] SER = serum

[6] IL-2R = interleukin 2 receptor

[7] NS = non-stimulated


May 2007
I. Gotsman, A. Meirovitz, N. Meizlish, M. Gotsman, C. Lotan and D. Gilon

Background: Infective endocarditis is a common disease with significant morbidity and mortality.

Objectives: To define clinical and echocardiographic parameters predicting morbidity and in-hospital mortality in patients with infective endocarditis hospitalized in a tertiary hospital from 1991 to 2000.

Methods: All patients with definite infective endocarditis diagnosed according to the Duke criteria were included. We examined relevant clinical features that might influence outcome.

Results: The study group comprised 100 consecutive patients, 77 with native valve and 23 with prosthetic valve endocarditis. The overall in-hospital mortality rate was 8%. There was a higher mortality in the PVE[1] group compared to the NVE[2] group (13% vs. 7%, P = 0.07). The mortality rate in each group, with or without surgery, was not significantly different. Clinical predictors of mortality were older age and hospital-acquired endocarditis. The presence of vegetations and their size were significant predictors of major embolic events and mortality. Staphylococcus aureus was a predictor of mortality (25% vs. 5%, P < 0.005) and abscess formation. Multivariate logistic analysis identified vegetation size and S. aureus as independent predictors of mortality.

Conclusions: Mortality is higher in older hospitalized patients. S. aureus is associated with a poor outcome. Vegetation size is an independent predictor of embolic events and of a higher mortality.







[1] PVE = prosthetic valve endocarditis

[2] NVE = native valve endocarditis


R. Grossman, Z. Ram, A. Perel, Y. Yusim, R. Zaslansky and H. Berkenstadt

Background: Pain following brain surgery is a significant problem. Infiltration of the scalp with local intradermal anesthetics has been suggested for postoperative pain control but has been assessed only in the first hour postoperatively.


Objectives: To evaluate wound infiltration with a single dose of metamizol (dipyrone) for postoperative pain control in patients undergoing awake craniotomy.


Methods: This open, prospective, non-randomized observational study, conducted in anesthesiology and neurosurgical departments of a teaching hospital, included 40 patients undergoing awake craniotomy for the removal of brain tumor. Intraoperative anesthesia included wound infiltration with lidocaine and bupivacaine, conscious sedation using remifentanil and propofol, and a single dose of metamizol (dipyrone) for postoperative pain control. Outcome was assessed by the Numerical Pain Scale on arrival at the postoperative care unit, and 2, 4 and 12 hours after the end of surgery.


Results: On arrival at the postoperative care unit, patients reported NPS[1] scores of 1.2 ± 1.1 on a scale of 0–10 (mean ± SD) (median = 1, range 0–4). The scores were 0.8 ± 0.9, 0.9 ± 0.9, and 1 ± 0.9 at 2, 4, and 12 hours after the end of surgery, respectively. Based on patients' complaints and an NPS lower than 3, 27 patients did not require any supplementary analgesia during the first 12 postoperative hours, 11 patients required a single dose of oral metamizol or intramuscular diclofenac, one patient was given 2 mg of intravenous morphine, and one patient required two separate doses of metamizol.

Conclusions: Although the clinical setup precludes a placebo local analgesia control group, the results suggest a possible role for local intradermal infiltration of the scalp combined with a single dose of metamizol in controlling postoperative pain in patients undergoing craniotomy.







[1] NPS = Numerical Pain Scale

