May 2012
O. Wolf, M. Westreich and A. Shalom

Background: There are two main approaches to breast reduction surgery today: the traditional long scar ("Wise-pattern") technique and the more recent short ("vertical") scar technique, which is becoming more popular. During the last two decades there has been a gradual shift between the two techniques, including in our institute.

Objectives: To evaluate the evidence behind this obvious trend.

Methods: We retrospectively collected data from archived hospital charts of all patients who underwent breast reduction surgery during the period 1995–2007. Epidemiological, clinical and postoperative data were analyzed and compared between patients who were operated on by means of the short scar vs. the long scar techniques.

Results: During the study period 91 patients underwent breast reduction surgery in our department: 34 with the Wise-pattern breast reduction technique and 57 with the short-scar procedure. There was no significant difference in operative and postoperative data, including length of hospital stay. In some of the categories there was even a slight (though not statistically significant) advantage to the Wise-pattern technique. The only significant difference was the size of reduction, with a tendency to prefer the long scar technique for larger reductions; however, with gained experience the limit for short scar reductions was gradually extended to a maximum of 1470 g.

Conclusions: We noticed a sharp increase in the safe and uneventful practice of the short scar technique in breast reduction in our institute for removing ≤ 1400 g, especially in young women without extreme ptosis. This observation, together with other advantages, namely reduced scar length, prolonged shape preservation and better breast projection, supports use of this technique.
 

September 2011
I.N. Kochin, T.A. Miloh, R. Arnon, K.R. Iyer, F.J. Suchy and N. Kerkar

Background: Primary liver masses in children may require intervention because of symptoms or concern about malignant transformation.

Objectives: To review the management and outcomes of benign liver masses in children.

Methods: We conducted a retrospective chart review of children with liver masses referred to our institution during the period 1997–2009.

Results: Benign liver masses were identified in 53 children. Sixteen of these children (30%) had hemangioma/infantile hepatic hemangioendothelioma (IHH) and 15 (28%) had focal nodular hyperplasia. The remainder had 6 cysts, 4 hamartomas, 3 nodular regenerative hyperplasia, 2 adenomas, 2 vascular malformations, and one each of polyarteritis nodosa, granuloma, hepatic hematoma, lymphangioma, and infarction. Median age at presentation was 6 years, and 30 (57%) were female. Masses were initially noticed on imaging studies performed for unrelated symptoms in 33 children (62%), laboratory abnormalities consistent with liver disease in 11 (21%), and palpable abdominal masses in 9 (17%). Diagnosis was made based on characteristic radiographic findings in 31 (58%), but histopathological examination was required for the remaining 22 (42%). Of the 53 children, 27 (51%) were under observation while 17 (32%) had masses resected. Medications targeting masses were used in 9 (17%) and liver transplantation was performed in 4 (8%). The only death (2%) occurred in a child with multifocal IHH unresponsive to medical management and prior to liver transplant availability.

Conclusions: IHH and focal nodular hyperplasia were the most common lesions. The majority of benign lesions were found incidentally and diagnosed radiologically. Expectant management was sufficient in most children after diagnosis, although surgical intervention including liver transplant was occasionally necessary.
 

October 2010
A. Blatt, R. Svirski, G. Morawsky, N. Uriel, O. Neeman, D. Sherman, Z. Vered and R. Krakover

Background: Little is known of the outcome of pregnant patients with previously diagnosed dilated cardiomyopathy. These patients are usually firmly advised against continuation of the pregnancy.

Objectives: To examine the usefulness of serial echocardiographic follow-up and plasma N-terminal pro-B type natriuretic peptide levels in the management of pregnant women with preexisting DCM[1].

Methods: We prospectively enrolled pregnant women with DCM either known or diagnosed in the first trimester. Clinical examination and serial echocardiography studies at baseline, 30 weeks gestation, peripartum, and 3 and 18 months postpartum were performed. Blinded NT-proBNP[2] levels were obtained at 30 weeks, delivery and 3 months postpartum.

Results: Between June 2005 and October 2006 we enrolled seven women who fulfilled the study criteria. Delivery and postpartum were complicated in 3 patients (42%): 2 with acute heart failure, which resolved conservatively, and 1 with major pulmonary embolism. The left ventricular ejection fraction was stable throughout the pregnancy (35% ± 2.8 at baseline, 33% ± 2.9 at 30 weeks) and postpartum (35% ± 2.8 at 1 day, 34% ± 3.1 at 90 days). Similar stable behavior was observed regarding left ventricular dimensions: LV[3] end-systolic diameter 43.3 ± 2.7 mm and LV end-diastolic diameter 57.3 ± 3.3 mm at baseline compared with 44.1 ± 3.1 mm and 58.7 ± 3.1 mm postpartum, respectively. The NT-proBNP levels rose significantly peripartum in all three patients with complications.

Conclusions: Serial NT-proBNP levels, as compared to echocardiography, may be a better clinical tool for the monitoring and management of pregnant women with preexisting DCM. An early rise in NT-proBNP level appears to predict the occurrence of adverse events.






[1] DCM = dilated cardiomyopathy

[2] NT-proBNP = N-terminal pro-B type natriuretic peptide

[3] LV = left ventricular


July 2010
M. Haddad, G. Rubin, M. Soudry and N. Rozen

Background: There is controversy as to which is the preferred treatment for distal radius intra-articular fractures – anatomic reduction or external fixation.

Objectives: To evaluate the radiologic and functional outcome following external fixation of these fractures.

Methods: Between January 2003 and March 2005, 43 patients with distal radius intra-articular fractures were treated using a mini-external AO device. Follow-up of 38 of the patients included X-rays at 1 week, 6 weeks and 6 months postoperatively. The Visual Analogue Scale was used to assess pain levels, and the Lidstrom criteria scale to evaluate functional outcome and wrist motion. Clinical and radiographic results were correlated.

Results: According to the Lidstrom criteria, the results were excellent in 31%, good in 61% and fair in 5.5%; 2.5% had a poor outcome. The results of the VAS[1] were good. Thirty-five patients gained a good range of wrist movement, but 3 had a markedly reduced range. We found a statistically significant correlation between the radiographic and clinical results, emphasizing the value of good reduction. There was no correlation between fracture type (Frykman score) and radiologic or clinical results.

Conclusions: External fixation seems to be the preferred method of treatment for distal radius intra-articular fractures, assuming that good reduction can be achieved. The procedure is also quick, the risk of infection is small, and there is little damage to the surrounding tissues.

 






[1] VAS = Visual Analogue Scale


April 2009
Shlomo Cohen-Katan, B Med Sc, Nitza Newman-Heiman, MD, Orna Staretz-Chacham, MD, Zahavi Cohen, MD, Lily Neumann, PhD and Eilon Shany, MD.

Background: Despite progress in medical and surgical care the mortality rate of congenital diaphragmatic hernia remains high. Assessment of short-term outcome is important for comparison between different medical centers.

Objectives: To evaluate the short-term outcome of infants born with symptomatic CDH[1] and to correlate demographic and clinical parameters with short-term outcome.

Methods: We performed a retrospective cohort study in which demographic, obstetric and perinatal characteristics were extracted from infants' files. Categorical variables were compared using the chi-square test and Fisher's exact test, and continuous variables were compared between categories using the Mann-Whitney test. Sensitivity and specificity were estimated by receiver operating characteristic curve.

Results: The study group comprised 54 infants with CDH, of whom 20 (37%) survived the neonatal period. Demographic characteristics were not associated with survival. Regarding antenatal characteristics, absence of polyhydramnios and postnatal diagnosis were correlated with better survival. Apgar scores (above 5 at 1 minute and 7 at 5 minutes), first arterial pH after delivery (above 7.135) and presence of pulmonary hypertension were significantly correlated with survival. Also, infants surviving up to 6 days were 10.71 times more likely to survive the neonatal period.

Conclusions: The survival rate of symptomatic newborns with CDH at our center was 37% for the period 1988–2006. Prenatal diagnosis, Apgar score at 5 minutes and first pH after delivery were found to be the most significant predictors of survival. Prospective work is needed to evaluate the long-term outcome of infants with CDH.






*This work was part of the MD thesis of Shlomo Cohen-Katan

[1] CDH = congenital diaphragmatic hernia


April 2008
A. Vivante, N. Hirshoren, T. Shochat and D. Merkel

Background: Iron deficiency and lead poisoning are common and are often associated. This association has been suggested previously, mainly by retrospective cross-sectional studies.

Objective: To assess the impact of short-term lead exposure at indoor firing ranges, and its relationship to iron, ferritin, lead, zinc protoporphyrin, and hemoglobin concentrations in young adults.

Methods: We conducted a clinical study in 30 young healthy soldiers serving in the Israel Defense Forces. Blood samples were drawn for lead, zinc protoporphyrin, iron, hemoglobin and ferritin prior to and after a 6 week period of intensive target practice in indoor firing ranges.

Results: After a 6 week period of exposure to lead dust, a mean blood lead level increase (P < 0.0001) and a mean iron (P < 0.0005) and mean ferritin (P < 0.0625) decrease occurred simultaneously. We found a trend for inverse correlation between pre-exposure low ferritin levels and post-exposure high blood lead levels.

Conclusions: The decrease in iron and ferritin levels after short-term lead exposure can be attributed to competition between iron and lead absorption via divalent metal transporter 1, suggesting that lead poisoning can cause iron depletion and that iron depletion can aggravate lead poisoning. This synergistic effect should come readily to every physician's mind when treating patients at potential risk for either problem.
 

September 2007
J. Baron, D. Greenberg, Z. Shorer, E. Herskhovitz, R. Melamed and M. Lifshitz

December 2006
A. Nemets, I. Isakov, M. Huerta, Y. Barshai, S. Oren and G. Lugassy
Background: Thrombosis is a major cause of morbidity and mortality in polycythemia vera. Hypercoagulability is principally due to hyperviscosity of the whole blood, an exponential function of the hematocrit. PV[1] is also associated with endothelial dysfunction that can predispose to arterial disease. Reduction of the red cell mass to a safe level by phlebotomy is the first principle of therapy in PV. This therapy may have some effect on the arterial compliance in PV patients.

Objectives: To estimate the influence of phlebotomies on large artery (C1) and small artery compliance (C2) in PV patients by using non-invasive methods.

Methods: Short-term hemodynamic effects of phlebotomy were studied by pulse wave analysis using the HDI-Pulse Wave CR2000 (Minneapolis, MN, USA) before and immediately after venesection (300–500 ml of blood). We repeated the evaluation after 1 month to measure the long-term effects.

Results: Seventeen PV patients were included in the study and 47 measurements of arterial compliance were performed: 37 for short-term effects and 10 for long-term effects. The mean large artery compliance (C1) was 12.0 ml/mmHg × 10 (range 4.5–28.6) before phlebotomy and 12.6 ml/mmHg × 10 (range 5.2–20.1) immediately after phlebotomy (not significant). The mean small artery compliance (C2) before and immediately after phlebotomy was 4.4 ml/mmHg × 10 (range 1.2–14.3) and 5.5 ml/mmHg × 10 (range 1.2–15.6) respectively (ΔC2 = 1.1, P < 0.001). No difference in these parameters could be demonstrated in the long-term arm.

Conclusions: Phlebotomy immediately improves arterial compliance in small vessels of PV patients, but this effect is short lived.


 





[1] PV = polycythemia vera


June 2006
A. Glick, Y. Michowitz, G. Keren and J. George
Background: Cardiac resynchronization therapy is a modality with proven morbidity and mortality benefit in advanced systolic heart failure. Nevertheless, not all patients respond favorably to CRT[1]. Natriuretic peptides and inflammatory markers are elevated in congestive heart failure and reflect disease severity.

Objectives: To test whether an early change in neurohormonal and inflammatory markers after implantation can predict the clinical response to CRT.

Methods: The study group included 32 patients with advanced symptomatic systolic heart failure and a prolonged QRS complex who were assigned to undergo CRT. Baseline plasma levels of B-type natriuretic peptide and high sensitivity C-reactive protein were determined in the peripheral venous blood and coronary sinus. Post-implantation levels were determined 2 weeks post-procedure in the PVB[2]. Baseline levels and their change at 2 weeks were correlated with all-cause mortality and hospitalization for congestive heart failure.

Results: At baseline, coronary sinus levels of BNP[3] but not hsCRP[4] were significantly elevated compared to the PVB. Compared to baseline levels, BNP and hsCRP decreased significantly within 2 weeks after the implantation (BNP mean difference 229.1 ± 102.5 pg/ml, 95% confidence interval 24.2–434, P < 0.0001; hsCRP mean difference 5.2 ± 2.4 mg/dl, 95% CI[5] 0.3–10.1, P = 0.001). During a mean follow-up of 17.7 ± 8.2 months, 6 patients died (18.7%) and 12 (37.5%) were hospitalized due to exacerbation of CHF[6]. Baseline New York Heart Association class and CS[7] BNP levels predicted CHF-related hospitalizations. Neither hsCRP levels nor their change over 2 weeks predicted all-cause mortality or hospitalizations.

Conclusions: BNP levels in the CS and peripheral venous blood during biventricular implantation and 2 weeks afterwards predict clinical response and may guide patient management.


 





[1] CRT = cardiac resynchronization therapy

[2] PVB = peripheral venous blood

[3] BNP = B-type natriuretic peptide

[4] hsCRP = high sensitivity C-reactive protein

[5] CI = confidence interval

[6] CHF = congestive heart failure

[7] CS = coronary sinus


June 2005
R. Ben-Ami, Y. Siegman-Igra, E. Anis, G.J. Brook, S. Pitlik, M. Dan and M. Giladi
Background: Short trips to holiday resorts in Mombasa, Kenya, have gained popularity among Israelis since the early 1990s. A cluster of cases of malaria among returned travelers raised concern that preventive measures were being neglected.

Objectives: To characterize the demographic and clinical features of malaria acquired in Kenya, and to assess the adequacy of preventive measures.

Methods: Data were collected from investigation forms at the Ministry of Health. All persons who acquired malaria in Kenya during the years 1999–2001 were contacted by phone and questioned about use of chemoprophylaxis, attitudes towards malaria prevention, and disease course. Further information was extracted from hospital records.

Results: Kenya accounted for 30 of 169 (18%) cases of malaria imported to Israel, and was the leading source of malaria in the study period. Of 30 malaria cases imported from Kenya, 29 occurred after short (1–2 weeks) travel to holiday resorts in Mombasa. Average patient age was 43 ± 12 years, which is older than average for travelers to tropical countries. Only 10% of the patients were fully compliant with malaria chemoprophylaxis. The most common reason for non-compliance was the belief that short travel to a holiday resort carries a negligible risk of malaria. Only 3 of 13 patients (23%) who consulted their primary physician about post-travel fever were correctly diagnosed with malaria. Twenty percent of cases were severe enough to warrant admission to an intensive care unit; one case was fatal.

Conclusions: Measures aimed at preventing malaria and its severe sequelae among travelers should concentrate on increasing awareness of risks and compliance with malaria chemoprophylaxis.

March 2003
I. Sukhotnik, L. Siplovich, M.M. Krausz and E. Shiloni

Intestinal adaptation is the term applied to progressive recovery from intestinal failure following a loss of intestinal length. The regulation of intestinal adaptation is maintained through a complex interaction of many different factors. These include nutrients and other luminal constituents, hormones, and peptide growth factors. The current paper discusses the role of peptide growth factors in intestinal adaptation following massive small bowel resection. This review focuses on the mechanisms of action of peptide growth factors in intestinal cell proliferation, and summarizes the effects of these factors on intestinal regrowth in an animal model of short bowel syndrome.
