Aim: The aim of this study was to quantitatively assess the density of the segmental branches of the middle cerebral artery (MCA) as a predictor of acute ischemic stroke in patients without definitive infarct findings in the cerebral parenchyma on non-contrast computed tomography (CT).
Clinical rationale for the study: The clinical rationale for the study is to evaluate whether measurement of the Sylvian fissure dot sign (SDS) would aid the early management of patients with stroke in the emergency department.
Methods: Computed tomography scans of 101 patients admitted to the emergency department with stroke symptoms and/or signs were retrospectively included in the study. In the patient group, the quantitative density of the segmental branches of the MCA in the Sylvian fissure was measured on both the affected side and the contralateral side.
Results: Quantitative density of SDS was significantly higher on the ischemic side of the brain. Receiver operating characteristic (ROC) analysis showed a cut-off value of 38.5 Hounsfield units (HU) as a predictor for acute ischemic stroke, with a sensitivity and specificity of 79% and 92%, respectively.
Conclusion: The quantitative density of the SDS on the affected side in patients without definitive parenchymal infarct findings can be used in the emergency room as an objective predictive sign for the diagnosis of acute ischemic stroke. Considering this finding in the differential diagnosis of acute stroke patients in the emergency room has the potential to improve clinical management, particularly for patients without early parenchymal and vascular signs of stroke.
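As a minimal illustration of how a Hounsfield-unit (HU) cut-off such as the 38.5 HU reported above yields a sensitivity and specificity, the sketch below classifies hypothetical SDS density measurements against a threshold and tallies the confusion-matrix cells. The measurement values, labels, and function name are illustrative assumptions, not the study's data.

```python
def sensitivity_specificity(densities_hu, labels, cutoff):
    """Classify each measurement as 'ischemic' when density exceeds the cutoff.

    densities_hu: list of SDS density measurements in Hounsfield units
    labels: list of booleans, True = confirmed acute ischemic stroke
    """
    tp = sum(1 for d, y in zip(densities_hu, labels) if d > cutoff and y)
    fn = sum(1 for d, y in zip(densities_hu, labels) if d <= cutoff and y)
    tn = sum(1 for d, y in zip(densities_hu, labels) if d <= cutoff and not y)
    fp = sum(1 for d, y in zip(densities_hu, labels) if d > cutoff and not y)
    sensitivity = tp / (tp + fn)  # true-positive rate
    specificity = tn / (tn + fp)  # true-negative rate
    return sensitivity, specificity

# Hypothetical measurements for illustration only:
hu = [42.0, 37.0, 36.5, 44.2, 33.0, 40.8, 35.2, 39.5]
stroke = [True, True, False, True, False, True, False, False]
sens, spec = sensitivity_specificity(hu, stroke, cutoff=38.5)
```

An ROC analysis repeats this calculation over a range of cut-offs and selects the one that best balances the two rates.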
Experimental pain stimuli can be used to simulate patients’ pain experience. We review recent developments in psychophysical pain testing, focusing on the application of the dynamic tests—conditioned pain modulation (CPM) and temporal summation (TS). Typically, patients with clinical pain of various types express either less efficient CPM or enhanced TS, or both. These tests can be used to predict the incidence of pain and its intensity, as well as to assist in the correct choice of analgesic agents for individual patients. This can help to shorten the commonly long and frustrating process of adjusting analgesic agents to individual patients. We propose that evaluating pain modulation can serve as a step forward in individualizing pain medicine.
Effects of early life psychosocial adversity have received a great deal of attention, such as maternal separation in experimental animal models and abuse/neglect in young humans. More recently, long-term effects of the physical stress of repetitive procedural pain have begun to be addressed in infants hospitalized in neonatal intensive care. Preterm infants are more sensitive to pain and stress, which cannot be distinguished in neonates. The focus of this review is clinical studies of long-term effects of repeated procedural pain-related stress in the neonatal intensive care unit (NICU) in relation to brain development, neurodevelopment, programming of stress systems, and later pain sensitivity in infants born very preterm (24–32 weeks’ gestational age). Neonatal pain exposure has been quantified as the number of invasive and/or skin-breaking procedures during hospitalization in the NICU. Emerging studies provide convincing clinical evidence for an adverse impact of neonatal pain/stress in infants at a time of physiological immaturity, rapidly developing brain microstructure and networks, as well as programming of the hypothalamic-pituitary-adrenal axis. Currently it appears that early pain/stress may influence the developing brain and thereby neurodevelopment and stress-sensitive behaviors, particularly in the most immature neonates. However, there is no evidence for greater prevalence of pain syndromes compared to children and adults born healthy at full term. In addressing associations between pain/stress and outcomes, careful consideration of confounding clinical factors related to prematurity is essential. The need for pain management for humanitarian care is widely advocated. Non-pharmacological interventions to help parents reduce their infant’s stress may be brain-protective.
This review explores the potential overlap between the fields of nutrition and therapeutic humor, together with the role of humor as a possible tool for aiding those in whom emotions, particularly negative ones, trigger eating as a means to improve mood. We review emotional eating, obesity, and the hypothesized mechanisms of emotional eating. We then review the field of therapeutic humor and its ability to de-stress individuals, possibly through endorphin and opioid systems, both of which are also involved in eating behavior. Finally, we present a novel hypothesis that people may be trained to use humor as a “food substitute” at best, or to blunt hunger stimuli, to achieve similar advantages, without the side effect of weight gain.
Transoral laser microsurgery (TLM) was pioneered in the early 1970s as an approach to treat laryngeal pathology with precision and minimal thermal damage to the vocal cords. Over the last four decades, TLM has become an integral part of the treatment paradigm for patients with laryngeal cancer. TLM is one of the primary treatment options for early-stage laryngeal tumors. However, in recent years, surgeons have begun to develop TLM into a more versatile approach which can be used to address advanced laryngeal tumors. Although data on functional outcomes following TLM for advanced laryngeal disease are scarce, survival outcomes appear to be comparable with those reported for organ preservation strategies employing external beam radiation therapy (EBRT) and chemotherapy. In addition, TLM plays an important role in the setting of recurrent laryngeal cancer following primary irradiation. TLM has been demonstrated to decrease the need for salvage total laryngectomy, resulting in improved function while retaining comparable oncologic outcomes. The aims of this review are to elucidate the indications, techniques, and oncological outcomes of TLM for advanced laryngeal cancers.
Background. Spermatocytic seminoma is a rare testicular malignancy, appearing in the adult population. It has a good prognosis and a low rate of metastatic potential.
Objectives. We present five cases diagnosed and treated with radiotherapy at Rambam Health Care Campus in Haifa, Israel.
Methods. Between 1974 and 1996, five patients with stage I spermatocytic seminoma were referred post-orchiectomy to the Northern Israel Oncology Center. All five patients presented with the typical pathological features of the spermatocytic variant of classic seminoma, and all were staged clinically and radiologically.
Results. Mean age at diagnosis was 44 years (range 30–58 years). Main symptoms included a palpable testicular mass and/or testicular enlargement. Mean duration of symptoms was 9 months (range 0.5–24 months). Three patients were irradiated to the para-aortic/ipsilateral iliac lymph nodes (mean total dose 2,500 cGy); one patient received 4,000 cGy. One patient was irradiated to the bilateral iliac lymph nodes (2,600 cGy). With a median follow-up of 15 years, four patients are alive with no evidence of disease or severe late side effects. One patient developed severe lymphedema and symptomatic peripheral vascular disease, stage IIA prostate carcinoma (treated with hormonal therapy and brachytherapy), and a non-secretory hypophyseal adenoma (surgically removed); he died at the age of 75 due to severe peripheral vascular and coronary heart disease with no evidence of his first or second primaries.
Conclusions. Prognosis is excellent and does not differ from classic seminoma. As in the accumulated experience in early-stage, low-risk classic seminoma, we suggest surveillance as the preferred policy.
Venous thromboembolism (VTE), the third most frequent acute cardiovascular syndrome, may cause life-threatening complications and imposes a substantial socio-economic burden. During the past years, several landmark trials paved the way towards novel strategies in acute and long-term management of patients with acute pulmonary embolism (PE). Risk stratification is increasingly recognized as a central cornerstone for an adequate diagnostic and therapeutic management of the highly heterogeneous population of patients with acute PE. Recently published European Guidelines emphasize the importance of clinical prediction rules in combination with imaging procedures (assessment of right ventricular function) and laboratory biomarkers (indicative of myocardial stress or injury) for identification of normotensive PE patients at intermediate risk for an adverse short-term outcome. In this patient group, systemic full-dose thrombolysis was associated with a significantly increased risk of intracranial bleeding, a complication which discourages its clinical application unless hemodynamic decompensation occurs. A large-scale clinical trial program evaluating new oral anticoagulants in the initial and long-term treatment of venous thromboembolism showed at least comparable efficacy and presumably increased safety of these drugs compared to the current standard treatment. Research is continuing on catheter-directed, ultrasound-assisted, local, low-dose thrombolysis in the management of intermediate-risk PE.
The expanding impact of chronic kidney disease (CKD) due to pandemic diabetes mellitus is recounted, emphasizing the epidemiology that has imposed global socioeconomic stress on health care systems in industrialized nations now attempting to provide optimal therapy for end-stage renal disease (ESRD). Strategies to delay and perhaps prevent progression of diabetic nephropathy from minimal proteinuria through nephrotic-range proteinuria and azotemia to ESRD appear to have decreased the rate at which persons with diabetes develop ESRD. For those with ESRD attributed to diabetes, kidney transplantation affords better survival and rehabilitation than either hemodialysis or peritoneal dialysis. It is likely that advances in genetics and molecular biology will suggest early interventions that will preempt diabetic complications, including renal failure.
All possible pro and con arguments regarding the theory of evolution have been discussed and debated in the vast literature—scientific, religious, and lay—over the past 150 years. There is usually great zealotry in all debating parties, with mutual intolerance of ideas and concepts, disrespect toward opposing opinions and positions, and use of very harsh language. This prejudiced approach usually does not allow for a reasonable debate. It is important to look at the facts, assumptions, and beliefs of the theory of evolution in a calmer and more humble way. In this article a comparative analysis is offered between the scientific aspects of the theory of evolution and a Judaic approach to these aspects. The two sets of human thought—religion and science—are fundamentally different in their aims and purposes, in their methods of operation, in their scope of interest and issues, and in their origins and ramifications. Whenever science surpasses its limits, or religion exceeds its boundaries, it is actually a form of abuse of both.
This has happened to the theory of evolution in a more powerful mode than in any other interaction between science and religion. The agenda of many scientists who promote the theory of evolution is to achieve the goal of understanding the existence of the universe as a random, purposeless, natural development, evolved slowly over billions of years from a common ancestor by way of natural selection, devoid of any supernatural metaphysical power. Jewish faith perceives the development of the universe in a different way: God created the world, with a purpose known to Him; He established natural laws that govern the world; and He imposed a moral-religious set of requirements upon Man. The discussion and comparative analysis in this article is based upon the current neo-Darwinian theory, although it seems almost certain that even the new and modern assumptions and speculations will continue to be challenged, changed, and revised as new scientific information is discovered. The theory of evolution is based upon certain facts, many assumptions, speculations, and interpretations, and some fundamental non-evidence-based beliefs.
The closest living relatives of humans are their chimpanzee/bonobo (Pan) sister species, members of the same subfamily, Homininae. This classification is supported by over 50 years of research in the fields of chimpanzee cultural diversity, language competency, genomics, anatomy, high cognition, psychology, society, self-consciousness and relation to others, tool use/production, as well as Homo-level emotions, symbolic competency, memory recollection, complex multifaceted problem-solving capabilities, and interspecies communication. Language competence and symbolism can be continuously bridged from chimpanzee to man. Emotions, intercommunity aggression, body language, gestures, facial expressions, and vocalization of intonations seem to parallel each other between the sister taxa Homo and Pan. The shared suite of traits between the Pan and Homo genera demonstrated in this article integrates old and new information on human–chimpanzee evolution and on bilateral informational and cross-cultural exchange, promoting the urgent need for Pan cultures in the wild to be protected, as they are part of the cultural heritage of mankind. We also suggest that bonobos, Pan paniscus, based on shared traits with Australopithecus, should be included in the Australopithecine subgenus, and may even represent living-fossil Australopithecines. Unfolding bonobo and chimpanzee biology highlights our common genetic and cultural evolutionary origins.