Diarrhea, an illness of both the developed and developing world, is characterized by frequent bowel movements, loose stools, and abdominal discomfort. It is a long-standing challenge in palliative care and can have myriad causes, making symptomatic treatment pertinent when illness evaluation is ongoing, when there is no definitive treatment approach, or when effective treatment cannot be attained. Symptomatic therapy is a common approach in palliative care settings, and bismuth is a suitable agent that can be effectively employed for the management of chronic diarrhea. The objective of this narrative review is to examine the role of bismuth in the management of diarrheal symptoms. To explore this, PubMed (including Medline) and Embase were searched for literature on bismuth and diarrhea published from 1980 to 2019. Bismuth was found to have potential utility for diarrheal relief in multiple settings, including microscopic colitis, traveler’s diarrhea, gastrointestinal infection, cancer, and chemotherapy. Its minimal side effects also give it great potential for use in palliative care patients. Overall, the antisecretory, anti-inflammatory, and antibacterial properties of bismuth make it a suitable therapy for symptomatic treatment of diarrhea, and its limited range of adverse effects makes it an appealing option for patients with numerous comorbidities. Healthcare providers can explore bismuth as an adjunct therapy for diarrhea management in an array of conditions, especially in the palliative care setting.
The coronavirus disease 2019 (COVID-19) pandemic has remarkably challenged health care organizations and societies. A key strategy for confronting the disease’s implications for individuals and communities was harnessing multidisciplinary efforts to develop technologies that mitigate disease spread and its deleterious clinical consequences. One of the main challenges of COVID-19 is the provision of medical care to patients with a highly infective disease, mandating the use of isolation measures. Such care is complicated by the need for complex critical care, dynamic treatment guidelines, and incomplete knowledge of the disease’s pathophysiology. A second key component of this challenge was the overwhelming surge in patient burden and the relative lack of trained staff and medical equipment, which required rapid reorganization of large systems and raising health care efficiency to unprecedented levels. In contrast to the risk management strategies employed to mitigate other serious threats, and the billions of dollars invested annually by governments around the world in reducing those risks, no such preparation proved effective during the COVID-19 pandemic. Unmet needs were identified within the newly opened COVID-19 departments, together with the urgent need for reliable information to support effective decision-making at the state level.
This review article describes the early research and development response in Israel within the scope of in-hospital patient care, such as non-contact sensing of patients’ vital signs, and how it could potentially be woven into a practical big picture at the hospital or national level using a strategic management system. At this stage, some of the described technologies are still in developmental or clinical evidence-generation phases with respect to COVID-19 settings. While awaiting future publications describing the results of these ongoing efforts, one should be aware of this trend, as these emerging tools have the potential to benefit patients, caregivers, and health care systems beyond the scope of the current pandemic, as well as in confronting future surges in case numbers.
Objective: In patients with acute hepatic porphyria (AHP), prolonged fasting is a known trigger of AHP attacks. Despite this, some Jewish AHP patients—mainly hereditary coproporphyria (HCP) and variegate porphyria (VP) patients—fast for 25 consecutive hours during the traditional Jewish holy day known as Yom Kippur. In this study, we evaluated the effect of the fast on these patients.
Methods: A retrospective study and survey of AHP patients in Israel was carried out. Patients were asked whether they had fasted and whether any symptoms were induced by the fast. Patients’ medical records were reviewed for emergency department (ED) visits following Yom Kippur between 2007 and 2019. Only 3 acute intermittent porphyria (AIP) patients reported fasting; they were excluded from the analysis.
Results: A total of 21 HCP patients and 40 VP patients completed the survey; 30 quiescent patients reported that they fast, while 31 did not. The majority of fasting patients (96.67%) reported no symptoms following a fast. We found no statistically significant difference in ED visit rates between fasting and non-fasting patients at 1 week (0.26% in both groups) or 1 month (2.1% in non-fasting versus 0.78% in fasting patients) following Yom Kippur. Of the symptomatic ED visits following a fast, none were defined as severe attacks.
Conclusion: A 25-hour fast in stable HCP and VP patients did not increase the risk of an acute attack and can probably be regarded as safe.
Increasing evidence points towards mitochondria as crucial players in the initiation and progression of autoimmune and degenerative disorders, in which impaired cell metabolism is but one facet of the underlying etiopathogenesis. This review aims to introduce the reader to essential concepts of mitochondrial abnormalities in idiopathic inflammatory myopathy (IIM), with emphasis on inclusion-body myositis and dermatomyositis. Far surpassing the initial simplistic view of mitochondria as mere energy generators, they have gathered attention for their role in inflammatory processes and their ability to fuel autoimmunity, as shown by the presence of anti-mitochondrial antibodies (AMAs) in up to 10% of IIM patients. As cellular respiration takes place, mitochondrial metabolites may help shape the pro-inflammatory milieu in affected muscle, beyond generating reactive oxygen species, which are well-recognized inducers of damage-associated molecular patterns. A series of mitochondrial components may facilitate the sterile activation of pro-inflammatory cells and the production of several cytokines that enhance autoimmune responses. Marked variation in the mitochondrial genome has also been reported in IIM patients. As such, we summarize key historical and recent advances linking aberrations and instabilities of mitochondrial DNA to impaired muscle function. Besides discussing mitochondrial dysfunction as an essential part of IIM development, we also highlight possible associations between the presence of AMAs and a particular phenotype of IIM with its own characteristic clinical and radiological pattern. Finally, we present promising treatment approaches targeting mitochondria, while briefly discussing experimental models for gaining deeper insight into the disease process and ultimately leading to novel drug development.
Idiopathic inflammatory myopathies (IIMs) are a rare group of disorders featuring progressive immune-mediated skeletal muscle destruction along with skin, lung, and joint involvement. Management of IIMs necessitates glucocorticoid therapy followed by conventional steroid-sparing agents to control disease activity. In settings of refractory myositis or life-threatening manifestations, e.g. lung involvement or oropharyngeal dysphagia, second-line therapies are needed to minimize disease burden, avoid end-organ damage and steroid toxicity, and decrease mortality. These therapies may include biological disease-modifying antirheumatic drugs (bDMARDs) and, to a lesser extent, targeted synthetic disease-modifying antirheumatic drugs (tsDMARDs). This article reviews the current use of bDMARDs, e.g. intravenous immunoglobulin and rituximab, and of tsDMARDs, namely Janus kinase inhibitors (JAKi), along with their indications, efficacy, and safety in managing IIM.
Background: There is an increasing body of literature associating edentulism with cognitive impairment. The aim of this systematic review was to summarize the available data, emphasizing the role of removable dental prostheses in preventing cognitive deterioration and promoting brain health in elderly individuals.
Aim: This systematic review investigates the relationship between the use of removable dental prostheses and physiological or adaptive changes at the cerebral level in partially and completely edentulous patients.
Methods: A systematic review was conducted following PRISMA guidelines, with an initial search across PubMed, Scopus, and Web of Science databases. Studies published up to June 2023 in English were considered. A risk of bias assessment was performed for included studies.
Results: Of the 86 studies initially screened, 13 met the inclusion criteria. Findings indicate a positive association between the use of removable dental prostheses and improved cognitive function, with potential therapeutic implications for managing cognitive decline.
Conclusion: The available evidence suggests that removable dental prostheses play an important role in supporting neurological health and preventing cognitive decline, making them a relevant consideration in the management of neurodegenerative diseases.
Objectives: Our study aimed to determine the relationship of serum periostin levels and the neutrophil–lymphocyte ratio (NLR) with ischemic stroke subtypes, clinical stroke scales, and acute prognosis in patients with acute ischemic stroke.
Materials and Methods: Forty-two ischemic stroke patients and 39 age- and sex-matched healthy volunteers were included in our study. Demographic characteristics, including age and gender, were recorded. Serum periostin and NLR values were evaluated in the first 24 hours after admission, and serum periostin levels were compared with those of the healthy controls. Lesion localization was determined by cranial CT or diffusion-weighted MRI. Stroke scales were recorded on days 1 and 7 of hospitalization in the study group.
Results: Mean serum periostin levels in stroke patients were higher than in the control group, but the difference was not statistically significant. There was no correlation between serum periostin levels and stroke prognosis. First-admission NLRs were significantly higher than in the control group and were positively correlated with the first-admission National Institutes of Health Stroke Scale score and the day 7 modified Rankin score.
Conclusion: Ours is the first study to evaluate both NLR and serum periostin levels in all types of acute ischemic stroke. While it did not show that first-admission serum periostin levels can serve as a biomarker in ischemic stroke, it did indicate that the first-admission NLR can be used for acute prognosis of ischemic stroke.
Quantification of T cell receptor excision circles (TRECs) has recently emerged as a useful non-invasive clinical and research tool for investigating thymic activity, allowing identification of T cell production by the thymus. TREC quantification has recently been implemented as the preferred test to screen neonates for severe combined immunodeficiency (SCID) or significant lymphopenia. Neonatal genetic screening for SCID is highly important in countries with high rates of consanguineous marriages, such as Israel, and enables early diagnosis and prompt therapeutic intervention that can save lives and improve patient outcomes. TREC measurement is also applicable in clinical settings where T cell immunity is involved, including T cell immunodeficiencies, HIV infection, the aging process, autoimmune diseases, and immune reconstitution after bone marrow transplantation.
TAKE-HOME MESSAGES
• Severe combined immunodeficiency, a life-threatening condition, can be detected by neonatal screening.
• The earlier the detection and the quicker the implementation of appropriate treatment, the greater the likelihood for improved outcome, even cure, for the affected children.
• Quantification of TRECs and kappa-deleting recombination excision circles (KRECs) is a useful screening test for severe T and B cell immunodeficiency and can also be used to evaluate medical conditions involving T and B cell immunity.
Experimental pain stimuli can be used to simulate patients’ pain experience. We review recent developments in psychophysical pain testing, focusing on the application of two dynamic tests: conditioned pain modulation (CPM) and temporal summation (TS). Typically, patients with clinical pain of various types express less efficient CPM, enhanced TS, or both. These tests can be used to predict the incidence and intensity of pain, as well as to assist in choosing the correct analgesic agents for individual patients, helping to shorten the commonly long and frustrating process of tailoring analgesics to the individual patient. We propose that evaluating pain modulation can serve as a step forward in individualizing pain medicine.
Effects of early-life psychosocial adversity, such as maternal separation in experimental animal models and abuse/neglect in young humans, have received a great deal of attention. More recently, the long-term effects of the physical stress of repetitive procedural pain have begun to be addressed in infants hospitalized in neonatal intensive care. Preterm infants are more sensitive to pain and stress, and the two cannot be distinguished from one another in neonates. The focus of this review is clinical studies of the long-term effects of repeated procedural pain-related stress in the neonatal intensive care unit (NICU) on brain development, neurodevelopment, programming of stress systems, and later pain sensitivity in infants born very preterm (24–32 weeks’ gestational age). Neonatal pain exposure has been quantified as the number of invasive and/or skin-breaking procedures during hospitalization in the NICU. Emerging studies provide convincing clinical evidence for an adverse impact of neonatal pain/stress in infants at a time of physiological immaturity, rapidly developing brain microstructure and networks, and programming of the hypothalamic-pituitary-adrenal axis. Currently it appears that early pain/stress may influence the developing brain, and thereby neurodevelopment and stress-sensitive behaviors, particularly in the most immature neonates. However, there is no evidence for a greater prevalence of pain syndromes in this population compared to children and adults born healthy at full term. In addressing associations between pain/stress and outcomes, careful consideration of confounding clinical factors related to prematurity is essential. The need for pain management for humanitarian care is widely advocated, and non-pharmacological interventions to help parents reduce their infant’s stress may be brain-protective.