Objectives: Anti-osteoporotic drugs (AOD) are essential for secondary prevention of osteoporotic fracture (OF) in patients with established osteoporosis. However, data about AOD utilization rates are scarce among patients with OF. This study was therefore aimed at determining the AOD utilization rates among those particularly vulnerable patients.
Materials and Methods: This cross-sectional study followed the medical records of patients with OF starting from their first OF diagnosis date. Each patient’s preventive osteoporosis treatments (vitamin D, calcium+vitamin D) and AOD utilization rate were recorded for a 12-month period following OF diagnosis.
Results: A total of 210 patients (168 females, mean age 67.8±11.9 years; 42 males, mean age 62.4±16.1 years) were enrolled in the study. Of these, 65.7% (n=138) had not used any medication for primary prevention of osteoporosis before OF diagnosis. The proportion of patients not using any medication for secondary prevention after OF increased from 26.5% to 51% over the 12-month period. In addition, by one year after diagnosis, the AOD usage rate had decreased from 62.3% to 41.3%.
Conclusion: The AOD usage rates for secondary prevention of OF were insufficient, and cessation rates were high. Identification of factors associated with decreased AOD utilization rates will provide important information for guiding patient follow-up in order to reduce the occurrence of OF.
In the management of malignant thyroid disorders, the standard primary treatment is thyroidectomy, a surgical resection of the thyroid gland. This procedure has been performed for over a century. Hence, it comes as no surprise that it is exceedingly well described in the literature. This issue of Rambam Maimonides Medical Journal includes an article by Chaturvedi et al. that challenges the standard, widely practiced clinical inclination toward surgery as the first and best option for all patients with early thyroid cancer. This editorial discusses the issues raised by the authors and points out the importance of ongoing research to determine when standards of care should be modified in the light of low-risk disease.
Background: Transthyretin (TTR), also known as prealbumin, has been suggested as an indicator of protein and nutritional status.
Objective: The aim of this study was to examine the maternal and umbilical cord (UC) TTR in relation to intrauterine growth, and the serum TTR of preterm infants in relation to nutritional status and growth.
Methods: After application of exclusion criteria, 49 preterm infants (mean gestational age 32.9±2.9 weeks; mean birth weight 1822±556 g) were included in the study. Transthyretin was sampled at birth and on days 14 and 28 or at discharge, together with growth parameters and nutritional laboratory values.
Results: Mean UC and maternal TTR showed a positive but nonsignificant correlation (8.5±2.4 mg/dL and 20.4±7.0 mg/dL, respectively; r=0.31, P=0.07). Umbilical cord TTR was an index neither of maturity nor of intrauterine growth. Umbilical cord TTR was higher in females (9.4±2.6 versus 7.6±1.8 mg/dL, P=0.015). Maternal TTR was lower in twin pregnancies (16.8±4.9 versus 22.5±7.3 mg/dL, P=0.007). Although TTR levels gradually increased over time in correlation with post-menstrual and chronological ages (r=0.24, P=0.011 and r=0.40, P<0.001, respectively), there was no correlation with weight gain (r=0.10, P=0.41), nutritional status, protein intake, or laboratory values. The only significant correlations were between TTR and glucose and triglyceride levels (r=0.51, P<0.001 for both).
Conclusions: Although TTR levels increased over time, we could not demonstrate significant correlations between TTR and indices of the nutritional status in preterm infants at birth or during the neonatal course.
Background: The increasing resistance of many bacterial pathogens to antibiotics urgently requires new or repurposed therapeutic strategies. Gentian violet is a triarylmethane dye used as a histological stain and in Gram's method of classifying bacteria. It also exerts an antimicrobial effect against certain pathogens, especially in dermatological infections. Safranin is the most popular counterstain used in medical laboratories due to its low cost and safe laboratory usage. However, few studies have been conducted on the antimicrobial activity of safranin.
Objective: With the growing prevalence of multidrug-resistant bacteria, this study aimed to evaluate the antibacterial efficacy of gentian violet and safranin against multidrug-resistant Staphylococcus aureus (S. aureus) and Pseudomonas aeruginosa (P. aeruginosa).
Methods: All tested bacteria were multidrug-resistant (MDR) bacteria isolated from skin infections (abscesses and wounds). Using gentian violet and safranin, antibacterial effects were studied using the well-diffusion method against 20 samples of clinically isolated bacteria, 10 diagnosed as S. aureus, and 10 as P. aeruginosa. Bacteria were diagnosed using the VITEK 2 automated system (bioMérieux, Marcy-l’Étoile, France). Iodine served as the control agent, since both Gram-positive and Gram-negative bacteria are sensitive to it.
Results: All Gram-positive and Gram-negative bacterial isolates (100%) were sensitive to gentian violet dye. Although all S. aureus isolates (100%) were also sensitive to safranin, only 20% of P. aeruginosa isolates were. Staphylococcus aureus was more resistant to iodine (40% sensitivity) than P. aeruginosa, which was 100% sensitive to iodine.
Conclusions: Gentian violet and safranin are low-cost, well-tolerated topical agents with potential for use in dermatological applications. Gentian violet had good antibacterial activity against both Gram-positive and Gram-negative bacteria, making it useful for treating bacterial skin pathogens such as S. aureus and P. aeruginosa, especially MDR strains. While safranin has good efficacy against Gram-positive bacteria (S. aureus), its effect against Gram-negative bacteria (e.g. P. aeruginosa) is poor.
External accreditation reviews of undergraduate medical curricula play an important role in their quality assurance. However, these reviews occur only at 4–10-year intervals and are not optimal for the immediate identification of problems related to teaching. Therefore, the Standards of Medical Education in Israel require medical schools to engage in continuous, ongoing monitoring of their teaching programs for
compliance with accreditation standards. In this paper, we propose the following: (1) this monitoring be assigned to independent medical education units (MEUs), rather than to an infrastructure of the dean’s office, and such MEUs to be part of the school governance and draw their authority from university institutions; and (2) the differences in the importance of the accreditation standards be addressed by discerning between the “most important” standards that have been shown to improve student well-being and/or patient health outcomes; “important” standards associated with student learning and/or performance; “possibly important” standards with face validity or conflicting evidence for validity; and “least important” standards that may lead to undesirable consequences. According to this proposal, MEUs will evolve into entities dedicated to ongoing monitoring of the education program for compliance with accreditation standards, with an authority to implement interventions. Hopefully, this will provide MEUs and faculty with the common purpose of meeting accreditation requirements, and an agreed-upon prioritization of accreditation standards will improve their communication and recommendations to faculty.
Food security and nutrition were major drivers of cultural evolution by enabling sociotypic development and communal living after the Neolithic agricultural revolution some 12,000 years ago. The sociotype unites concepts from the sciences and the humanities; in concert with the genotype it determines an individual’s phenotype (observable traits and behavior), and together they advance societal culture. As such, the sociotype relates to an individual’s dynamic interactions with the surrounding social environment throughout life and comprises three domains: the Individual, Relationships, and Context. Nutrition affects each domain, respectively, by ensuring the following dimensions of food security: utilization (metabolic fuel and health); accessibility (physical and economic); and availability (the right to nutritious food for all citizens). The sociotype is influenced by multiple factors, including diet–gene interactions, allostasis, microbiota, oxytocin, and culturally through mate selection, family bonds, social communication, political ideologies, and values. Food security, sociotypes, and culture form a complex adaptive system to enable coping with the circumstances of life in health and disease, to achieve sustainable development, and to eradicate hunger. The current geopolitical unrest highlights the absolutely critical role of this system for global security, yet many challenges remain in implementing this paradigm for society. Therefore, sustainable food security must be considered a fundamental human right and responsibility for safeguarding the survival and progress of the sociotypes of humankind (Homo cultures) worldwide.
Background and Objective: Postoperative (post-op) pain control has an important impact on post-op rehabilitation. The benefit of peripheral nerve block for post-op pain control is challenged by the logistics of its maintenance and the risk of post-op complications. We hypothesized that perioperative use of local infiltration analgesia (LIA) is comparable to post-op pain control by peripheral nerve block.
Materials and Methods: We evaluated three groups of patients treated with primary total knee arthroplasty (TKA) due to symptomatic end-stage osteoarthritis with post-op pain control by LIA (LIA group, n=52), femoral plus sciatic nerve block (FSNB) (FSNB group, n=54), and without local or regional analgesia as controls (Control group, n=53). The primary outcome variable was the post-op pain level intensity as measured by the visual analog scale (VAS). Secondary outcome variables were knee function measured by the Knee Society Score (KSS) and the quadriceps muscle strength recovery profile.
Results: Up to 4 hours post-op, pain intensity was significantly lower in FSNB patients (P<0.05). This effect of the peripheral nerve block on the pain level disappeared 6 hours post-op. The LIA and FSNB patients showed a significant decrease in pain intensity on days 2 and 3 post-op (P<0.05) with no mutual differences (P>0.05). This effect disappeared on day 4 post-op (P>0.05). The KSS score showed similar significant improvement of functional abilities (P<0.001) in all three groups. There was no difference in KSS scores among the groups 6 months after surgery (P>0.05). Quadriceps muscle recovery profile was similar in the LIA and Control groups, but significantly poorer in the FSNB group (P<0.001).
Conclusion: The short-lived pain-relief advantage of post-op FSNB over LIA of the surgical wound should be carefully weighed against its cost, logistics, and potential for complications.
The time has come for us to work together in a concerted effort to decrease the suffering and consequences related to osteoporotic fractures. And if not now, when?
Introduction: The second wave of coronavirus disease 2019 (COVID-19) led to the resurgence of opportunistic infections due to the injudicious use of steroids. Sinonasal mucormycosis was declared an epidemic in India during the pandemic. Mucormycosis was managed effectively by surgical debridement along with systemic amphotericin B. Currently, a resurgence of mucormycosis following initial treatment, in the form of fungal osteomyelitis of the frontal bone, is being seen in India.
Methods: This prospective study included 10 patients with fungal osteomyelitis of the frontal bone due to mucormycosis. All patients underwent surgical debridement of the sequestrum and involucrum, with systemic antifungal pharmacotherapy.
Results: The mean time to mucormycosis recurrence was 22 days after initial treatment (range 10–33 days). Patients presented with extracranial bossing following outer frontal cortex erosion (n=3), bicortical erosion (n=3), bifrontal involvement (n=2), dural involvement (n=3), and involvement of the brain parenchyma and prefrontal cortex (n=2). All cases underwent debridement of the entire sequestrum and involucrum until normal bone was identified. The mean length of hospital stay was 4 weeks (range 3–6 weeks). All treated patients are currently alive and without disease, as confirmed by contrast-enhanced computed tomography.
Conclusion: Based on our experience, the successful treatment of fungal osteomyelitis due to mucormycosis requires a four-pronged approach: early detection, multidisciplinary management of comorbidities, surgical debridement of necrotic bone, and adequate systemic antifungal therapy.
Clustered regularly interspaced short palindromic repeats (CRISPR) gene editing is an innovative and potentially game-changing biotechnology that can reverse DNA mutations in a tissue-specific manner. In addition, CRISPR is being explored for xenotransplantation, for increasing human longevity, in animal breeding, and in plant science. However, many ethical challenges emerge from CRISPR technology. This article discusses several positions that relate to these ethical challenges from a Jewish legal perspective. In addition, we present several other applications of CRISPR technology that lack a defined Jewish legal precedent and require rabbinical scholars to address and resolve them in the future.