This review examines the risk of developing celiac disease (CD) and other autoimmune diseases in individuals receiving the rotavirus (RV) vaccine compared to the general population. Celiac disease is a malabsorptive, chronic, immune-mediated enteropathy of the small intestine. The pathogenesis of CD is multifactorial, and mucosal immunity plays an important role in its development; low mucosal IgA levels significantly increase the risk of developing the disease. Rotavirus is an infectious agent that causes diarrhea, particularly in children aged 0–24 months, and is frequently implicated in diarrhea-related deaths in these children. An oral vaccine against RV has been developed; while it is effective against RV infection, it also enhances mucosal immunity. Studies have indicated that individuals immunized with the RV vaccine are at lower risk of developing CD than unvaccinated individuals. In addition, the mean age at development of CD autoimmunity may be higher in the vaccinated group than in controls receiving placebo. Additional studies that include children immunized with different RV vaccines and unvaccinated children would provide more meaningful results. Although current data suggest a possible association of RV vaccination with a reduced risk of developing CD and other autoimmune diseases, this remains an open question that merits greater international investigation.
The Jews of Western Europe during the Middle Ages were often perceived as distinct from other people not only in their religion, but also by virtue of peculiar physical characteristics. Male Jews were circumcised, which made them physically distinct in the sexual realm. They were also believed to have a flux of blood from hemorrhoids, thought to be more abundant in Jews because they consumed salty foods and gross, undigested blood, and were melancholic. By the late medieval and early modern periods, the male menstruation motif had become closely connected to the theory of the four humors and the balance between bodily fluids. Men in general were thought to emit extra heat, whereas women were considered physically cooler. While most men were generally able to reduce their heat naturally, there was a perception that womanish Jewish males were unable to do so and therefore required “menstruation” (i.e. a literal discharge of blood) in order to achieve bodily equilibrium. The image of the Jewish male as menstruating through bleeding hemorrhoids was an anti-Semitic claim with a religious explanation: Jews menstruated because they had been beaten on their hindquarters for having crucified Jesus Christ. This belief is one of the first biological-racial motifs used by Christians; before this, anti-Semitic rationalizations were mostly religious. However, once anti-Semitism was mixed with science, emphasizing the metaphorical moral impurity of Jews, the belief that Jewish men “menstruated” took hold, a belief that would have dire historical consequences for the Jewish communities of Europe even into the mid-twentieth century. This topic has direct applicability to current medical practice. The anti-Semitic notion of Jewish male menstruation would never have taken hold if the medical community had not ignored the facts, and if the general population had known them. In the same way, it is important for present-day scientists and healthcare professionals to understand a topic thoroughly and not to deliberately ignore the facts, which can affect professional and public thought, thereby leading to incorrect and at times immoral conclusions.
Background: Hyperinsulinemia and insulin resistance occur in patients with primary hypertension independently of diabetes and obesity. This study aimed to assess serum fasting insulin levels, the homeostatic model assessment for insulin resistance (HOMA-IR), and serum lipid levels in non-obese patients with primary hypertension compared to normotensive subjects.
Methods: This observational study comprised 100 patients over 18 years of age, divided into two groups. The hypertensive group comprised non-obese patients with primary hypertension (n=50); the normotensive group comprised normotensive age- and sex-matched individuals (n=50). Patients with diabetes, impaired fasting glucose, obesity, and other causative factors of insulin resistance were excluded from the study. Serum fasting insulin levels and fasting lipid profiles were measured, and insulin resistance was calculated using HOMA-IR. These data were compared between the two groups. Pearson’s correlation coefficient was used to assess the strength of the linear association between HOMA-IR and systolic and diastolic blood pressures.
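For reference, HOMA-IR is derived from fasting glucose and fasting insulin. The sketch below illustrates the calculation, assuming the standard formula (fasting insulin [µIU/mL] × fasting glucose [mg/dL] / 405, equivalent to insulin × glucose [mmol/L] / 22.5); the abstract does not state which variant the authors used, and the input values here are illustrative, not study data.

```python
def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uiu_ml: float) -> float:
    """Homeostatic model assessment for insulin resistance (HOMA-IR).

    Standard formula: (fasting glucose [mg/dL] * fasting insulin [uIU/mL]) / 405,
    equivalent to (glucose [mmol/L] * insulin [uIU/mL]) / 22.5.
    """
    return (fasting_glucose_mg_dl * fasting_insulin_uiu_ml) / 405.0


# Illustrative values only (not taken from the study): glucose 90 mg/dL, insulin 10 uIU/mL
print(round(homa_ir(90.0, 10.0), 2))  # 2.22
```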
Results: Mean serum fasting insulin levels (mIU/L), mean HOMA-IR values, and fasting triglyceride levels (mg/dL) were significantly higher in the hypertensive versus normotensive patients (10.32 versus 6.46, P<0.001; 1.35 versus 0.84, P<0.001; 113.70 versus 97.04, P=0.005, respectively). The HOMA-IR levels were associated with systolic blood pressure (r value 0.764, P=0.0005).
Conclusion: We observed significantly higher fasting insulin levels, serum triglyceride levels, and HOMA-IR values, reflecting hyperinsulinemia and possibly an insulin-resistant state, among patients with primary hypertension who had no other factors causally linked to insulin resistance. We also observed a significant correlation between systolic blood pressure and HOMA-IR.
Background: Early thyroid cancers have excellent long-term outcomes, yet the word “cancer” draws unnecessary apprehension. This study aimed to define when the recommendations for observation and surveillance may be extended to early thyroid cancers at the population level.
Methods: Non-metastasized thyroid cancers ≤40 mm in diameter were identified from the 1975–2016 Surveillance, Epidemiology, and End Results (SEER) database. Causes of death were compared across demographic groups. Disease-specific outcomes were compared to those of the age-adjusted healthy United States (US) population. Survival estimates were computed using the Kaplan–Meier method and compared using the Cox proportional hazards model. Dynamic benchmarks impacting disease-specific overall survival were determined by decision tree modeling and tested with the Cox model.
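To illustrate the type of survival analysis described above (this is not the authors' analysis code; the dataset, column names, and values are hypothetical), a minimal Kaplan–Meier estimate and Cox proportional hazards fit could be sketched with the Python lifelines package as follows:

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical SEER-style extract: follow-up time (years), death indicator,
# age at diagnosis (years), tumor size (mm), and sex (1 = male, 0 = female).
df = pd.DataFrame({
    "time_years": [2.1, 10.9, 5.4, 7.8, 0.9, 10.9, 6.2, 3.5],
    "died":       [1,   0,    1,   0,   1,   0,    0,   1],
    "age":        [78,  62,   80,  51,  45,  39,   70,  66],
    "size_mm":    [32,  15,   28,  11,  8,   6,    22,  18],
    "male":       [0,   1,    1,   0,   0,   0,    1,   0],
})

# Kaplan-Meier estimate of overall survival
km = KaplanMeierFitter()
km.fit(durations=df["time_years"], event_observed=df["died"])
print(km.survival_function_)

# Cox proportional hazards model relating age, tumor size, and sex to survival
cox = CoxPHFitter()
cox.fit(df, duration_col="time_years", event_col="died")
cox.print_summary()
```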
Results: Of the 28,728 thyroid cancers included in this study, 98.4% underwent some form of thyroid-specific treatment and were followed for a maximum of 10.9 years. This group had a 4.3% mortality rate at the end of follow-up, with 13 times more deaths attributed to competing risks than to thyroid cancer (stage T1a versus stage T1b, P=1.000; T1 versus T2, P<0.001). Among untreated T1a or T1b tumors, the risk of disease-specific death was 21 times lower than that of death due to other causes. There was no significant difference between T1a and T1b tumors or between sexes. The age-adjusted risk of death for the healthy US population was higher than that for the population with thyroid cancer. Dynamic categorization demonstrated worsening outcomes up to age 73 years, uninfluenced by sex or tumor size. For patients over 73 years of age, only tumors >26 mm impacted outcomes.
Conclusion: Based on the current data, T1a and T1b nodules have similar survival outcomes, which are not significantly impacted even when the nodules are left untreated. Multi-institutional prospective studies are needed to confirm these findings so that current observation and surveillance recommendations can be extended to certain T1 thyroid nodules.
Objectives: Anti-osteoporotic drugs (AOD) are essential for secondary prevention of osteoporotic fracture (OF) in patients with established osteoporosis. However, data about AOD utilization rates are scarce among patients with OF. This study was therefore aimed at determining the AOD utilization rates among those particularly vulnerable patients.
Materials and Methods: This cross-sectional study followed the medical records of patients with OF starting from their first OF diagnosis date. Each patient’s preventive osteoporosis treatments (vitamin D, calcium+vitamin D) and AOD utilization rate were recorded for a 12-month period following OF diagnosis.
Results: A total of 210 patients (168 females, mean age 67.8±11.9 years; 42 males, mean age 62.4±16.1 years) were enrolled in the study. Of these, 65.7% (n=138) had not used any medication for primary prevention of osteoporosis before OF diagnosis. The proportion of patients not using any type of medication for secondary prevention after OF increased from 26.5% to 51% over the 12-month period. In addition, by one year after diagnosis, the AOD usage rate had decreased from 62.3% to 41.3%.
Conclusion: The AOD usage rates for secondary prevention of OF were insufficient, and cessation rates were high. Identification of factors associated with decreased AOD utilization rates will provide important information for guiding patient follow-up in order to reduce the occurrence of OF.
External accreditation reviews of undergraduate medical curricula play an important role in their quality assurance. However, these reviews occur only at 4–10-year intervals and are not optimal for the immediate identification of problems related to teaching. Therefore, the Standards of Medical Education in Israel require medical schools to engage in continuous, ongoing monitoring of their teaching programs for
compliance with accreditation standards. In this paper, we propose the following: (1) this monitoring be assigned to independent medical education units (MEUs), rather than to an infrastructure of the dean’s office, with such MEUs forming part of the school’s governance and drawing their authority from university institutions; and (2) the differences in the importance of the accreditation standards be addressed by distinguishing between the “most important” standards, which have been shown to improve student well-being and/or patient health outcomes; “important” standards, which are associated with student learning and/or performance; “possibly important” standards, which have face validity or conflicting evidence for validity; and “least important” standards, which may lead to undesirable consequences. According to this proposal, MEUs will evolve into entities dedicated to ongoing monitoring of the education program for compliance with accreditation standards, with the authority to implement interventions. Hopefully, this will provide MEUs and faculty with the common purpose of meeting accreditation requirements, and an agreed-upon prioritization of accreditation standards will improve MEU communication and recommendations to faculty.
Background and Objective: Medical cannabis is becoming an acceptable treatment modality in medicine, especially for pain relief. Concurrently, cannabis use is becoming more prevalent worldwide, a trend driven by public demand despite the lack of an established scientific basis. This observational open-label study sought to investigate the effectiveness of cannabis therapy for alleviating low back pain symptoms.
Methods: Two types of cannabis treatment modalities were sequentially administered to chronic low back pain patients. After an initial 1-month washout period (WO1), the first modality was cannabidiol (CBD)-rich sublingual extract treatment administered for 10 months. Following another washout period, the second modality, Δ⁹-tetrahydrocannabinol (THC)-rich smoked inflorescence (whole dried cannabis flowers) was administered for 12 months.
Results: Enrolled in the study were 24 patients whose advanced imaging studies (i.e. computerized tomography or magnetic resonance imaging of the lumbar spine) revealed disc herniation or spinal stenosis. Three patients dropped out of the extract treatment but resumed study participation to receive the THC-rich smoking therapy. After a minimum of 2 years, cannabis therapy had reduced lower back pain symptoms, as assessed by the Oswestry Disability Index, the SF-12 patient-reported outcome questionnaire, and the visual analogue scale. Pain reduction was not significant during the extract treatment part of the study; however, it was significant during the inhaled therapy part of the study.
Conclusions: Our findings indicate that inhaled THC-rich therapy is more effective than CBD-rich sublingual extract therapy for treating low back pain and that cannabis therapy is safe and effective for chronic low back pain.
Clustered regularly interspaced short palindromic repeats (CRISPR) gene editing is an innovative and potentially game-changing biotechnology that can reverse DNA mutations in a tissue-specific manner. In addition, CRISPR is being explored for xenotransplantation, for increasing human longevity, in animal breeding, and in plant science. However, many ethical challenges emerge from CRISPR technology. This article discusses several positions on these ethical challenges from a Jewish legal perspective. In addition, we present several other applications of CRISPR technology that lack a defined Jewish legal precedent and that rabbinical scholars will need to address and resolve in the future.
Background: Blunt traumatic brain injury (bTBI) and uncontrolled hemorrhagic shock (UCHS) are common causes of mortality in polytrauma. We studied the influence of fresh frozen plasma (FFP) resuscitation, administered before hemorrhage control was achieved, in a rat model of combined bTBI and UCHS.
Methods: The bTBI was induced by an external weight drop (200 g) onto the bare skull of anesthetized male Lewis (Lew/SdNHsd) rats; UCHS was induced by resection of two-thirds of the rats’ tails. Fifteen minutes following trauma, bTBI+UCHS rats underwent resuscitation with FFP or lactated Ringer’s solution (LR). Eight groups were evaluated: (1) Sham; (2) bTBI; (3) UCHS; (4) UCHS+FFP; (5) UCHS+LR; (6) bTBI+UCHS; (7) bTBI+UCHS+FFP; and (8) bTBI+UCHS+LR. Bleeding volume, hematocrit, lactate, mean arterial pressure (MAP), heart rate, and mortality were measured.
Results: The study included 97 rats that survived the immediate trauma. Mean blood loss up to the start of resuscitation was similar between UCHS-only and bTBI+UCHS rats (P=0.361). Following resuscitation, bleeding was more extensive in bTBI+UCHS+FFP rats (5.2 mL, 95% confidence interval [CI] 3.7, 6.6) than in bTBI+UCHS+LR rats (2.5 mL, 95% CI 1.2, 3.8) and bTBI+UCHS rats (1.9 mL, 95% CI -0.2, 3.9) (P=0.005). A similar, though non-significant, increase in blood loss was observed in UCHS+FFP rats (P=0.254). Overall mortality increased if bleeding exceeded 4.5 mL (92.3% versus 8%; P<0.001). Mortality was 83.3% (10/12) in bTBI+UCHS+FFP rats, 41.7% (5/12) in bTBI+UCHS+LR rats, and 64.3% (9/14) in bTBI+UCHS rats.
Conclusion: The bTBI did not exacerbate bleeding in rats undergoing UCHS. Compared to LR, FFP resuscitation was associated with a significantly increased blood loss in bTBI+UCHS rats.
There is a rich history of surgery for cardiac arrhythmias, spanning from atrial fibrillation and Wolff–Parkinson–White syndrome to inappropriate sinus tachycardia and ventricular tachycardia. This review describes the history of these operations, their evolution over time, and the current state of practice. We devote considerable time to the discussion of atrial fibrillation, the most common cardiac arrhythmia addressed by surgeons. We discuss ablation of atrial fibrillation as a stand-alone operation and as a concomitant operation performed at the time of cardiac surgery. We also discuss the emergence of newer procedures to address atrial fibrillation in the past decade, such as the convergent procedure and totally thoracoscopic ablation, and their outcomes relative to historic approaches such as the Cox maze procedure.