In the early seventeenth century, the Jews formally established two separate communities in Amsterdam, the Portuguese Sephardi and the High German Ashkenazi congregations. Until the end of the eighteenth century, medical care for Amsterdam's indigent Jews had been controlled and regulated by the powerful Parnasim, the de facto rulers of each community. The primary communal organizations exclusively responsible for medical care for the poor were the Bikur Holim societies. This approach to the care of the indigent Jewish sick became ineffective in the nineteenth century and was replaced by a hospital-based system. This essay describes how seriously ill indigent Jews in nineteenth-century Amsterdam received hospital care, tracing the establishment and development of the first Ashkenazi and Sephardi hospitals in the city. Although each community established its own hospital, the two took different approaches to accomplishing this goal.
Background: The importance of emotional intelligence (EI) to the success of health professionals has been increasingly acknowledged. Concurrently, medical schools have begun integrating non-cognitive measures in candidate selection processes. The question remains whether these newly added processes correctly assess EI skills.
Objectives: Measuring EI levels among medical students; examining the correlations between participants’ EI levels and their scores on the non-cognitive MOR test; and exploring students’ attitudes regarding the importance of EI in medical practice.
Methods: The study included 111 first-year and sixth-year students at the Faculty of Medicine at the Technion, Haifa, Israel. Emotional intelligence was assessed by the Bar-On EQ-i 2.0, and MOR evaluation scores were provided by the faculty. An additional questionnaire was designed to rate students’ attitudes toward the importance of EI to the success of medical doctors (MDs).
Results: No significant correlations were found between MOR test scores and EI evaluation scores. Of the 15 EI competencies evaluated, mean scores for flexibility, problem-solving, and independence were lowest for both the first-year and the sixth-year study groups. No differences in EI levels between first-year and sixth-year students were found. Both groups of students considered EI to be highly important to their success as MDs.
Conclusions: While further studies of the links between MOR tests and EI are required, the current findings indicate that MOR test scores may not be predictive of medical students’ EI levels and vice versa. As previous evidence suggests that EI contributes to professional success and to better outcomes in the field of medicine, integrating it into selection processes for medical students and into the curricula in medical schools is recommended.
Background: The use of electric bicycles (E-bikes) has dramatically increased over the last decade. E-bikes offer an inexpensive, alternative form of transport, but also pose a new public health challenge in terms of safety and injury prevention.
Objective: The aim of this study was to describe the epidemiology and severity of E-bike related injuries among children treated in the emergency department (ED) and to compare these to manual bicycle related injuries.
Methods: A retrospective observational study of all pediatric patients presenting to the ED between December 2014 and November 2015 with an injury related to E-bike or manual bicycle use. Data including demographics, diagnosis, injury severity score (ISS), and outcome were compared.
Results: A total of 196 cyclist injuries presented to the ED; 85 related to E-bike use and 111 to manual bicycle use. The mean age of E-bikers was 13.7 years (range 7.5–16 years) and of manual bicycle riders 9.9 years (range 3–16 years). Injuries to the head and the extremities were common in both groups. E-bikers had significantly more intra-abdominal organ injuries (P=0.047). Injury severity scores were low overall, but injuries of higher severity (ISS>9) occurred only among the E-bikers.
Conclusions: Pediatric E-bike injuries tend to be more severe than those sustained while riding manual bicycles. Further research into E-bikes and other road and pavement users could inform enhanced regulation of E-bike usage.
United States (US) and European Union (EU) laws attempt to counterbalance the presumed discrimination against children in drug treatment and drug development. In 2004, US Food and Drug Administration (FDA)-rewarded pediatric studies with antidepressants triggered an FDA black-box warning of suicidality in young patients. Fewer antidepressants were prescribed, and the number of completed suicides among young persons increased. The dilemma between this warning and the need to adequately treat young depressed patients remains unsolved. We analyzed the history of drug development, the evolving view of diseases in young patients, US/EU pediatric laws, and pediatric studies triggered by the FDA/European Medicines Agency (EMA) in depression and other diseases against the background of developmental pharmacology; financial, institutional, and other interests; and the literature. The FDA/EMA define children administratively, not physiologically, as <17 (FDA)/<18 years old (EMA). But young persons mature physiologically well before their 17th/18th birthday. Depression occurs in young persons and has special characteristics, but it is not fundamentally different from adult depression. Young persons are not another species. Regulatory requirements for “pediatric” studies focus on “pediatric” labels. Many “pediatric” studies, including those in depression, lacked and lack medical sense and harm patients through placebo treatment although effective drugs exist. The FDA has partially abandoned separate “pediatric” efficacy studies, but not in psychiatry. Clinicians, parents, institutional review boards, and ethics committees should become aware of questionable “pediatric” studies, re-evaluate ongoing ones, consider suspending them, and reject new ones. The concept of separate “pediatric” drug approval needs to be abandoned.
Objective. To compare the reported accuracy and sensitivity of the various modalities used to diagnose autism spectrum disorders (ASD), in an effort to help focus further biomarker research on the most promising methods for early diagnosis.
Methods. The Medline scientific literature database was searched to identify publications assessing potential clinical ASD biomarkers. Reports were categorized by the modality used to assess the putative markers, including protein, genetic, metabolic, or objective imaging methods. The reported sensitivity, specificity, area under the curve, and overall agreement were summarized and analyzed to determine weighted averages for each diagnostic modality. Heterogeneity was measured using the I2 test.
Results. Of the 71 papers included in this analysis, each assessing markers from one of five modalities, protein-based markers, followed by metabolite-based markers, provided the highest diagnostic accuracy, with a pooled overall agreement of 83.3% each and respective weighted areas under the curve (AUC) of 89.5% and 88.3%. Sensitivity provided by protein markers was highest (85.5%), while metabolic (85.9%) and protein markers (84.7%) had the highest specificity. Other modalities showed sensitivities, specificities, and overall agreements in the range of 73%–80%.
Conclusions. Each modality provided for diagnostic accuracy and specificity similar or slightly higher than those reported for the gold-standard Autism Diagnostic Observation Schedule (ADOS) instrument. Further studies are required to identify the most predictive markers within each modality and to evaluate biological pathways or clustering with possible etiological relevance. Analyses will also be necessary to determine the potential of these novel biomarkers in diagnosing pediatric patients, thereby enabling early intervention.
Objectives: We hypothesized that preoperative (pre-op) ultrasound (US)-guided posterior transversus abdominis plane block (TAP) and US-guided ilioinguinal and iliohypogastric nerve block (ILI+IHG) would produce comparable analgesia after the Lichtenstein patch tension-free method of open inguinal hernia repair in adult men. The genital branch of the genitofemoral nerve was blocked separately.
Methods: This is a prospective, randomized, controlled, and observer-blinded clinical study. A total of 166 adult men were randomly assigned to one of three groups: a pre-op TAP group, a pre-op ILI+IHG group, and a control group. An intraoperative block of the genital branch of the genitofemoral nerve was performed in all patients in all three groups, followed by postoperative patient-controlled intravenous analgesia with morphine. The pain intensity and morphine consumption immediately after surgery and during the 24 hours after surgery were compared between the groups.
Results: A total of 149 patients completed the study protocol. Pain intensity immediately after surgery and morphine consumption were similar in the two “block” groups; however, both were significantly lower than in the control group. During the 24 hours after surgery, morphine consumption was lower in the ILI+IHG group than in the TAP group, and lower in each “block” group than in the control group. Twenty-four hours after surgery, all evaluated parameters were similar.
Conclusion: Ultrasound-guided ILI+IHG provided better pain control than US-guided posterior TAP following the Lichtenstein patch tension-free method of open inguinal hernia repair in men during 24 hours after surgery. (ClinicalTrials.gov number: NCT01429480.)
Introduction: Completion thyroidectomy is defined as the surgical removal of the remnant thyroid tissue following procedures of less than total or near-total thyroidectomy. Whether thyroid reoperations are associated with an increased complication risk is controversial.
Objective: To retrospectively analyze the incidence of surgical complications after reoperation in patients undergoing completion thyroidectomy for thyroid cancer who had previously undergone surgery elsewhere for a solitary thyroid nodule.
Material and Methods: The study included a total of 53 patients who had undergone thyroid lobectomy for a solitary nodule as initial surgery elsewhere and were referred to our institute for completion thyroidectomy when the histopathology revealed malignancy.
Results: There were 53 patients, 43 females and 10 males. Their mean age was 34.7±12.12 years (range 19–65 years). After initial surgery, the histopathology revealed papillary carcinoma in 46 patients (86.8%) and follicular carcinoma in 7 (13.2%). Fourteen of the 53 patients (26.4%) had recurrent laryngeal nerve palsy after initial surgery. None of the patients had clinical hypocalcemia after the first surgery. One or more parathyroid glands were identified and preserved in 52 patients (98.1%) during completion thyroidectomy. No patient sustained additional recurrent nerve injury at the second surgery. The mean serum calcium value was 8.96±0.39 mg/dL preoperatively and 8.74±0.56 mg/dL six months after surgery. Mean follow-up was 18 months. Transient hypoparathyroidism occurred in 24.5% of patients. Five patients were lost to follow-up. Permanent and symptomatic hypoparathyroidism occurred in eight patients (16.67%).
Conclusions: Completion thyroidectomy is a safe and appropriate option in the management of well-differentiated thyroid cancer. It removes disease on the ipsilateral and contralateral side of the thyroid and carries a low risk of recurrent laryngeal nerve damage, but a higher risk of permanent hypoparathyroidism.
Anti-citrullinated protein antibodies (ACPAs) are highly specific serologic markers for rheumatoid arthritis (RA) and can pre-date clinical disease onset by up to 10 years, also predicting erosive disease. The process of citrullination, the post-translational conversion of arginine to citrulline residues, is mediated by peptidylarginine deiminase (PAD) enzymes present in polymorphonuclear cells (PMNs). Calcium ions (Ca2+) are required for PAD activation, but the intracellular Ca2+ concentration in normal cells is much lower than the optimal Ca2+ concentration needed for PAD activation. For this reason, it has been proposed that PAD activation, and thus citrullination, occurs only during PMN cell death, when PAD enzymes leak out of the cells into the extracellular matrix, or extracellular Ca2+ enters the cells, with the high Ca2+ concentration activating PAD. Recently, using artificial in vitro systems to corroborate their hypothesis, Romero et al. demonstrated that “hypercitrullination,” the citrullination of multiple intracellular proteins, occurs within synovial fluid (SF) cells of RA patients, and that only modes of death leading to membranolysis, such as the perforin–granzyme pathway or complement membrane attack complex activation, cause hypercitrullination. For Romero's hypothesis to hold, it is reasonable to surmise that PMN-directed lysis should occur in the rheumatoid joint or the circulation of RA patients. Research conducted thus far has shown that immunoglobulin G (IgG) antibodies targeting PMNs are present in RA SF and mediate PMN activation. However, the role of anti-PMN IgG in mediating complement activation and subsequent PMN lysis and hypercitrullination has not been fully evaluated.
Anti-citrullinated protein autoantibodies (ACPAs) are the major autoantibodies in rheumatoid arthritis (RA). They are directed against different citrullinated antigens, including filaggrin, fibrinogen, vimentin, and collagen. The presence of ACPAs is associated with joint damage and extra-articular manifestations, suggesting that ACPAs are most likely pathogenic autoantibodies in RA. In vitro, ACPAs induce macrophage tumor necrosis factor alpha (TNF-α) production, osteoclastogenesis, and complement activation. These autoantibodies also induce the formation of neutrophil extracellular traps (NETs). Additionally, ACPAs induce pathogenic cytokine expression and oxidative stress in immune cells derived from RA patients. The aim of this review is to describe the pathogenic roles of these autoantibodies in RA.
Background: The PROSPECT (Procedure-Specific Postoperative Pain Management) Group recommended a single injection femoral nerve block in 2008 as a guideline for analgesia after total knee arthroplasty. Other authors have recommended the addition of sciatic and obturator nerve blocks. The lateral femoral cutaneous nerve is also involved in pain syndrome following total knee arthroplasty. We hypothesized that preoperative blocking of all four nerves would offer superior analgesia to femoral nerve block alone.
Methods: This is a prospective, randomized, controlled, and observer-blinded clinical study. A total of 107 patients were randomly assigned to one of three groups: a femoral nerve block group, a multiple nerve block group, and a control group. All patients were treated postoperatively using patient-controlled intravenous analgesia with morphine. Pain intensity at rest, during flexion and extension, and morphine consumption were compared between groups over three days.
Results: A total of 90 patients completed the study protocol. Patients who received multiple nerve blocks experienced superior analgesia and had reduced morphine consumption during the postoperative period compared to the other two groups. Pain intensity during flexion was significantly lower in the “blocks” groups versus the control group. Morphine consumption was significantly higher in the control group.
Conclusions: Pain relief after total knee arthroplasty immediately after surgery and on the first postoperative day was significantly superior in patients who received multiple blocks preoperatively, with morphine consumption significantly lower during this period. A preoperative femoral nerve block alone produced partial and insufficient analgesia immediately after surgery and on the first postoperative day. (ClinicalTrials.gov number: NCT01303120.)