The coagulation system constitutes an important facet of the unique vascular microenvironment in which primary and metastatic brain tumors evolve and progress. While brain tumor cells express tissue factor (TF) and other effectors of the coagulation system (the coagulome), their propensity to induce local and peripheral thrombosis is highly diverse: most dramatic in the case of glioblastoma multiforme (GBM) and less obvious in pediatric tumors. While immediate medical needs often frame the discussion of current clinical challenges, the coagulation pathway may also contribute to brain tumor progression through subtle, context-dependent, non-coagulant effects, such as the induction of inflammation or angiogenesis, or through responses to iatrogenic insults (e.g. surgery). In this regard, the emerging molecular diversity of brain tumor subtypes (e.g. in glioma and medulloblastoma) highlights the link between oncogenic pathways and the tumor coagulome. This relationship may influence the mechanisms of spontaneous and therapeutically provoked tumor cell interactions with the coagulation system as a whole. Indeed, oncogenes (EGFR, MET) and tumor suppressors (PTEN, TP53) may alter the expression, activity, and vesicular release of TF, among other changes. Conversely, the coagulant microenvironment may also influence the molecular evolution of brain tumor cells through selective and instructive cues. We suggest that effective targeting of the coagulation system in brain tumors should be explored through molecular stratification, stage-specific analysis, and more personalized approaches, including thromboprophylaxis and adjuvant treatment aimed at improving patient survival.
Achievement of complete response (CR) to therapy in chronic lymphocytic leukemia (CLL) has become a feasible goal, directly correlating with prolonged survival. It has been established that the classic definition of CR actually encompasses a variety of disease loads, and more sensitive multiparameter flow cytometry and polymerase chain reaction methods can detect the disease burden with a much higher sensitivity. Detection of malignant cells with a sensitivity of 1 tumor cell in 10,000 cells (10⁻⁴), using the above-mentioned sophisticated techniques, is the current cutoff for minimal residual disease (MRD). Tumor burdens lower than 10⁻⁴ are defined as MRD-negative. Several studies in CLL have determined the achievement of MRD negativity as an independent favorable prognostic factor, leading to prolonged disease-free and overall survival, regardless of the treatment protocol or the presence of other pre-existing prognostic indicators. Minimal residual disease evaluation using flow cytometry is a sensitive and applicable approach which is expected to become an integral part of future prospective trials in CLL designed to assess the role of MRD surveillance in treatment tailoring.
During the past 50 years, a dramatic reduction in the mortality rate associated with cardiovascular disease has occurred in the US and other countries. Statistical modeling has revealed that approximately half of this reduction is the result of risk factor mitigation. The successful identification of such risk factors was pioneered and has continued with the Framingham Heart Study, which began in 1948 as a project of the US National Heart Institute (now part of the National Heart, Lung, and Blood Institute). Decreases in total cholesterol, blood pressure, smoking, and physical inactivity account for 24%, 20%, 12%, and 5% of the reduction in the mortality rate, respectively. Nephrology, by comparison, was designated a recognized medical specialty a few years later. Hemodialysis was first performed in 1943. The US Medicare End-Stage Renal Disease (ESRD) Program was established in 1972. The number of patients in the program has increased from 5,000 in its first year to more than 500,000 in recent years. Only recently have efforts for risk factor identification, early diagnosis, and prevention of chronic kidney disease (CKD) been undertaken. By applying the approach of the Framingham Heart Study to CKD risk factors, we hope to mirror the success of cardiology: to prevent progression to ESRD and to avoid the cardiovascular complications associated with CKD. In this paper, we present conceptual examples of risk factor modification for CKD within this historical framework.
Objective: The World Health Organization’s (WHO) guidelines for cancer pain management were intentionally made simple so that they could be widely implemented by all physicians treating cancer patients. Referral to a pain specialist is advised if pain does not improve within a short time. The present study examined whether non-pain specialists made reasonable use of the WHO guidelines before referring patients with cancer-related pain to a pain clinic.
Methods: Cancer patients referred to a pain specialist completed several questionnaires covering demographics, medical history, and cancer-related pain; the short-form McGill Pain Questionnaire (SF-MPQ); and the 12-item Short Form Health Survey (SF-12). Data from referral letters and medical records were obtained. Treatments recommended by the pain specialists were recorded, and referrals were categorized as “unjustified” if the recommended treatments remained within the WHO ladder framework, or “justified” if they included additional treatments.
Results: Seventy-three patients (44 women, 29 men) with a mean age of 55 years (range, 25–85) participated in the study. Their pain had lasted for a mean of 6 months (range, 1–192). Mean pain intensity scores on a 0–10 numerical rating scale were 7 (range, 2–10) at rest and 8 (range, 3–10) on movement. Most patients complied with their referring physician’s recommendations and consumed opioids. Adverse events were frequent. No significant correlation was found between the WHO analgesic medication step used and mean pain levels reported. There were 63 patient referrals (85%) categorized as “unjustified,” whereas only 11 patients (15%) required “justified” interventions.
Conclusions: These findings imply that analgesic treatment within the WHO framework was not reasonably utilized by non-pain specialists before referring patients to pain clinics.
Anti-citrullinated protein antibodies (ACPAs) are highly specific serologic markers for rheumatoid arthritis (RA) that can pre-date clinical disease onset by up to 10 years and also predict erosive disease. The process of citrullination, the post-translational conversion of arginine to citrulline residues, is mediated by peptidylarginine deiminase (PAD) enzymes present in polymorphonuclear cells (PMNs). Calcium ions (Ca2+) are required for PAD activation, but the intracellular Ca2+ concentration in normal cells is much lower than the optimal Ca2+ concentration needed for PAD activation. For this reason, it has been proposed that PAD activation, and thus citrullination, occurs only during PMN cell death, when PAD enzymes leak out of the cells into the extracellular matrix or extracellular Ca2+ enters the cells, with the high Ca2+ concentration activating PAD. Recently, using artificial in vitro systems to corroborate their hypothesis, Romero et al. demonstrated that “hypercitrullination,” the citrullination of multiple intracellular proteins, occurs within synovial fluid (SF) cells of RA patients, and that only modes of death leading to membranolysis, such as the perforin–granzyme pathway or activation of the complement membrane attack complex, cause hypercitrullination. For this hypothesis to hold, it is reasonable to surmise that PMN-directed lysis should occur in the rheumatoid joint or the circulation of RA patients. Research conducted thus far has shown that immunoglobulin G (IgG) antibodies targeting PMNs are present in RA SF and mediate PMN activation. However, the role of anti-PMN IgG in mediating complement activation and subsequent PMN lysis and hypercitrullination has not been fully evaluated.
Objective: To examine the relationship between duration of fetal hypoxia, nucleated red blood cell (NRBC) count, and fetal growth.
Methods: Pregnant rats were exposed to severe hypoxia (9.5%–10% O2) for varying time intervals (2, 6, 12, 24, 48, and 120 hours; n=4 for each time interval) immediately prior to delivery at term. Normoxic controls were exposed to room air (21% O2) and matched for all other study variables (n=4 rats for each time interval). Pups were delivered via hysterotomy while exposure gas concentrations were maintained. Blood gas analysis and NRBC counts were performed, and fetal body and liver weights were recorded. Student’s t test and simple regression were used for statistical analysis.
Results: As the duration of hypoxia increased, fetal weight, liver weight, blood bicarbonate, and base excess levels decreased significantly; concomitantly, NRBC counts increased. This increase in NRBCs became statistically significant after 24 hours of exposure. After 48 hours of hypoxia there was a 2.5-fold rise in NRBC count, and after 120 hours a 4.5-fold rise over control levels. After 12 or more hours of hypoxia, fetal body weights were significantly reduced; 120 hours of hypoxia resulted in a 35% reduction in fetal body weight, a 34% reduction in fetal liver weight, and a 356% increase in NRBC count.
Conclusion: In a pregnant rat model, chronic maternal hypoxia (≥24 hours) results in a significant increase in fetal NRBC counts as well as reduced fetal body weight and organ growth.
Objectives: Research and theory suggest that socioeconomic status may affect diabetes control. We investigated the effect of socioeconomic status and ethnicity on glycated hemoglobin (HbA1c) in Arab and Jewish children with type 1 diabetes mellitus in northern Israel.
Methods: Data were collected from medical records of 80 Arab and 119 Jewish children attending a pediatric diabetes clinic in a tertiary health care center. Multivariate regression analysis was used to assess factors independently affecting HbA1c level.
Results: Mean age was 12.9±4.7 years. Arab families had more children than Jewish families (3.7±1.5 versus 2.9±1.2, respectively, P=0.0007). Academic education was significantly less common in Arab families (25% versus 66.2%, respectively, P=0.0001). Income of Jewish parents was significantly higher than that of Arab parents (7,868±2,018 versus 5,129±906 NIS/month, respectively, P=0.0001). Mean age at diagnosis of diabetes was 9.6±4.6 years and disease duration was 3.4±2.3 years in both groups. Half of the Arab and Jewish children were treated with multiple insulin injections and half with insulin pumps. The mean number of self-glucose tests/day was higher in Jewish children than in Arab children (4.7±2.5 versus 4.0±1.5, respectively, P=0.033). Mean HbA1c was above recommendations: 9.5% (estimated average glucose 12.6 mmol/L) in Arab children and 8.7% (11.3 mmol/L) in Jewish children (P=0.004). In multivariate analysis, disease duration (P=0.010) and ethnicity (P=0.034 for Arabs versus Jews) were independently associated with HbA1c.
Conclusions: Both Arab and Jewish children failed to meet HbA1c goals, but the shortfall was significantly greater among Arab children. Ethnicity remained a predictor of failure to meet goals even after adjustment for potential confounders.
Background: Early thyroid cancers have excellent long-term outcomes, yet the word “cancer” draws unnecessary apprehension. This study aimed to define when the recommendations for observation and surveillance may be extended to early thyroid cancers at the population level.
Methods: Non-metastasized thyroid cancers ≤40 mm diameter were identified from the 1975–2016 Surveillance, Epidemiology and End Results (SEER) database. Causes of death were compared across demographic data. Disease-specific outcomes were compared to the age-adjusted healthy United States (US) population. Survival estimates were computed using Kaplan–Meier and compared using the Cox proportional hazard model. Dynamic benchmarks impacting disease-specific overall survival were determined by decision tree modeling and tested by the Cox model.
Results: Of the 28,728 thyroid cancers included in this study, 98.4% underwent some form of thyroid-specific treatment and were followed for a maximum of 10.9 years. This group had a 4.3% mortality rate at the end of follow-up, with 13 times more deaths attributed to competing risks than to thyroid cancer (stage T1a versus stage T1b, P=1.000; T1 versus T2, P<0.001). Among the untreated T1a or T1b tumors, the risk of disease-specific death was 21 times lower than the risk of death due to other causes. There was no significant difference between T1a and T1b tumors, nor across sex. The age-adjusted risk of death for the healthy US population was higher than that for the population with thyroid cancer. Dynamic categorization demonstrated worsening outcomes up to 73 years of age, uninfluenced by sex or tumor size. For patients over 73 years of age, only tumors >26 mm impacted outcomes.
Conclusion: Based on the current data, T1a and T1b nodules have similar survival outcomes, and survival is not significantly impacted even when they are left untreated. Multi-institutional prospective studies are needed to confirm these findings so that current observation and surveillance recommendations can be extended to certain T1 thyroid nodules.
Alternating hemiplegia of childhood (AHC) is a complex neurodevelopmental disorder characterized by paroxysmal and transient events of unilateral or bilateral paresis, usually occurring before 18 months of age. Mutations in the ATP1A3 gene, mainly p.Asp801Asn, p.Glu815Lys, and p.Gly947Arg at the protein level, are found in around 80% of individuals with AHC. Interestingly, these mutations reflect the degree of severity of the neurological symptoms (p.Glu815Lys > p.Asp801Asn > p.Gly947Arg). Among the ion transport proteins implicated in this disorder are N-type voltage-gated calcium channels, ATP-sensitive potassium channels, and the sodium/calcium exchanger. In this context, the management of AHC should be divided into the treatment of attacks, prophylactic treatment, and the management of comorbidities commonly found in this group of individuals, including epilepsy, attention-deficit/hyperactivity disorder, aggressive behavior, cognitive impairment, movement disorders, and migraine. The importance of an integrated approach by a multidisciplinary team, including neuropsychologists and dietitians, is worth mentioning, as is follow-up with a neurologist. In the present study, we propose new diagnostic criteria for AHC, divided into clinical, laboratory, supporting, and atypical features. We also review the location of the mutations in the ATP1A3 protein of individuals with AHC, rapid-onset dystonia-parkinsonism (RDP) variants, and early infantile epileptic encephalopathy (variants presenting with hemiplegic attacks), and include a section on animal models of ATP1A3 disorders.
Objective: To compare the results of treating patients with common bile duct (CBD) stones by endoscopic sphincterotomy (ES), surgical exploration, or a combination of ES and surgical CBD exploration (the rendezvous technique).
Methods: A narrative review of the literature.
Summary of Data: Before 1990, 17 cohort studies indicated that ES cleared CBD stones in 92.0% of patients, with a mortality rate of 1.5%. Surgery removed CBD stones in 90.2% of patients, with a 2.1% mortality rate. A single randomized controlled trial in 1987 showed that ES removed CBD stones in 91% of 55 patients, with a 3.6% mortality rate and a 27% complication rate, whereas surgical CBD exploration removed CBD stones in 92%, with a 1.8% mortality rate and a 22% complication rate. Since 1991, 26 randomized controlled trials have shown that laparoscopic–ES rendezvous is as effective as ES alone and laparoscopic surgery alone but is associated with fewer complications, a reduced need for additional procedures, and a shorter hospital stay.
Conclusions: A laparoscopic–ES rendezvous appears to be the optimal approach to the treatment of CBD stones in young, fit patients. The choice between ES alone and laparoscopic–ES rendezvous in older or high-risk patients remains uncertain.