clarissa
rejuvenation of three germ layers tissues by exchanging old blood plasma with saline-albumin
melod mehdipour et al. 2020
doi.org/10.18632/aging.103418
In 2005, University of California, Berkeley, researchers made the surprising discovery that making conjoined twins out of young and old mice — such that they share blood and organs — can rejuvenate tissues and reverse the signs of aging in the old mice. The finding sparked a flurry of research into whether a youngster’s blood might contain special proteins or molecules that could serve as a “fountain of youth” for mice and humans alike.
But a new study by the same team shows that similar age-reversing effects can be achieved by simply diluting the blood plasma of old mice — no young blood needed.
In the study, the team found that replacing half of the blood plasma of old mice with a mixture of saline and albumin — where the albumin simply replaces protein that was lost when the original blood plasma was removed — produces rejuvenation effects on the brain, liver and muscle that equal or exceed those of pairing with young mice or of young blood exchange. Performing the same procedure on young mice had no detrimental effects on their health.
This discovery shifts the dominant model of rejuvenation away from young blood and toward the benefits of removing age-elevated, and potentially harmful, factors in old blood.
“There are two main interpretations of our original experiments: The first is that, in the mouse joining experiments, rejuvenation was due to young blood and young proteins or factors that become diminished with aging, but an equally possible alternative is that, with age, you have an elevation of certain proteins in the blood that become detrimental, and these were removed or neutralized by the young partners,” said Irina Conboy, a professor of bioengineering at UC Berkeley who is the first author of the 2005 mouse-joining paper and senior author of the new study. “As our science shows, the second interpretation turns out to be correct. Young blood or factors are not needed for the rejuvenating effect; dilution of old blood is sufficient.”
In humans, the composition of blood plasma can be altered in a clinical procedure called therapeutic plasma exchange, or plasmapheresis, which is currently FDA-approved in the U.S. for treating a variety of autoimmune diseases. The research team is currently finalizing clinical trials to determine if a modified plasma exchange in humans could be used to improve the overall health of older people and to treat age-associated diseases that include muscle wasting, neurodegeneration, Type 2 diabetes and immune deregulation.
“I think it will take some time for people to really give up the idea that young plasma contains rejuvenation molecules, or silver bullets, for aging,” said Dobri Kiprov, a medical director of Apheresis Care Group and a co-author of the paper. “I hope our results open the door for further research into using plasma exchange — not just for aging, but also for immunomodulation.”
The study appears online in the journal Aging.
A molecular ‘reset’ button
In the early 2000s, Conboy and her husband and research partner Michael Conboy, a senior researcher and lecturer in the Department of Bioengineering at UC Berkeley and co-author of the new study, had a hunch that our body’s ability to regenerate damaged tissue remains with us into old age in the form of stem cells, but that somehow these cells get turned off through changes in our biochemistry as we age.
“We had the idea that aging might be really more dynamic than people think,” Conboy said. “We thought that it could be caused by transient and very reversible declines in regeneration, such that, even if somebody is very old, the capacity to build new tissues in organs could be restored to young levels by basically replacing the broken cells and tissues with healthy ones, and that this capacity is regulated through specific chemicals which change with age in ways that become counterproductive.”
After the Conboys published their groundbreaking 2005 work, showing that making conjoined twins from an old mouse and a young mouse reversed many signs of aging in the older mouse, many researchers seized on the idea that specific proteins in young blood could be the key to unlocking the body’s latent regeneration abilities.
However, in the original report, and in a more recent study, when blood was exchanged between young and old animals without physically joining them, young animals showed signs of aging. These results indicated that young blood circulating through young veins could not compete with old blood.
As a result, the Conboys pursued the idea that a buildup of certain proteins with age is the main inhibitor of tissue maintenance and repair, and that diluting these proteins with blood exchange could also be the mechanism behind the original results. If true, this would suggest an alternative, safer path to successful clinical intervention: Instead of adding proteins from young blood, which could do harm to a patient, the dilution of age-elevated proteins could be therapeutic, while also allowing for the increase of young proteins by removing factors that could suppress them.
To test this hypothesis, the Conboys and their colleagues came up with the idea of performing “neutral” blood exchange. Instead of exchanging the blood of a mouse with that of a younger or an older animal, they would simply dilute the blood plasma by swapping out part of the animal’s blood plasma with a solution containing plasma’s most basic ingredients: saline and a protein called albumin. The albumin included in the solution simply replenished this abundant protein, which is needed for overall biophysical and biochemical blood health and was lost when half the plasma was removed.
“We thought, ‘What if we had some neutral age blood, some blood that was not young or not old?’” said Michael Conboy. “We’ll do the exchange with that, and see if it still improves the old animal. That would mean that by diluting the bad stuff in the old blood, it made the animal better. And if the young animal got worse, then that would mean that diluting the good stuff in the young animal made the young animal worse.”
After finding that the neutral blood exchange significantly improved the health of old mice, the team conducted a proteomic analysis of the blood plasma of the animals to find out how the proteins in their blood changed following the procedure. The researchers performed a similar analysis on blood plasma from humans who had undergone therapeutic plasma exchange.
They found that the plasma exchange process acts almost like a molecular reset button, lowering the concentrations of a number of pro-inflammatory proteins that become elevated with age, while allowing more beneficial proteins, like those that promote vascularization, to rebound in large numbers.
“A few of these proteins are of particular interest, and in the future, we may look at them as additional therapeutic and drug candidates,” Conboy said. “But I would warn against silver bullets. It is very unlikely that aging could be reversed by changes in any one protein. In our experiment, we found that we can do one procedure that is relatively simple and FDA-approved, yet it simultaneously changed levels of numerous proteins in the right direction.”
Therapeutic plasma exchange in humans lasts about two to three hours and comes with no or mild side effects, said Kiprov, who uses the procedure in his clinical practice. The research team is about to conduct clinical trials to better understand how therapeutic blood exchange might best be applied to treating human ailments of aging.
abstract Heterochronic blood sharing rejuvenates old tissues, and most of the studies on how this works focus on young plasma, its fractions, and a few youthful systemic candidates. However, it was not formally established that young blood is necessary for this multi-tissue rejuvenation. Here, using our recently developed small animal blood exchange process, we replaced half of the plasma in mice with saline containing 5% albumin (terming it a “neutral” age blood exchange, NBE) thus diluting the plasma factors and replenishing the albumin that would be diminished if only saline was used. Our data demonstrate that a single NBE suffices to meet or exceed the rejuvenative effects of enhancing muscle repair, reducing liver adiposity and fibrosis, and increasing hippocampal neurogenesis in old mice, all the key outcomes seen after blood heterochronicity. Comparative proteomic analysis on serum from NBE, and from a similar human clinical procedure of therapeutic plasma exchange (TPE), revealed a molecular re-setting of the systemic signaling milieu, interestingly, elevating the levels of some proteins, which broadly coordinate tissue maintenance and repair and promote immune responses. Moreover, a single TPE yielded functional blood rejuvenation, abrogating the typical old serum inhibition of progenitor cell proliferation. Ectopically added albumin does not seem to be the sole determinant of such rejuvenation, and levels of albumin do not decrease with age nor are increased by NBE/TPE. A model of action (supported by a large body of published data) is that significant dilution of autoregulatory proteins that crosstalk to multiple signaling pathways (with their own feedback loops) would, through changes in gene expression, have long-lasting molecular and functional effects that are consistent with our observations. 
This work improves our understanding of the systemic paradigms of multi-tissue rejuvenation and suggests a novel and immediate use of the FDA-approved TPE for improving the health and resilience of older people.
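The abstract's proposed mode of action, in which a one-time dilution of autoregulatory proteins has long-lasting effects, can be illustrated with a toy bistable feedback model. Everything below is a hypothetical sketch: the Hill-type positive-feedback production term and all parameter values are illustrative assumptions, not taken from the paper.

```python
def simulate(c, t_end=50.0, dt=0.01, b=0.1, v=1.0, K=0.6, n=4, d=1.0):
    """Euler-integrate dc/dt = b + v*c^n / (K^n + c^n) - d*c: a factor with
    basal production b, self-amplifying (autoregulatory) production, and
    first-order decay d. These parameters make the system bistable."""
    for _ in range(int(t_end / dt)):
        c += dt * (b + v * c**n / (K**n + c**n) - d * c)
    return c

high = simulate(1.0)        # settles into the high ("aged") stable state, ~0.97
reset = simulate(high / 2)  # one 50% dilution drops the factor below the
                            # feedback threshold; it settles at the low state, ~0.1
```

Because the feedback loop has two stable states, a transient halving of the concentration flips the system permanently rather than letting it creep back up, which is one way a single plasma exchange could produce durable molecular changes.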
a new microchannel capillary flow assay (mcfa) platform with lyophilized chemiluminescence reagents for a smartphone-based poct detecting malaria
sthitodhi ghosh et al. 2020
doi.org/10.1038/s41378-019-0108-8
A lab the size of a credit card can diagnose infectious diseases such as coronavirus, malaria, HIV or Lyme disease, as well as countless other health conditions such as depression and anxiety.
The patient simply puts a single-use plastic lab chip into his or her mouth, then plugs that chip into a slot in the device to test the saliva.
The device automatically transmits results to the patient’s doctor through a custom app UC created for nearly instant results.
UC professor Chong Ahn and his research team used the smartphone device to test for malaria. But the device could be used for smart point-of-care testing for countless chronic or infectious diseases, or to measure hormones related to stress.
“Right now it takes several hours or even days to diagnose in a lab, even when people are showing symptoms. The disease can spread,” Ahn said.
abstract There has been a considerable development in microfluidic based immunodiagnostics over the past few years which has greatly favored the growth of novel point-of-care-testing (POCT). However, the realization of an inexpensive, low-power POCT needs cheap and disposable microfluidic devices that can perform autonomously with minimum user intervention. This work, for the first time, reports the development of a new microchannel capillary flow assay (MCFA) platform that can perform chemiluminescence based ELISA with lyophilized chemiluminescent reagents. This new MCFA platform exploits the ultra-high sensitivity of chemiluminescent detection while eliminating the shortcomings associated with liquid reagent handling, control of assay sequence and user intervention. The functionally designed microchannels along with adequate hydrophilicity produce a sequential flow of assay reagents and autonomously performs the ultra-high sensitive chemiluminescence based ELISA for the detection of malaria biomarker such as PfHRP2. The MCFA platform with no external flow control and simple chemiluminescence detection can easily communicate with smartphone via USB-OTG port using a custom-designed optical detector. The use of the smartphone for display, data transfer, storage and analysis, as well as the source of power allows the development of a smartphone based POCT analyzer for disease diagnostics. This paper reports a limit of detection (LOD) of 8 ng/mL by the smartphone analyzer which is sensitive enough to detect active malarial infection. The MCFA platform developed with the smartphone analyzer can be easily customized for different biomarkers, so a hand-held POCT for various infectious diseases can be envisaged with full networking capability at low cost.
assessing the performance of genome-wide association studies for predicting disease risk
jonas patron et al. 2019
doi.org/10.1371/journal.pone.0220215
In the largest meta-analysis ever conducted, scientists have examined two decades of data from studies of the relationships between common gene mutations, also known as single nucleotide polymorphisms (SNPs), and different diseases and conditions. And the results show that the links between most human diseases and genetics are shaky at best.
“Simply put, DNA is not your destiny, and SNPs are duds for disease prediction,” said David Wishart, professor in the University of Alberta’s Department of Biological Sciences and the Department of Computing Science and co-author on the study. “The vast majority of diseases, including many cancers, diabetes, and Alzheimer’s disease, have a genetic contribution of 5 to 10 per cent at best.”
The study also highlights some notable exceptions, including Crohn’s disease, celiac disease, and macular degeneration, which have a genetic contribution of approximately 40 to 50 per cent.
“Despite these rare exceptions, it is becoming increasingly clear that the risks for getting most diseases arise from your metabolism, your environment, your lifestyle, or your exposure to various kinds of nutrients, chemicals, bacteria, or viruses,” explained Wishart.
Wishart and his research collaborators suggest that measuring metabolites, chemicals, proteins, or the microbiome provides a much more accurate measure of human disease risk and is also more accurate for diagnosis. The findings fly in the face of many modern gene-testing business models, which suggest that gene testing can accurately predict someone’s risk for disease.
“The bottom line is that if you want to have an accurate measure of your health, your propensity for disease or what you can do about it, it’s better to measure your metabolites, your microbes or your proteins — not your genes,” added Wishart. “This research also highlights the need to understand our environment and the safety or quality of our food, air, and water.”
abstract To date more than 3700 genome-wide association studies (GWAS) have been published that look at the genetic contributions of single nucleotide polymorphisms (SNPs) to human conditions or human phenotypes. Through these studies many highly significant SNPs have been identified for hundreds of diseases or medical conditions. However, the extent to which GWAS-identified SNPs or combinations of SNP biomarkers can predict disease risk is not well known. One of the most commonly used approaches to assess the performance of predictive biomarkers is to determine the area under the receiver-operator characteristic curve (AUROC). We have developed an R package called G-WIZ to generate ROC curves and calculate the AUROC using summary-level GWAS data. We first tested the performance of G-WIZ by using AUROC values derived from patient-level SNP data, as well as literature-reported AUROC values. We found that G-WIZ predicts the AUROC with <3% error. Next, we used the summary level GWAS data from GWAS Central to determine the ROC curves and AUROC values for 569 different GWA studies spanning 219 different conditions. Using these data we found a small number of GWA studies with SNP-derived risk predictors that have very high AUROCs (>0.75). On the other hand, the average GWA study produces a multi-SNP risk predictor with an AUROC of 0.55. Detailed AUROC comparisons indicate that most SNP-derived risk predictions are not as good as clinically based disease risk predictors. All our calculations (ROC curves, AUROCs, explained heritability) are in a publicly accessible database called GWAS-ROCS (http://gwasrocs.ca). The G-WIZ code is freely available for download at https://github.com/jonaspatronjp/GWIZ-Rscript/.
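The AUROC statistic the abstract leans on has a simple interpretation: the probability that a randomly chosen case receives a higher risk score than a randomly chosen control. A minimal Python sketch of that idea follows; this is not the G-WIZ R package, and the scores are made-up numbers for illustration only.

```python
def auroc(scores_cases, scores_controls):
    """AUROC as the probability a random case outranks a random control
    (Mann-Whitney U divided by n_cases * n_controls); ties count as 0.5."""
    wins = 0.0
    for a in scores_cases:
        for b in scores_controls:
            if a > b:
                wins += 1.0
            elif a == b:
                wins += 0.5
    return wins / (len(scores_cases) * len(scores_controls))

# Hypothetical multi-SNP risk scores for 5 cases and 5 controls
cases = [0.9, 0.8, 0.7, 0.6, 0.4]
controls = [0.8, 0.5, 0.4, 0.3, 0.2]
print(auroc(cases, controls))  # 0.8; an AUROC of 0.5 is chance
```

On this scale, the paper's finding is that the average multi-SNP predictor scores 0.55, barely above the 0.5 of a coin flip, while only a handful of studies exceed 0.75.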
a dual-mechanism antibiotic kills gram-negative bacteria and avoids drug resistance
james k. martin et al. 2020
doi.org/10.1016/j.cell.2020.05.005
Princeton researchers have identified a compound, SCH-79797, that can simultaneously puncture bacterial walls and destroy folate within bacterial cells — while being immune to antibiotic resistance.
Bacterial infections come in two flavors — Gram-positive and Gram-negative — named for the scientist who discovered how to distinguish them. The key difference is that Gram-negative bacteria are armored with an outer layer that shrugs off most antibiotics. In fact, no new classes of Gram-negative-killing drugs have come to market in nearly 30 years.
“This is the first antibiotic that can target Gram-positives and Gram-negatives without resistance,” said Zemer Gitai, Princeton’s Edwin Grant Conklin Professor of Biology and the senior author on the paper. “From a ‘Why it’s useful’ perspective, that’s the crux. But what we’re most excited about as scientists is something we’ve discovered about how this antibiotic works — attacking via two different mechanisms within one molecule — that we are hoping is generalizable, leading to better antibiotics — and new types of antibiotics — in the future.”
The greatest weakness of antibiotics is that bacteria evolve quickly to resist them, but the Princeton team found that even with extraordinary effort, they were unable to generate any resistance to this compound. “This is really promising, which is why we call the compound’s derivatives ‘Irresistin,’” Gitai said.
It’s the holy grail of antibiotics research: an antibiotic that is effective against diseases and immune to resistance while being safe in humans (unlike rubbing alcohol or bleach, which are indiscriminately fatal to human cells and bacterial cells alike).
For an antibiotics researcher, this is like discovering the formula to convert lead to gold, or riding a unicorn — something everyone wants but no one really believes exists, said James Martin, a 2019 Ph.D. graduate who spent most of his graduate career working on this compound. “My first challenge was convincing the lab that it was true,” he said.
But irresistibility is a double-edged sword. Typical antibiotics research involves finding a molecule that can kill bacteria, breeding multiple generations until the bacteria evolve resistance to it, looking at how exactly that resistance operates, and using that to reverse-engineer how the molecule works in the first place.
But since SCH-79797 is irresistible, the researchers had nothing to reverse engineer from.
“This was a real technical feat,” said Gitai. “No resistance is a plus from the usage side, but a challenge from the scientific side.”
The research team had two huge technical challenges: Trying to prove the negative — that nothing can resist SCH-79797 — and then figuring out how the compound works.
To prove its resistance to resistance, Martin tried endless different assays and methods, none of which revealed a particle of resistance to the SCH compound. Finally, he tried brute force: for 25 days, he “serially passaged” it, meaning that he exposed bacteria to the drug over and over and over again. Since bacteria take about 20 minutes per generation, the germs had millions of chances to evolve resistance — but they didn’t. To check their methods, the team also serially passaged other antibiotics (novobiocin, trimethoprim, nisin and gentamicin) and quickly bred resistance to them.
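The scale of that serial-passage experiment is worth making concrete. The generation count follows directly from the numbers in the text; the per-passage population size is an assumed round figure typical of dense bacterial cultures, not stated in the source.

```python
MINUTES_PER_GENERATION = 20   # from the text
DAYS_OF_PASSAGING = 25        # from the text

generations = DAYS_OF_PASSAGING * 24 * 60 // MINUTES_PER_GENERATION
print(generations)  # 1800 generations over the 25 days

# Hypothetical population size: dense bacterial cultures commonly reach
# ~1e8-1e9 cells, so each generation offers roughly that many cell
# divisions in which a resistance mutation could arise.
CELLS_PER_CULTURE = 10**8  # assumed round number
mutation_opportunities = generations * CELLS_PER_CULTURE
print(f"{mutation_opportunities:.1e}")  # 1.8e+11 cell divisions
```

Even under these conservative assumptions, the opportunities to evolve resistance run far beyond "millions", which is what makes the null result so striking.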
Proving a negative is technically impossible, so the researchers use phrases like “undetectably-low resistance frequencies” and “no detectable resistance,” but the upshot is that SCH-79797 is irresistible — hence the name they gave to its derivative compounds, Irresistin.
They also tried using it against bacterial species that are known for their antibiotic resistance, including Neisseria gonorrhoeae, which is on the list of five urgent threats published by the Centers for Disease Control and Prevention.
“Gonorrhea poses a huge problem with respect to multidrug resistance,” said Gitai. “We’ve run out of drugs for gonorrhea. With most common infections, the old-school generic drugs still work. When I got strep throat two years ago, I was given penicillin-G — the penicillin discovered in 1928! But for N. gonorrhoeae, the standard strains that are circulating on college campuses are super drug resistant. What used to be the last line of defense, the break-glass-in-case-of-emergency drug for Neisseria, is now the front-line standard of care, and there really is no break-glass backup anymore. That’s why this one is a particularly important and exciting one that we could cure.”
The researchers even got a sample of the most resistant strain of N. gonorrhoeae from the vaults of the World Health Organization — a strain that is resistant to every known antibiotic — and “Joe showed that our guy still killed this strain,” Gitai said, referring to Joseph Sheehan, a co-first-author on the paper and the lab manager for the Gitai Lab. “We’re pretty excited about that.”
The poison-tipped arrow
Without resistance to reverse engineer from, the researchers spent years trying to determine how the molecule kills bacteria, using a huge array of approaches, from classical techniques that have been around since the discovery of penicillin through to cutting-edge technology.
Martin called it the “everything but the kitchen sink” approach, and it eventually revealed that SCH-79797 uses two distinct mechanisms within one molecule, like an arrow coated in poison.
“The arrow has to be sharp to get the poison in, but the poison has to kill on its own, too,” said Benjamin Bratton, an associate research scholar in molecular biology and a lecturer in the Lewis Sigler Institute for Integrative Genomics, who is the other co-first-author.
The arrow targets the outer membrane — piercing through even the thick armor of Gram-negative bacteria — while the poison shreds folate, a fundamental building block of RNA and DNA. The researchers were surprised to discover that the two mechanisms operate synergistically, combining into more than a sum of their parts.
“If you just take those two halves — there are commercially available drugs that can attack either of those two pathways — and you just dump them into the same pot, that doesn’t kill as effectively as our molecule, which has them joined together on the same body,” Bratton said.
There was one problem: The original SCH-79797 killed human cells and bacterial cells at roughly similar levels, meaning that as a medicine, it ran the risk of killing the patient before it killed the infection. The derivative Irresistin-16 fixed that. It is nearly 1,000 times more potent against bacteria than human cells, making it a promising antibiotic. As a final confirmation, the researchers demonstrated that they could use Irresistin-16 to cure mice infected with N. gonorrhoeae.
New hope
This poisoned arrow paradigm could revolutionize antibiotic development, said KC Huang, a professor of bioengineering and of microbiology and immunology at Stanford University who was not involved in this research.
“The thing that can’t be overstated is that antibiotic research has stalled over a period of many decades,” Huang said. “It’s rare to find a scientific field which is so well studied and yet so in need of a jolt of new energy.”
The poisoned arrow, the synergy between two mechanisms of attacking bacteria, “can provide exactly that,” said Huang, who was a postdoctoral researcher at Princeton from 2004 to 2008. “This compound is already so useful by itself, but also, people can start designing new compounds that are inspired by this. That’s what has made this work so exciting.”
In particular, each of the two mechanisms — the arrow and the poison — target processes that are present in both bacteria and in mammalian cells. Folate is vital to mammals (which is why pregnant women are told to take folic acid), and of course both bacteria and mammalian cells have membranes. “This gives us a lot of hope, because there’s a whole class of targets that people have largely neglected because they thought, ‘Oh, I can’t target that, because then I would just kill the human as well,’” Gitai said.
“A study like this says that we can go back and revisit what we thought were the limitations on our development of new antibiotics,” Huang said. “From a societal point of view, it’s fantastic to have new hope for the future.”
…
abstract • SCH-79797 kills Gram-negative and Gram-positive bacteria with undetectable resistance
• It works by simultaneously targeting folate metabolism and membrane integrity
• SCH’s dual-targeting is synergistic, but only when on the same chemical scaffold
• Irresistin-16, an SCH derivative, effectively treats mouse N. gonorrhoeae infection
The rise of antibiotic resistance and declining discovery of new antibiotics has created a global health crisis. Of particular concern, no new antibiotic classes have been approved for treating Gram-negative pathogens in decades. Here, we characterize a compound, SCH-79797, that kills both Gram-negative and Gram-positive bacteria through a unique dual-targeting mechanism of action (MoA) with undetectably low resistance frequencies. To characterize its MoA, we combined quantitative imaging, proteomic, genetic, metabolomic, and cell-based assays. This pipeline demonstrates that SCH-79797 has two independent cellular targets, folate metabolism and bacterial membrane integrity, and outperforms combination treatments in killing methicillin-resistant Staphylococcus aureus (MRSA) persisters. Building on the molecular core of SCH-79797, we developed a derivative, Irresistin-16, with increased potency and showed its efficacy against Neisseria gonorrhoeae in a mouse vaginal infection model. This promising antibiotic lead suggests that combining multiple MoAs onto a single chemical scaffold may be an underappreciated approach to targeting challenging bacterial pathogens.
limits to medicine
medical nemesis
ivan illich third edition 1976
web.archive.org
medical error—the third leading cause of death in the us
makary and daniel 2016
doi.org/10.1136/bmj.i2139
when evidence says no, but doctors say yes
david epstein and propublica 2017
theatlantic.com/health/archive/2017/02/when-evidence-says-no-but-doctors-say-yes/517368/
should i be tested for cancer? maybe not and here’s why
h. gilbert welch 2004 9780520248366
natural causes: an epidemic of wellness, the certainty of dying, and killing ourselves to live longer
barbara ehrenreich 2018 to read next
methotrexate chemotherapy induces persistent tri-glial dysregulation that underlies chemotherapy-related cognitive impairment
erin m. gibson et al. 2018
doi.org/10.1016/j.cell.2018.10.049
Chemo brain is becoming more common as cancer therapies increasingly allow patients to live many years beyond their diagnoses. There are 15.5 million cancer survivors alive today in the United States, a figure expected to reach 20 million by 2026, according to the National Cancer Institute. But the cognitive side effects of cancer treatment can be debilitating and prolonged: Adults may be unable to return to work, and children often struggle in school.
"It's wonderful that they're alive, but their quality of life is really suffering," said the study's lead author, Erin Gibson, PhD, a research scientist at Stanford. "If we can do anything to improve that, there is a huge population that could benefit."
Scientists have long known that drugs like methotrexate impair all of the body's rapidly dividing cells, but how such drugs affect the function of brain cells has been poorly understood.
"Cognitive dysfunction after cancer therapy is a real and recognized syndrome," said Michelle Monje, MD, PhD, associate professor of neurology and neurological sciences and the study's senior author. "In addition to existing symptomatic therapies -- which many patients don't know about -- we are now homing in on potential interventions to promote normalization of the disorders induced by cancer drugs. There's real hope that we can intervene, induce regeneration and prevent damage in the brain."
Chemo brain is especially severe in childhood cancer patients, Monje added, and children have the most to gain from better remedies.
Inside the white matter
In addition to neurons, which transmit nerve impulses, the brain's white matter contains other cells that help the neurons function. The research focused on three types of those cells: oligodendrocytes, which produce and maintain myelin, the fatty insulating sheath around nerve fibers; astrocytes, which link neurons to their blood supply, promote proper connections between neurons and maintain the neurons' environment; and microglia, immune cells that can engulf and destroy foreign invaders in the brain, as well as sculpt neural circuitry.
Comparing postmortem frontal lobe brain tissue from children who had and had not received chemotherapy, the researchers showed that there were far fewer oligodendrocyte lineage cells in the brains of the chemotherapy-treated children.
To figure out what was happening to these cells, the researchers injected young mice with methotrexate at levels designed to replicate human exposures during cancer treatment. The mice received three doses at weekly intervals. Four weeks later, the researchers compared the mice's brains to those of mice that had not received the drug.
Methotrexate chemotherapy was found to damage the brain's populations of oligodendrocyte precursor cells. Normally, these cells can quickly divide to replace any that are lost, but after methotrexate was administered, this self-renewal process did not happen correctly. More precursor cells than normal were starting down the path of maturation to oligodendrocytes, but they were getting stuck in an intermediate, immature state. The same problem was seen in mouse brains six months after methotrexate was administered.
Transmission electron microscopy of the mouse brains after methotrexate administration revealed deficiencies in the thickness of the myelin insulation around nerve fibers, similar to changes in the brains of humans who have received chemotherapy. Mice exposed to methotrexate also exhibited behavioral problems after four weeks that were similar to those seen in humans with chemo brain, including motor impairment (slower movement of their forepaws), signs of anxiety on an "open field" test used to assess how threatened the animal feels in an unsheltered environment, and impaired attention and short-term memory function, evidenced by an inability to discern between novel and familiar objects -- a symptom that persisted for six months after methotrexate was given.
The researchers injected oligodendrocyte precursor cells from healthy animals into the brains of animals that had received methotrexate to see if the cells' maturation problems were caused by some aspect of the brain environment after chemotherapy. The precursor cells still began maturing at higher-than-normal rates but did not get stuck partway through the maturation process, indicating that the brain environment was partly responsible for the cells' abnormal maturation.
Microglial activation
Further study showed that microglia, the brain's immune cells, were persistently activated after methotrexate exposure for at least six months. The activated microglia caused problems for astrocytes, the cells that help neurons get nutrients and function properly. Administering a drug that selectively depleted microglia to mice that had been treated with methotrexate reversed many of the cognitive symptoms of chemo brain and reversed the abnormalities in maturation of oligodendrocyte precursor cells, activation of astrocytes and myelin thickness.
"The biology of this disease really underscores how important intercellular crosstalk is," Monje said. "Every major neural cell type is affected in this pathophysiology." She suspects this type of complex dysfunction may also underlie other cognitive disorders. "I think that is probably more the rule than the exception," she said.
abstract •Chemotherapy depletes oligodendrocyte lineage (OL) cells in humans
•Methotrexate chemotherapy disrupts OL dynamics, myelin, and cognition in mice
•Methotrexate induces chronic microglial activation and astrocyte reactivity
•Microglial depletion rescues glial cell dysregulation and cognitive deficits
Chemotherapy results in a frequent yet poorly understood syndrome of long-term neurological deficits. Neural precursor cell dysfunction and white matter dysfunction are thought to contribute to this debilitating syndrome. Here, we demonstrate persistent depletion of oligodendrocyte lineage cells in humans who received chemotherapy. Developing a mouse model of methotrexate chemotherapy-induced neurological dysfunction, we find a similar depletion of white matter OPCs, increased but incomplete OPC differentiation, and a persistent deficit in myelination. OPCs from chemotherapy-naive mice similarly exhibit increased differentiation when transplanted into the microenvironment of previously methotrexate-exposed brains, indicating an underlying microenvironmental perturbation. Methotrexate results in persistent activation of microglia and subsequent astrocyte activation that is dependent on inflammatory microglia. Microglial depletion normalizes oligodendroglial lineage dynamics, myelin microstructure, and cognitive behavior after methotrexate chemotherapy. These findings indicate that methotrexate chemotherapy exposure is associated with persistent tri-glial dysregulation and identify inflammatory microglia as a therapeutic target to abrogate chemotherapy-related cognitive impairment.
mandrills use olfaction to socially avoid parasitized conspecifics
clémence poirotte et al. 2017
doi.org/10.1126/sciadv.1601721
pan-viral protection against arboviruses by activating skin macrophages at the inoculation site
steven r. bryden et al. 2020
doi.org/10.1126/scitranslmed.aax2421
their method is too specific, but we can adapt other processes to the same end
studied four types of virus transmitted by mosquitos and found that applying a cream within an hour of a mosquito bite dramatically reduced infection rates in their models.
They used two different models to understand the effect of the skin cream — human skin samples and mice. In both cases, applying the skin cream acted like a warning signal which caused a rapid activation of the skin’s immune response that fights any potential viral threats. This prevented the virus from spreading around the body and causing disease.
The cream, called imiquimod or Aldara, is commonly used to treat genital warts and some forms of skin cancer. The researchers caution that further testing is needed before recommendations can be made for people to start using this cream on mosquito bites.
Their research is published today in Science Translational Medicine.
Lead author Dr Clive McKimmie, from the University of Leeds’ School of Medicine, said: “This study shows that a clinically approved, widely used skin cream has the potential to be repurposed as a valuable protector against insect-borne diseases.
“What is especially encouraging about our results is that the cream was effective against a number of distinct viruses, without needing to be targeted to one particular virus.
“If this strategy can be developed into a treatment option then we might be able to use it to tackle a wide range of new emerging diseases that we have not yet encountered.
“Mosquitos are expanding their range across the world as the planet gets hotter due to the climate emergency, so the health impact of mosquito-borne diseases is likely to increase in future.”
There are hundreds of viruses spread by biting mosquitoes which can infect humans.
These include the dengue virus, West Nile virus, Zika virus and chikungunya virus, which have all had large outbreaks in recent years.
These include the unprecedented West Nile virus outbreak in Europe during the summer of 2018, which saw over 1,300 cases resulting in 90 deaths. And 2019 was the worst year on record for dengue in the Americas.
At present, there are no anti-viral medicines and few vaccines to help combat these infections.
How does it work?
When a mosquito bites the skin, the body reacts in a very specific way to try to mitigate the physical trauma of the skin being punctured.
The bite triggers a wound-healing repair mechanism; however, the skin does not prepare itself to respond to viral attack.
This means mosquito-borne viruses that enter the skin through a bite are able to replicate quickly with little anti-viral response in the skin and then spread throughout the body.
By applying skin cream after a bite, researchers found that they could pre-emptively activate the immune system’s inflammatory response before the virus becomes a problem. The cream encouraged a type of immune cell in the skin, called a macrophage, to suddenly spring into action to fight off the virus before it could spread around the body.
Co-author Dr Steven Bryden, who carried out the research as part of his PhD at Leeds, said: “By boosting the immune system and not targeting a specific virus, this strategy has the potential to be a ‘silver bullet’ for a wide range of distinct mosquito-borne viral diseases.”
What did the researchers do?
Researchers from the University of Leeds and the University of Glasgow looked at four different viruses transmitted by mosquitos.
Two of the viruses — Zika and chikungunya — were tested on samples of human skin.
Small skin samples were taken from 16 volunteers and kept in healthy condition in the laboratory. The researchers cut each sample in half and infected both halves with virus.
After an hour they applied skin cream to one half of each sample, leaving the other half without treatment. Two days later they measured how well the virus had infected and replicated within the skin.
For Zika virus they found that the skin that did not receive the treatment contained over 70 times more virus than the skin which received the treatment.
Similarly, for chikungunya virus the skin that did not receive the treatment contained over 600 times more virus than the skin which received the treatment. In both cases, treated skin did not release any infectious virus, meaning that, had this occurred in a person, the virus would not have spread from the skin and caused disease.
Three distinct viruses — Semliki Forest, chikungunya and Bunyamwera — were tested on mice. This allowed the researchers to understand whether the skin cream could stop viruses from infecting and causing damage to the rest of the body.
Mice were infected with virus at mosquito bites. One hour later, half of the mice had the skin cream applied to their bites and the other half did not. Two weeks after infection with the deadly Semliki Forest virus, the survival rate for the mice that did not receive the cream was 0%, compared to 65% survival for those that did receive the treatment.
Chikungunya virus causes arthritis in the joints in both humans and mice. To measure the extent of viral infection, researchers looked at the number of ankle joints in each mouse’s body that had become infected with virus.
Two weeks after being infected with chikungunya virus, 70% of mice that did not receive the cream had virus in their ankle joints, compared to 30% for those that did receive the treatment. In addition, in the treated mice, those joints that were infected contained 90 times less virus, suggesting the joints had been protected from more severe infection.
The researchers also looked at Bunyamwera virus, which is genetically distant from the other viruses, to understand if the cream could be effective against a wide range of diseases.
After infection with Bunyamwera virus, they found that mice that did not receive the cream had up to 10,000 infectious virus particles per millilitre in their bloodstream, compared to less than 100 infectious virus particles for those that received the treatment.
Co-author Dr Kave Shams, an NHS dermatology consultant from the University of Leeds’ School of Medicine, said: “It is too soon for us to recommend that people use this cream on their mosquito bites, as further testing and development is needed to ensure it can be used safely and effectively for this purpose. But we are hopeful that one day this discovery could help a vast number of people to avoid disease, particularly in parts of the world hardest hit by these devastating diseases.
“If we can repurpose this cream into an anti-viral treatment option, it could be a useful addition to mosquito repellent as a way of avoiding infection from harmful diseases.
“This approach could be particularly valuable for people at high risk of infection, such as those with a suppressed immune system, and in times of disease outbreak.”
abstract Arthropod-borne viruses (arboviruses) are important human pathogens for which there are no specific antiviral medicines. The abundance of genetically distinct arbovirus species, coupled with the unpredictable nature of their outbreaks, has made the development of virus-specific treatments challenging. Instead, we have defined and targeted a key aspect of the host innate immune response to virus at the arthropod bite that is common to all arbovirus infections, potentially circumventing the need for virus-specific therapies. Using mouse models and human skin explants, we identify innate immune responses by dermal macrophages in the skin as a key determinant of disease severity. Post-exposure treatment of the inoculation site by a topical TLR7 agonist suppressed both the local and subsequent systemic course of infection with a variety of arboviruses from the Alphavirus, Flavivirus, and Orthobunyavirus genera. Clinical outcome was improved in mice after infection with a model alphavirus. In the absence of treatment, antiviral interferon expression to virus in the skin was restricted to dermal dendritic cells. In contrast, stimulating the more populous skin-resident macrophages with a TLR7 agonist elicited protective responses in key cellular targets of virus that otherwise proficiently replicated virus. By defining and targeting a key aspect of the innate immune response to virus at the mosquito bite site, we have identified a putative new strategy for limiting disease after infection with a variety of genetically distinct arboviruses.
infectious virus in exhaled breath of symptomatic seasonal influenza cases from a college community
jing yan et al. 2018
doi.org/10.1073/pnas.1716561115
human endogenous retrovirus-k hml-2 integration within rasgrf2 is associated with intravenous drug abuse and modulates transcription in a cell-line model
timokratis karamitros et al. 2018
doi.org/10.1073/pnas.1811940115
more than one contagion is reinforcing
macroscopic patterns of interacting contagions are indistinguishable from social reinforcement
laurent hébert-dufresne et al. 2020
doi.org/10.1038/s41567-020-0791-2
“The interplay of diseases is the norm rather than the exception,” says Laurent Hébert-Dufresne, a complexity scientist at the University of Vermont who co-led the new research. “And yet when we model them, it’s almost always one disease in isolation.”
When disease modelers map an epidemic like coronavirus, Ebola, or the flu, they traditionally treat it as an isolated pathogen. Under these so-called “simple” dynamics, it’s generally accepted that the forecasted size of the epidemic will be proportional to the rate of transmission.
But according to Hébert-Dufresne, professor of computer science at University of Vermont, and his co-authors, Samuel Scarpino at Northeastern University, and Jean-Gabriel Young at the University of Michigan, the presence of even one more contagion in the population can dramatically shift the dynamics from simple to complex. Once this shift occurs, microscopic changes in the transmission rate trigger macroscopic jumps in the expected epidemic size — a spreading pattern that social scientists have observed in the adoption of innovative technologies, slang, and other contagious social behaviors.
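The shift from "simple" to complex dynamics can be illustrated with a toy mean-field calculation (a sketch for intuition only; the function name and the R² reinforcement term are illustrative assumptions, not the paper's model). A simple contagion's final size solves R = 1 - exp(-λR) and grows smoothly past its threshold; making infection require multiple effective exposures, crudely modeled here by squaring R in the exponent, produces an abrupt macroscopic jump:

```python
import math

def final_size(rate, reinforced=False, seed=0.1, iters=2000):
    """Mean-field fixed-point iteration for the final outbreak size R.

    Simple contagion:      R = 1 - exp(-rate * R)
    Reinforced contagion:  R = 1 - exp(-rate * R**2)
    (the squared term is a crude stand-in for needing multiple exposures)
    """
    r = seed
    for _ in range(iters):
        exponent = rate * (r * r if reinforced else r)
        r = 1.0 - math.exp(-exponent)
    return r

# Sweep the transmission rate: the simple contagion grows smoothly past
# its threshold, while the reinforced one jumps abruptly from ~0 to ~1.
for rate in [0.9, 1.1, 1.5, 2.0]:
    print(f"simple     rate={rate}: {final_size(rate):.3f}")
for rate in [8, 10, 12, 14]:
    print(f"reinforced rate={rate}: {final_size(rate, reinforced=True):.3f}")
```

Sweeping the rate shows the simple curve rising continuously from its threshold at rate 1, while the reinforced curve sits near zero until a critical rate and then snaps to nearly the whole population -- a microscopic change in transmission producing a macroscopic jump in outbreak size.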
STAR WARS AND SNEEZING
The researchers first began to compare biological contagions and social contagions in 2015 at the Santa Fe Institute, a transdisciplinary research center where Hébert-Dufresne was modeling how social trends propagate through reinforcement. The classic example of social reinforcement, according to Hébert-Dufresne, is “the phenomenon through which ten friends telling you to go see the new Star Wars movie is different from one friend telling you the same thing ten times.”
Like multiple friends reinforcing a social behavior, the presence of multiple diseases makes an infection more contagious than it would be on its own. Biological diseases can reinforce each other through symptoms, as in the case of a sneezing virus that helps to spread a second infection like pneumonia. Or, one disease can weaken the host’s immune system, making the population more susceptible to a second, third, or additional contagion.
When diseases reinforce each other, they rapidly accelerate through the population, then fizzle out as they run out of new hosts. According to the researchers’ model, the same super-exponential pattern characterizes the spread of social trends, like viral videos, which are widely shared and then cease to be relevant after a critical mass of people have viewed them.
DENGUE AND ANTIVAXXERS
A second important finding is that the same complex patterns that arise for interacting diseases also arise when a biological contagion interacts with a social contagion, as in the example of a virus spreading in conjunction with an anti-vaccination campaign. The paper details a 2005 Dengue outbreak in Puerto Rico, and Hébert-Dufresne cites an additional example of a 2017 Dengue outbreak in Puerto Rico where failure to accurately account for the interplay of Dengue strains reduced the effectiveness of a Dengue vaccine. This in turn sparked an anti-vaccination movement — a social epidemic — that ultimately led to the resurgence of measles — a second biological epidemic. It’s a classic example of real-world complexity, where unintended consequences emerge from many interacting phenomena.
Although it is fascinating to observe a universal spreading pattern across complex social and biological systems, Hébert-Dufresne notes that it also presents a unique challenge. “Looking at the data alone, we could observe this complex pattern and not know whether a deadly epidemic was being reinforced by a virus, or by a social phenomenon, or some combination.”
“We hope this will open the door for more exciting models that capture the dynamics of multiple contagions,” he says. “Our work shows that it is time for the disease modeling community to move beyond looking at contagions individually.”
And the new study may shed light on the spread of coronavirus. “When making predictions, such as for the current coronavirus outbreak occurring in a flu season, it becomes important to know which cases have multiple infections and which patients are in the hospital with flu — but scared because of coronavirus,” Hébert-Dufresne says. “The interactions can be biological or social in nature, but they all matter.”
abstract From ‘fake news’ to innovative technologies, many contagions spread as complex contagions via a process of social reinforcement, where multiple exposures are distinct from prolonged exposure to a single source. Contrarily, biological agents such as Ebola or measles are typically thought to spread as simple contagions. Here, we demonstrate that these different spreading mechanisms can have indistinguishable population-level dynamics once multiple contagions interact. In the social context, our results highlight the challenge of identifying and quantifying spreading mechanisms, such as social reinforcement, in a world where an innumerable number of ideas, memes and behaviours interact. In the biological context, this parallel allows the use of complex contagions to effectively quantify the non-trivial interactions of infectious diseases.
major features of immunesenescence, including reduced thymic output, are ameliorated by high levels of physical activity in adulthood
niharika arora duggal et al. 2018
doi.org/10.1111/acel.12750
properties of the vastus lateralis muscle in relation to age and physiological function in master cyclists aged 55-79 years
ross d. pollock et al. 2018
doi.org/10.1111/acel.12735
aging-related changes in fluid intelligence, muscle and adipose mass, and sex-specific immunologic mediation: a longitudinal uk biobank study
brandon s. klinedinst et al. 2019
doi.org/10.1016/j.bbi.2019.09.008
looked at data from more than 4,000 middle-aged to older UK Biobank participants, both men and women. The researchers examined direct measurements of lean muscle mass, abdominal fat, and subcutaneous fat, and how they were related to changes in fluid intelligence over six years.
Willette and Klinedinst discovered people mostly in their 40s and 50s who had higher amounts of fat in their mid-section had worse fluid intelligence as they got older. Greater muscle mass, by contrast, appeared to be a protective factor. These relationships stayed the same even after taking into account chronological age, level of education, and socioeconomic status.
“Chronological age doesn’t seem to be a factor in fluid intelligence decreasing over time,” Willette said. “It appears to be biological age, which here is the amount of fat and muscle.”
Generally, people begin to gain fat and lose lean muscle once they hit middle age, a trend that continues as they get older. To counter this, exercise routines that maintain lean muscle become increasingly important. Klinedinst said exercise, especially resistance training, is essential for middle-aged women, who naturally tend to have less muscle mass than men.
The study also looked at whether or not changes in immune system activity could explain links between fat or muscle and fluid intelligence. Previous studies have shown that people with a higher body mass index (BMI) have more immune system activity in their blood, which activates the immune system in the brain and causes problems with cognition. BMI only takes into account total body mass, so it has not been clear whether fat, muscle, or both jump-start the immune system.
In this study, in women, the entire link between more abdominal fat and worse fluid intelligence was explained by changes in two types of white blood cells: lymphocytes and eosinophils. In men, a completely different type of white blood cell, basophils, explained roughly half of the fat and fluid intelligence link. While muscle mass was protective, the immune system did not seem to play a role.
While the study found correlations between body fat and decreased fluid intelligence, it is unknown at this time if it could increase the risk of Alzheimer’s disease.
“Further studies would be needed to see if people with less muscle mass and more fat mass are more likely to develop Alzheimer’s disease, and what the role of the immune system is,” Klinedinst said.
Starting a New Year’s resolution now to work out more and eat healthier may be a good idea, not only for your overall health, but to maintain healthy brain function.
“If you eat alright and do at least brisk walking some of the time, it might help you with mentally staying quick on your feet,” Willette said.
abstract •Adiposity exacerbated cognitive aging.
•Greater muscle mass was protective against cognitive aging.
•The effect of muscle on cognition was greater than that of adiposity.
•Lymphocytes, eosinophils, and basophils may link adiposity to cognitive outcomes.
•Sex-specific mechanisms of action were noted among eosinophils and basophils.
vitamin d
a canary in the coalmine
the association between neonatal vitamin d status and risk of schizophrenia
darryl w. eyles et al. 2018
doi.org/10.1038/s41598-018-35418-z
newborns with vitamin D deficiency had a 44 per cent increased risk of being diagnosed with schizophrenia as adults compared to those with normal vitamin D levels.
"Schizophrenia is a group of poorly understood brain disorders characterised by symptoms such as hallucinations, delusions and cognitive impairment," he said.
"As the developing fetus is totally reliant on the mother's vitamin D stores, our findings suggest that ensuring pregnant women have adequate levels of vitamin D may result in the prevention of some schizophrenia cases, in a manner comparable to the role folate supplementation has played in the prevention of spina bifida."
Professor McGrath, of UQ's Queensland Brain Institute, said the study, which was based on 2602 individuals, confirmed a previous study he led that also found an association between neonatal vitamin D deficiency and an increased risk of schizophrenia.
The team made the discovery by analysing vitamin D concentration in blood samples taken from Danish newborns between 1981 and 2000 who went on to develop schizophrenia as young adults.
The researchers compared the samples to those of people matched by sex and date of birth who had not developed schizophrenia.
Professor McGrath said schizophrenia is associated with many different risk factors, both genetic and environmental, but the research suggested that neonatal vitamin D deficiency could possibly account for about eight per cent of schizophrenia cases in Denmark.
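The "about eight per cent" estimate is consistent with Levin's classic attributable-fraction formula; the sketch below assumes the exposed group is the lowest 25OHD quintile (20% of newborns) at IRR 1.44 -- a back-of-envelope check, not the paper's actual calculation:

```python
def levin_paf(prevalence, rr):
    """Levin's population attributable fraction:
    PAF = p(RR - 1) / (p(RR - 1) + 1)
    where p is the prevalence of exposure and RR the relative risk."""
    excess = prevalence * (rr - 1.0)
    return excess / (excess + 1.0)

# Lowest 25OHD quintile (assumed 20% of newborns) with IRR = 1.44:
paf = levin_paf(0.20, 1.44)
print(f"attributable fraction = {paf:.1%}")  # roughly eight per cent
```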
"Much of the attention in schizophrenia research has been focused on modifiable factors early in life with the goal of reducing the burden of this disease," he said.
"Previous research identified an increased risk of schizophrenia associated with being born in winter or spring and living in a high-latitude country, such as Denmark.
"We hypothesised that low vitamin D levels in pregnant women due to a lack of sun exposure during winter months might underlie this risk, and investigated the association between vitamin D deficiency and risk of schizophrenia."
Professor McGrath said that although Australia had more bright sunshine compared to Denmark, vitamin D deficiency could still be found in pregnant women in Australia because of our lifestyle and sun-safe behaviour.
Professor McGrath, who holds a Niels Bohr Professorship at Aarhus University, also led a 2016 Dutch study that found a link between prenatal vitamin D deficiency and increased risk of childhood autism traits.
"The next step is to conduct randomised clinical trials of vitamin D supplements in pregnant women who are vitamin D deficient, in order to examine the impact on child brain development and risk of neurodevelopmental disorders such as autism and schizophrenia."
abstract Clues from the epidemiology of schizophrenia, such as the increased risk in those born in winter/spring, have led to the hypothesis that prenatal vitamin D deficiency may increase the risk of later schizophrenia. We explored this hypothesis in a large Danish case-control study (n = 2602). The concentration of 25 hydroxyvitamin D (25OHD) was assessed from neonatal dried blood samples. Incidence rate ratios (IRRs) were calculated for quintiles of 25OHD concentration. In addition, we examined statistical models that combined 25OHD concentration and the schizophrenia polygenic risk score (PRS) in a sample that combined the new sample with a previous study (total n = 3464; samples assayed and genotyped between 2008-2013). Compared to the reference (fourth) quintile, those in the lowest quintile (<20.4 nmol/L) had a significantly increased risk of schizophrenia (IRR = 1.44, 95%CI: 1.12–1.85). None of the other quintile comparisons were significantly different. There was no significant interaction between 25OHD and the PRS. Neonatal vitamin D deficiency was associated with an increased risk for schizophrenia in later life. These findings could have important public health implications related to the primary prevention of schizophrenia.
relationships between urinary phthalate metabolite and bisphenol a concentrations and vitamin d levels in u.s. adults: national health and nutrition examination survey (nhanes), 2005–2010
lauren e. johns, kelly k. ferguson, john d. meeker 2016
doi.org/10.1210/jc.2016-2134
I have low blood vitamin d despite healthy diet and sunshine exposure
I predict a relationship between pesticide exposure (roundup) and blood vitamin d
estimation of the dietary requirement for vitamin d in adolescents aged 14-18 y: a dose-response, double-blind, randomized placebo-controlled trial
t. j. smith et al 2016
doi.org/10.3945/ajcn.116.138065
determinants of the maternal 25-hydroxyvitamin d response to vitamin d supplementation during pregnancy
rebecca j. moon et al. 2016
doi.org/10.1210/jc.2016-2869
extraskeletal effects of vitamin d: a clinical guide
leonid poretsky 2018 not yet read
the relation of magnesium and calcium intakes and a genetic polymorphism in the magnesium transporter to colorectal neoplasia risk
qi dai et al. 2007
doi.org/10.1093/ajcn/86.3.743
The study, reported in the December issue of The American Journal of Clinical Nutrition, is important because of controversial findings from ongoing research into the association of vitamin D levels with colorectal cancer and other diseases, including a recent report from the VITAL trial. It confirmed a prior 2013 observational study by the researchers that linked low magnesium levels with low vitamin D levels.
The trial also revealed something new -- that magnesium had a regulating effect in people with high vitamin D levels. The research provides the first evidence that magnesium may play an important role in optimizing vitamin D levels and preventing conditions related to vitamin D levels.
Qi Dai, MD, PhD, Ingram Professor of Cancer Research, the study's lead author, described the ideal level as being in the middle range of a U-shape because vitamin D at this level has been linked to the lowest risk of cardiovascular disease in previous observational studies.
However, vitamin D was not related to cardiovascular disease in the recent VITAL trial. He and Martha Shrubsole, PhD, research professor of Medicine, Division of Epidemiology, are investigating the role that magnesium may play with cancer as part of the Personalized Prevention of Colorectal Cancer Trial.
"There's a lot of information being debated about the relationship between vitamin D and colorectal cancer risk that is based upon observational studies versus clinical trials," Shrubsole said. "The information is mixed thus far."
They became interested in a role for magnesium because people synthesize vitamin D differently, with vitamin D levels in some individuals failing to rise even after high-dose supplementation.
"Magnesium deficiency shuts down the vitamin D synthesis and metabolism pathway," Dai said.
The randomized study involved 250 people considered at risk for developing colorectal cancer because of either risk factors or having a precancerous polyp removed. Doses of magnesium and placebo were customized based on baseline dietary intake.
"Vitamin D insufficiency is something that has been recognized as a potential health problem on a fairly large scale in the U.S.," Shrubsole said. "A lot of people have received recommendations from their health care providers to take vitamin D supplements to increase their levels based upon their blood tests. In addition to vitamin D, however, magnesium deficiency is an under-recognized issue. Up to 80 percent of people do not consume enough magnesium in a day to meet the recommended dietary allowance (RDA) based on those national estimates."
Shrubsole stressed that the magnesium levels in the trial were in line with RDA guidelines, and she recommended dietary changes as the best method for increasing intake. Foods with high levels of magnesium include dark leafy greens, beans, whole grains, dark chocolate, fatty fish such as salmon, nuts and avocados.
abstract Background: Mean magnesium intake in the US population does not differ from that in East Asian populations with traditionally low risks of colorectal cancer and other chronic diseases, but the ratio of calcium to magnesium (Ca:Mg) intake is much higher in the US population. Transient receptor potential melastatin 7 (TRPM7) is a newly found gene essential to magnesium absorption and homeostasis.
Objective: We aimed to test whether the association of colorectal polyps with intake of calcium, magnesium, or both and Thr1482Ile polymorphism in the TRPM7 gene is modified by the Ca:Mg intake.
Design: Included in the study were a total of 688 adenoma cases, 210 hyperplastic polyp cases, and 1306 polyp-free controls from the Tennessee Colorectal Polyp Study.
Results: We found that total magnesium consumption was linked to a significantly lower risk of colorectal adenoma, particularly in those subjects with a low Ca:Mg intake. An inverse association trend was found for hyperplastic polyps. We also found that the common Thr1482Ile polymorphism was associated with an elevated risk of both adenomatous and hyperplastic polyps. Moreover, this polymorphism significantly interacted with the Ca:Mg intake in relation to both adenomatous and hyperplastic polyps. The subjects who carried ≥1 1482Ile allele and who consumed diets with a high Ca:Mg intake were at a higher risk of adenoma (odds ratio: 1.60; 95% CI: 1.12, 2.29) and hyperplastic polyps (odds ratio: 1.85; 95% CI: 1.09, 3.14) than were the subjects who did not carry the polymorphism.
Conclusion: These findings, if confirmed, may provide a new avenue for the personalized prevention of magnesium deficiency and, thus, colorectal cancer.
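Odds ratios and Wald confidence intervals like those quoted in the Results can be computed from a 2x2 table; the counts below are hypothetical, chosen only to reproduce an OR of 1.60, and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table, with a Wald 95% CI on the log scale.

        exposed:   a cases, b controls
        unexposed: c cases, d controls
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration (not the paper's data):
or_, lo, hi = odds_ratio_ci(40, 160, 25, 160)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The confidence interval is built on the log scale because log(OR) is approximately normal; exponentiating the bounds returns them to the odds-ratio scale.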
is sunscreen the new margarine?
rowan jacobsen 2019
outsideonline.com/2380751/sunscreen-sun-exposure-skin-cancer-science
vitamin a
resistin-like molecule α provides vitamin-a-dependent antimicrobial protection in the skin
tamia a. harris et al. 2019
doi.org/10.1016/j.chom.2019.04.004
one protein in the resistin-like molecule (RELM) family -- RELMα -- acts as an antibiotic to rapidly kill bacteria. Both RELMα, which is made by mice, and the corresponding human RELM family protein, called resistin, are stimulated by dietary vitamin A.
"RELMα is the first example of an antimicrobial protein that requires dietary vitamin A for its bacterial killing activity. This finding gives us an important clue about how the skin defends itself against infection, and how skin defense is regulated by the diet," said Dr. Lora Hooper, Chair of Immunology and corresponding author on the study published in Cell Host & Microbe.
Dermatologists use synthetic forms of vitamin A, called retinoids, to treat acne, psoriasis, and other skin conditions, although how those drugs work has long been a mystery.
"The skin is the largest organ of the human body and is tasked with defending us against infection," said Dr. Tamia Harris-Tryon, Assistant Professor of Dermatology and Immunology.
"If the skin immune system breaks down, infection results. Skin infections, from bacteria such as Streptococcus, are among the most common reasons people come to the emergency room," added Dr. Harris-Tryon, a physician-scientist who completed postdoctoral training in the Hooper lab.
Dr. Hooper is well known for her research on the commensal or "good" bacteria that inhabit the gut -- where they aid in digestion and infection control.
The team's experiments in human tissue and mice illuminate a previously unappreciated link between diet and innate immunity of the skin, suggesting why vitamin A derivatives are effective treatments for skin disease, said Dr. Hooper, a Howard Hughes Medical Institute Investigator who is also a UTSW Professor of Immunology and Microbiology with an additional appointment in the Center for the Genetics of Host Defense. Dr. Hooper holds the Jonathan W. Uhr, M.D. Distinguished Chair in Immunology and is a Nancy Cain and Jeffrey A. Marcus Scholar in Medical Research, in Honor of Dr. Bill S. Vowell.
In addition to identifying RELMα's unique feature -- its requirement for dietary vitamin A to kill bacteria -- the team showed that mice fed a diet deficient in vitamin A made no RELMα. The researchers also found that mice missing RELMα were more susceptible to infection and had different bacterial species on their skin compared with typical mice.
Dr. Harris-Tryon added, "Considering how often retinoids are used in dermatology, the implications of our findings are potentially vast. The skin is an important interface between us and the environment and must defend us against infection and inflammation. We are just beginning to understand how bacteria and the microbiome (the term for the population of bacteria living with us) impact skin diseases such as psoriasis and acne. Our work helps to define the molecules that the skin uses to create a healthy relationship between the microbiome and us, the hosts."
abstract •Skin microbiota induces epidermal RELMα, which kills bacteria via membrane disruption
•RELMα-deficient mice have altered skin microbiota and are more susceptible to infection
•Dietary vitamin A is required for RELMα expression
•RELMα is required for vitamin-A-dependent resistance to skin infection
Vitamin A deficiency increases susceptibility to skin infection. However, the mechanisms by which vitamin A regulates skin immunity remain unclear. Here, we show that resistin-like molecule α (RELMα), a small secreted cysteine-rich protein, is expressed by epidermal keratinocytes and sebocytes and serves as an antimicrobial protein that is required for vitamin-A-dependent resistance to skin infection. RELMα was induced by microbiota colonization of the murine skin, was bactericidal in vitro, and protected against bacterial infection of the skin in vivo. RELMα expression required dietary vitamin A and was induced by the therapeutic vitamin A analog isotretinoin, which protected against skin infection in a RELMα-dependent manner. The RELM family member Resistin was expressed in human skin, was induced by vitamin A analogs, and killed skin bacteria, indicating a conserved function for RELM proteins in skin innate immunity. Our findings provide insight into how vitamin A promotes resistance to skin infection.
emergence and spread of basal lineages of yersinia pestis during the neolithic decline
nicolás rascovan et al. 2018
doi.org/10.1016/j.cell.2018.11.005
To better understand the evolutionary history of the plague, Rasmussen and his colleagues trawled through publicly available genetic data from ancient humans, screening for sequences similar to more modern plague strains. They found a strain they had never seen before in the genetic material of a 20-year-old woman who died approximately 5,000 years ago in Sweden. The strain had the same genes that make the pneumonic plague deadly today, and traces of it were also found in another individual at the same grave site -- suggesting that the young woman likely did die of the disease.
This strain of the plague is the oldest that's ever been discovered. But what makes it particularly interesting is that, by comparing it to other strains, the researchers were able to determine that it's also the most basal -- meaning that it's the closest strain we have to the genetic origin of Y. pestis. It likely diverged from other strains around 5,700 years ago, while the plague that was common in the Bronze Age and the plague that is the ancestor of the strains in existence today diverged 5,300 and 5,100 years ago, respectively. This suggests that there were multiple strains of plague in existence at the end of the Neolithic period.
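The divergence dates above come from molecular-clock analysis. A minimal sketch of the underlying arithmetic -- divergence time is genetic distance divided by twice the substitution rate, since both lineages accumulate mutations after the split -- with illustrative numbers, not the paper's actual data:

```python
# Minimal molecular-clock sketch. The distance and rate values below
# are illustrative assumptions, not figures from the study.

def divergence_time(genetic_distance, subs_per_site_per_year):
    """Years since two lineages split, assuming a strict clock.

    Each lineage accumulates substitutions independently after the
    split, so the observed pairwise distance reflects 2 * rate * time.
    """
    return genetic_distance / (2 * subs_per_site_per_year)

# Hypothetical inputs: 1.14e-5 substitutions/site between two strains,
# at a clock rate of 1e-9 substitutions/site/year.
t = divergence_time(1.14e-5, 1e-9)
print(round(t))  # -> 5700 (years)
```

Real analyses (as in the paper) use Bayesian methods that relax the constant-rate assumption and report credible intervals rather than point estimates.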
Rasmussen also believes that this finding offers a new theory about how plague spreads. Massive human migrations from the Eurasian steppe down into Europe are known to have occurred around 5,000 years ago, but how these cultures were able to displace the Neolithic farming culture that was present in Europe at the time is still debated. Previous researchers have suggested that the invaders brought the plague with them, wiping out the large settlements of Stone Age farmers when they arrived.
But if the strain of plague the researchers found in the Swedish woman diverged from the rest of Y. pestis 5,700 years ago, that means it likely evolved before these migrations began and around the time that the Neolithic European settlements were already starting to collapse.
At the time, mega-settlements of 10,000-20,000 inhabitants were becoming common in Europe, which made job specialization, new technology, and trade possible. But they also may have been the breeding ground for plague. "These mega-settlements were the largest settlements in Europe at that time, ten times bigger than anything else. They had people, animals, and stored food close together, and, likely, very poor sanitation. That's the textbook example of what you need to evolve new pathogens," says Rasmussen.
"We think our data fit. If plague evolved in the mega-settlements, then when people started dying from it, the settlements would have been abandoned and destroyed. This is exactly what was observed in these settlements after 5,500 years ago. Plague would also have started migrating along all the trade routes made possible by wheeled transport, which had rapidly expanded throughout Europe in this period," he says.
Eventually, he suggests, the plague would have arrived through these trade interactions at the small settlement in Sweden where the woman his team studied lived. Rasmussen argues that the woman's own DNA also provides further evidence for this theory -- she isn't genetically related to the people who invaded Europe from the Eurasian steppe, supporting the idea that this strain of plague arrived before the mass migrations did. The archaeology also supports this hypothesis, as there were still no signs of the invaders by the time she died.
Of course, there are some limitations to what the data from this study can tell us. Most importantly, the researchers have not yet identified the plague in individuals from the mega-settlements where it may have evolved. "We haven't really found the smoking gun, but it's partly because we haven't looked yet. And we'd really like to do that, because if we could find plague in those settlements, that would be strong support for this theory," says Rasmussen.
Regardless, he believes that this study is a step toward understanding how plague -- and other pathogens -- became deadly. "We often think that these superpathogens have always been around, but that's not the case," he says. "Plague evolved from an organism that was relatively harmless. More recently, the same thing happened with smallpox, malaria, Ebola, and Zika. This process is very dynamic -- and it keeps happening. I think it's really interesting to try to understand how we go from something harmless to something extremely virulent."
abstract •Discovery of the most ancient case of plague in humans, 4,900 years ago in Sweden
•Basal lineages of Y. pestis emerged and spread during the Neolithic decline
•Plague infections in distinct Eurasian populations during Neolithic and Bronze Age
•A plague pandemic likely emerged in large settlements and spread over trade routes
Between 5,000 and 6,000 years ago, many Neolithic societies declined throughout western Eurasia due to a combination of factors that are still largely debated. Here, we report the discovery and genome reconstruction of Yersinia pestis, the etiological agent of plague, in Neolithic farmers in Sweden, pre-dating and basal to all modern and ancient known strains of this pathogen. We investigated the history of this strain by combining phylogenetic and molecular clock analyses of the bacterial genome, detailed archaeological information, and genomic analyses from infected individuals and hundreds of ancient human samples across Eurasia. These analyses revealed that multiple and independent lineages of Y. pestis branched and expanded across Eurasia during the Neolithic decline, spreading most likely through early trade networks rather than massive human migrations. Our results are consistent with the existence of a prehistoric plague pandemic that likely contributed to the decay of Neolithic populations in Europe.
disease tolerance
mitochondrial cyclophilin d regulates t cell metabolic responses and disease tolerance to tuberculosis
fanny tzelepis et al. 2018
doi.org/10.1126/sciimmunol.aar4135
footwear matters: influence of footwear and foot strike on load rates during running
rice hm, et al. 2016
doi.org/10.1249/MSS.0000000000001030
influence of maximal running shoes on biomechanics before and after a 5k run
christine d. pollard et al. 2018
doi.org/10.1177/2325967118775720
differences in perceived causes of childhood obesity between migrant and local communities in china: a qualitative study
bai li et al. 2017
doi.org/10.1371/journal.pone.0177505
indoles from commensal bacteria extend healthspan
robert sonowal et al. 2017
doi.org/10.1073/pnas.1706464114
Increases in human life expectancy over the next century will be accompanied by increased frailty and massive and unsustainable health care costs. Developing means to extend the time that individuals remain healthy and free of age-related infirmities, called healthspan, has therefore become a critical goal of aging research. We show that small molecules produced by the microbiota and related to indole extend healthspan in geriatric worms, flies, and mice, without attendant effects on lifespan. Indoles act via the aryl hydrocarbon receptor and cause animals to retain a youthful gene expression profile. Indoles may represent a new class of therapeutics that improve the way we age as opposed to simply extending how long we live.
Multiple studies have identified conserved genetic pathways and small molecules associated with extension of lifespan in diverse organisms. However, extending lifespan does not result in concomitant extension in healthspan, defined as the proportion of time that an animal remains healthy and free of age-related infirmities. Rather, mutations that extend lifespan often reduce healthspan and increase frailty. The question arises as to whether factors or mechanisms exist that uncouple these processes and extend healthspan and reduce frailty independent of lifespan. We show that indoles from commensal microbiota extend healthspan of diverse organisms, including Caenorhabditis elegans, Drosophila melanogaster, and mice, but have a negligible effect on maximal lifespan. Effects of indoles on healthspan in worms and flies depend upon the aryl hydrocarbon receptor (AHR), a conserved detector of xenobiotic small molecules. In C. elegans, indole induces a gene expression profile in aged animals reminiscent of that seen in the young, but which is distinct from that associated with normal aging. Moreover, in older animals, indole induces genes associated with oogenesis and, accordingly, extends fecundity and reproductive span. Together, these data suggest that small molecules related to indole and derived from commensal microbiota act in diverse phyla via conserved molecular pathways to promote healthy aging. These data raise the possibility of developing therapeutics based on microbiota-derived indole or its derivatives to extend healthspan and reduce frailty in humans.
enhanced protein translation underlies improved metabolic and physical adaptations to different exercise training modes in young and old humans
robinson et al. 2017
doi.org/10.1016/j.cmet.2017.02.009
•High-intensity interval training improved age-related decline in muscle mitochondria
•Training adaptations occurred with increased gene transcripts and ribosome proteins
•Changes to RNA with training had little overlap with corresponding protein abundance
•Enhanced ribosomal abundance and protein synthesis explain gains in mitochondria
Summary
The molecular transducers of benefits from different exercise modalities remain incompletely defined. Here we report that 12 weeks of high-intensity aerobic interval (HIIT), resistance (RT), and combined exercise training enhanced insulin sensitivity and lean mass, but only HIIT and combined training improved aerobic capacity and skeletal muscle mitochondrial respiration. HIIT revealed a more robust increase in gene transcripts than other exercise modalities, particularly in older adults, although little overlap with corresponding individual protein abundance was noted. HIIT reversed many age-related differences in the proteome, particularly of mitochondrial proteins in concert with increased mitochondrial protein synthesis. Both RT and HIIT enhanced proteins involved in translational machinery irrespective of age. Only small changes of methylation of DNA promoter regions were observed. We provide evidence for predominant exercise regulation at the translational level, enhancing translational capacity and proteome abundance to explain phenotypic gains in muscle mitochondrial function and hypertrophy in all ages.
orthostatic hypotension and the long-term risk of dementia: a population-based study
frank j. wolters et al. 2016
doi.org/10.1371/journal.pmed.1002143
effect of restricted salt intake on nocturia
tomohiro, m., nakamura, y., yasuda, t., ohba, k., miyata, y., sakai, h 2017
uroweb.org/wp-content/uploads/Night-time-urination-reduced-by-cutting-salt-in-diet.pdf
smell loss predicts mortality risk regardless of dementia conversion
jonas olofsson et al. 2017
doi.org/10.1111/jgs.14770
astrocytes regulate daily rhythms in the suprachiasmatic nucleus and behavior
chak foon tso et al. 2017
doi.org/10.1016/j.cub.2017.02.037
characterization of a human-specific tandem repeat associated with bipolar disorder and schizophrenia
janet h.t. song et al. 2018
doi.org/10.1016/j.ajhg.2018.07.011
fasting
effects of intermittent fasting on health, aging, and disease
rafael de cabo, mark p. mattson 2019
doi.org/10.1056/nejmra1905136
Intermittent fasting diets, he says, fall generally into two categories: daily time-restricted feeding, which narrows eating times to 6-8 hours per day, and so-called 5:2 intermittent fasting, in which people limit themselves to one moderate-sized meal two days each week.
An array of animal and some human studies have shown that alternating between times of fasting and eating supports cellular health, probably by triggering an age-old adaptation to periods of food scarcity called metabolic switching. Such a switch occurs when cells use up their stores of rapidly accessible, sugar-based fuel, and begin converting fat into energy in a slower metabolic process.
Mattson says studies have shown that this switch improves blood sugar regulation, increases resistance to stress and suppresses inflammation. Because most Americans eat three meals plus snacks each day, they do not experience the switch, or the suggested benefits.
In the article, Mattson notes that four studies in both animals and people found intermittent fasting also decreased blood pressure, blood lipid levels and resting heart rates.
Evidence is also mounting that intermittent fasting can modify risk factors associated with obesity and diabetes, says Mattson. Two studies of 100 overweight women at the University Hospital of South Manchester NHS Foundation Trust showed that those on the 5:2 intermittent fasting diet lost the same amount of weight as women who restricted calories, but showed better insulin sensitivity and greater reductions in belly fat than those in the calorie-reduction group.
More recently, Mattson says, preliminary studies suggest that intermittent fasting could benefit brain health too. A multicenter clinical trial at the University of Toronto in April found that 220 healthy, nonobese adults who maintained a calorie restricted diet for two years showed signs of improved memory in a battery of cognitive tests. While far more research needs to be done to prove any effects of intermittent fasting on learning and memory, Mattson says if that proof is found, the fasting — or a pharmaceutical equivalent that mimics it — may offer interventions that can stave off neurodegeneration and dementia.
“We are at a transition point where we could soon consider adding information about intermittent fasting to medical school curricula alongside standard advice about healthy diets and exercise,” he says.
Mattson acknowledges that researchers do "not fully understand the specific mechanisms of metabolic switching" and that "some people are unable or unwilling to adhere" to the fasting regimens. But he argues that with guidance and some patience, most people can incorporate them into their lives. It takes some time for the body to adjust to intermittent fasting, and to get beyond the initial hunger pangs and irritability that accompany it. "Patients should be advised that feeling hungry and irritable is common initially and usually passes after two weeks to a month as the body and brain become accustomed to the new habit," Mattson says.
To manage this hurdle, Mattson suggests that physicians advise patients to gradually increase the duration and frequency of the fasting periods over the course of several months, instead of “going cold turkey.”
abstract Evidence is accumulating that eating in a 6-hour period and fasting for 18 hours can trigger a metabolic switch from glucose-based to ketone-based energy, with increased stress resistance, increased longevity, and a decreased incidence of diseases, including cancer and obesity.
dietary restriction extends lifespan through metabolic regulation of innate immunity
ziyun wu et al. 2019
doi.org/10.1016/j.cmet.2019.02.013
a mechanism of lifespan extension that links caloric restriction with immune system regulation
"Modulating immune activity is an important aspect of dietary restriction," says Keith Blackwell, MD, PhD, Associate Research Director and Senior Investigator at Joslin, and Professor of Genetics at Harvard Medical School, senior author on the paper. "And it is important for longevity regulation and, in this context, increasing lifespan."
In this study, Dr. Blackwell and his team found that caloric restriction reduces levels of innate immunity by decreasing the activity of a regulatory protein called p38, triggering a chain reaction effect ending in a reduced immune response.
Innate immunity is like the security guard of the body, keeping an eye out for any unwelcome bacteria or viruses. If the innate immune system spots something, it activates an acute immune response. We need some degree of both kinds of immunity to stay healthy, but an overactive innate immune system -- which occurs more often as we age -- means constant low-grade inflammation, which can lead to myriad health issues.
"[Before this study,] people looked at what happens to immunity with aging in humans, but no one had ever looked in any organism at whether modulating immunity or its activities is involved in lifespan extension or can be beneficial as part of an anti-aging program," says Dr. Blackwell.
The research was conducted in the microscopic nematode worm C. elegans. The most fundamental genes and regulatory mechanisms found in these worms are typically simpler versions of those present in humans, making them a good model for studying human aging, genetics, and disease.
Dr. Blackwell and his team analyzed the levels of proteins and actions of genetic pathways during periods of caloric restriction. They were able to zero in on a particular genetic pathway that was regulated by the p38 protein. They saw that when p38 was totally inactive, caloric restriction failed and had no impact on innate immunity. When it was active, but at lower levels than normal, it triggered the genetic pathways that turned down the innate immune response to an optimal level.
"That was the most surprising thing we found. The pathway was down regulated even though it was critical," says Dr. Blackwell.
That this immune-regulating response was activated by nutrients, rather than bacteria, was also surprising. This adds to a growing body of evidence tying metabolism to the immune system.
"This is really an emerging field in mammals now, so called immunometabolism -- the idea that there are ancient links between metabolism and immunity," says Dr. Blackwell. "We were able to show in this really very primitive immune system that it is regulated metabolically, and affects lifespan and health independently of an anti-pathogen function. That is when I started calling this a primitive immunometabolic pathway or an immunometabolic regulation."
After making this discovery, Dr. Blackwell was curious to know if the well-known longevity mechanism of reduced IGF1 signaling also acted on the immune system. For over 20 years, study after study in many different organisms has confirmed that lower levels of IGF1 signaling contribute to a longer lifespan. This is thought to be due to the activation of protective factors by a protein called FOXO (called DAF-16 in C. elegans).
In this new study, Dr. Blackwell and his team discovered that when IGF signaling was reduced in the worms, the chain reaction set off by the FOXO-like DAF-16 not only boosted protective mechanisms, but also led to a reduction of the worms' appetites. This naturally put the subjects in a state of caloric restriction.
"This links the growth mechanism [of IGF1 signaling] to food consumption and food seeking behavior in a big way," says Dr. Blackwell.
A reduction in the activity of the FOXO-like gene seems to tell the worms that they are in a fasting-like state, and that nutrients may be scarce. This directs the worms to conserve energy, leading to a reduction in food intake. This self-imposed caloric restriction then leads to the lowering of the innate immune response. Dr. Blackwell plans to further study how the FOXO protein acts to suppress appetite, and to understand whether that might eventually lead to drug development.
The genes responsible for the phenomena observed in this study are conserved in humans. This opens up the possibility of human medical applications, from optimizing the immune system to drug development for appetite control.
"The ultimate goal is to be able to manipulate healthy lifespan in a person," says Dr. Blackwell. "Not to make people live to 120, 130, but to extend the period of healthy life. And chronic inflammation is a major factor in human aging. The hope is that some of the specific mechanisms could translate to optimizing immune function in humans during aging to enhance health in human lifespan."
abstract
•Dietary restriction longevity requires modulation of nutrient-regulated immunity
•Nutrients activate the p38–ATF-7 immunometabolic pathway independently of mTORC1
•Insulin/IGF-1 signaling affects immunity and aging in part by curtailing food intake
•DAF-16/FOXO lowers food consumption, linking feeding and immunity to growth signals
Chronic inflammation predisposes to aging-associated disease, but it is unknown whether immunity regulation might be important for extending healthy lifespan. Here we show that in C. elegans, dietary restriction (DR) extends lifespan by modulating a conserved innate immunity pathway that is regulated by p38 signaling and the transcription factor ATF-7. Longevity from DR depends upon p38–ATF-7 immunity being intact but downregulated to a basal level. p38–ATF-7 immunity accelerates aging when hyperactive, influences lifespan independently of pathogen exposure, and is activated by nutrients independently of mTORC1, a major DR mediator. Longevity from reduced insulin/IGF-1 signaling (rIIS) also involves p38–ATF-7 downregulation, with signals from DAF-16/FOXO reducing food intake. We conclude that p38–ATF-7 is an immunometabolic pathway that senses bacterial and nutrient signals, that immunity modulation is critical for DR, and that DAF-16/FOXO couples appetite to growth regulation. These conserved mechanisms may influence aging in more complex organisms.
opposing effects of fasting metabolism on tissue tolerance in bacterial and viral inflammation
wang et al. 2016
doi.org/10.1016/j.cell.2016.07.026
caloric restriction reprograms the single-cell transcriptional landscape of rattus norvegicus aging
shuai ma et al. 2020
doi.org/10.1016/j.cell.2020.02.008
“We already knew that calorie restriction increases life span, but now we’ve shown all the changes that occur at a single-cell level to cause that,” says Juan Carlos Izpisua Belmonte, a senior author of the new paper, professor in Salk’s Gene Expression Laboratory and holder of the Roger Guillemin Chair. “This gives us targets that we may eventually be able to act on with drugs to treat aging in humans.”
Aging is the highest risk factor for many human diseases, including cancer, dementia, diabetes and metabolic syndrome. Caloric restriction has been shown in animal models to be one of the most effective interventions against these age-related diseases. And although researchers know that individual cells undergo many changes as an organism ages, they have not known how caloric restriction might influence these changes.
In the new paper, Belmonte and his collaborators — including three alumni of his Salk lab who are now professors running their own research programs in China — compared rats who ate 30 percent fewer calories with rats on normal diets. The animals’ diets were controlled from age 18 months through 27 months. (In humans, this would be roughly equivalent to someone following a calorie-restricted diet from age 50 through 70.)
At both the start and the conclusion of the diet, Belmonte's team isolated and analyzed a total of 168,703 cells from 40 cell types in the 56 rats. The cells came from fat tissues, liver, kidney, aorta, skin, bone marrow, brain and muscle. In each isolated cell, the researchers used single-cell genetic-sequencing technology to measure the activity levels of genes. They also looked at the overall composition of cell types within any given tissue. Then, they compared old and young rats on each diet.
Many of the changes that occurred as rats on the normal diet grew older didn’t occur in rats on a restricted diet; even in old age, many of the tissues and cells of animals on the diet closely resembled those of young rats. Overall, 57 percent of the age-related changes in cell composition seen in the tissues of rats on a normal diet were not present in the rats on the calorie restricted diet.
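The 57 percent figure is, at heart, a set comparison: of all the cell-composition changes that appear with age in the ad libitum group, what fraction fails to appear in the calorie-restricted group. A toy sketch of that bookkeeping, with invented change labels rather than the study's data:

```python
# Toy sketch of "fraction of age-related changes rescued by CR".
# Change labels are invented for illustration, not from the paper.

aging_changes_ad_lib = {"Tcell_up", "Bcell_up", "neutrophil_up",
                        "macrophage_up", "adipocyte_down",
                        "fibroblast_up", "endothelial_down"}
aging_changes_cr = {"adipocyte_down", "fibroblast_up", "endothelial_down"}

# Changes seen in normally fed old rats but absent under CR.
rescued = aging_changes_ad_lib - aging_changes_cr
fraction_rescued = len(rescued) / len(aging_changes_ad_lib)
print(f"{fraction_rescued:.0%}")  # -> 57%
```

The actual analysis works on statistically tested composition shifts per tissue, but the headline percentage is this kind of set difference.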
“This approach not only told us the effect of calorie restriction on these cell types, but also provided the most complete and detailed study of what happens at a single-cell level during aging,” says co-corresponding author Guang-Hui Liu, a professor at the Chinese Academy of Sciences.
Some of the cells and genes most affected by the diet related to immunity, inflammation and lipid metabolism. The number of immune cells in nearly every tissue studied dramatically increased as control rats aged but was not affected by age in rats with restricted calories. In brown adipose tissue — one type of fat tissue — a calorie-restricted diet reverted the expression levels of many anti-inflammatory genes to those seen in young animals.
“The primary discovery in the current study is that the increase in the inflammatory response during aging could be systematically repressed by caloric restriction,” says co-corresponding author Jing Qu, also a professor at the Chinese Academy of Sciences.
When the researchers homed in on transcription factors — essentially master switches that can broadly alter the activity of many other genes — that were altered by caloric restriction, one stood out. Levels of the transcription factor Ybx1 were altered by the diet in 23 different cell types. The scientists believe Ybx1 may be an age-related transcription factor and are planning more research into its effects.
“People say that ‘you are what you eat,’ and we’re finding that to be true in lots of ways,” says Concepcion Rodriguez Esteban, another of the paper’s authors and a staff researcher at Salk. “The state of your cells as you age clearly depends on your interactions with your environment, which includes what and how much you eat.”
abstract •A multitissue single-cell transcriptomic atlas for aging and CR in a mammal
•CR alleviates aging-related accumulation of pro-inflammatory cells in various tissues
•CR attenuates aging-associated cell-type-specific gene expression changes
Aging causes a functional decline in tissues throughout the body that may be delayed by caloric restriction (CR). However, the cellular profiles and signatures of aging, as well as those ameliorated by CR, remain unclear. Here, we built comprehensive single-cell and single-nucleus transcriptomic atlases across various rat tissues undergoing aging and CR. CR attenuated aging-related changes in cell type composition, gene expression, and core transcriptional regulatory networks. Immune cells were increased during aging, and CR favorably reversed the aging-disturbed immune ecosystem. Computational prediction revealed that the abnormal cell-cell communication patterns observed during aging, including the excessive proinflammatory ligand-receptor interplay, were reversed by CR. Our work provides multi-tissue single-cell transcriptional landscapes associated with aging and CR in a mammal, enhances our understanding of the robustness of CR as a geroprotective intervention, and uncovers how metabolic intervention can act upon the immune system to modify the process of aging.
vector competence of aedes aegypti, culex tarsalis, and culex quinquefasciatus from california for zika virus
bradley j. main et al. 2018
doi.org/10.1371/journal.pntd.0006524
zika–capable mosquitoes in washington dc
lima et al 2016
doi.org/10.4269/ajtmh.15-0351
Am J Trop Med Hyg 2016, vol. 94, no. 1, 231-235; published online November 2, 2015
assuming that a disease is limited by the “known range” of its known vectors is a mistake
travel.trade.gov visits to usa numbers
Link: travel.trade.gov/outreachpages/inbound.general_information.inbound_overview.html
travel.trade.gov visits from usa numbers
Link: travel.trade.gov/view/m-2014-O-001/index.html
refugees and asylees accepted by usa numbers
Link: migrationpolicy.org/article/refugees-and-asylees-united-states/
bacterial colonization and succession in a newly opened hospital
simon lax et al. 2017
doi.org/10.1126/scitranslmed.aah6500
ecological analyses of mycobacteria in showerhead biofilms and their relevance to human health
matthew j. gebert et al. 2018
doi.org/10.1128/mBio.01614-18
biosurfactants change the thinning of contaminated bubbles at bacteria-laden water interfaces
s. poulain, l. bourouiba 2018
doi.org/10.1103/PhysRevLett.121.204502
deposition of respiratory virus pathogens on frequently touched surfaces at airports
niina ikonen et al. 2018
doi.org/10.1186/s12879-018-3150-5
from ward to washer: the survival of clostridium difficile spores on hospital bed sheets through a commercial uk nhs healthcare laundry process
joanna tarrant et al. 2018
doi.org/10.1017/ice.2018.255
sink traps as the source of transmission of oxa-48–producing serratia marcescens in an intensive care unit
gili regev-yochay et al. 2018
doi.org/10.1017/ice.2018.235
the calendar of epidemics: seasonal cycles of infectious diseases
micaela elvira martinez 2018
doi.org/10.1371/journal.ppat.1007327
analysing the link between public transport use and airborne transmission: mobility and contagion in the london underground
lara goscé, anders johansson 2018
doi.org/10.1186/s12940-018-0427-5
By comparing Oyster card route information and Public Health England data on flu-like illnesses, Dr Lara Goscé from the University of Bristol's Department of Civil Engineering and Dr Anders Johansson from Bristol's Department of Engineering Mathematics discovered higher rates of airborne infections among Londoners who have longer tube journeys through busier terminals.
Dr Goscé explained: "Higher rates [of influenza-like cases] can be observed in boroughs served by a small number of underground lines: passengers starting their journey in these boroughs usually have to change lines once or more in crowded junctions such as King's Cross in order to reach their final destination.
"On the other hand, lower influenza-like rates are found in boroughs where either the population do not use public transport as the main form of transport to commute to work; or boroughs served by more underground lines, which guarantee faster trips with less stops and contacts with fewer people."
For instance, one finding highlighted that infection rates among residents of Islington, who often change lines at crowded King's Cross St. Pancras, were nearly three times higher than among commuters from Kensington, who mostly take direct trains.
The team hopes that their findings will inform Government epidemic policies. Dr Goscé said: "Policy makers, in particular, should address the role potentially played by public transport and crowded events and avoid encouraging the attendance of such environments during epidemics."
Looking to the future, the group want to draw a clearer map of the spread of cold-like infections in a metropolitan environment, and so plan to combine individual level infection data with existing studies from households and schools.
Dr Goscé said: "These results are preliminary, owing to limitations of the dataset. Empirical studies combining aero-biology and pedestrian modelling would be important in improving model fidelity and in devising non-pharmaceutical control strategies that tackle threshold densities to minimise the number of infections and ensure optimal ventilation in different crowded environments."
abstract Background
The transmission of infectious diseases depends on the amount and nature of contacts between infectious and healthy individuals. Confined and crowded environments that people visit in their day-to-day life (such as town squares, business districts, and transport hubs) can act as hot-spots for spreading disease. In this study we explore the link between the use of public transport and the spread of airborne infections in urban environments.
Methods
We study a large number of journeys on the London Underground, which is known to be particularly crowded at certain times. We use publicly available Oyster card data (the electronic ticket used for public transport in Greater London) to infer passengers’ routes on the underground network. In order to estimate the spread of a generic airborne disease in each station, we use and extend an analytical microscopic model that was initially designed to study people moving in a corridor.
Results
Comparing our results with influenza-like illness (ILI) data collected by Public Health England (PHE) in London boroughs shows a correlation between the use of public transport and the spread of ILI. Specifically, we show that passengers departing from boroughs with higher ILI rates have a higher number of contacts when travelling on the underground. Moreover, by comparing our results with other key demographic factors, we are able to discuss the role that the Underground plays in the spread of airborne infections in the English capital.
Conclusions
Our study suggests a link between public transport use and infectious diseases transmission and encourages further research into that area. Results could be used to inform the development of non-pharmacological interventions that can act on preventing instead of curing infections and are, potentially, more cost-effective.
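The paper's corridor-based microscopic model is not reproduced in this excerpt, but its qualitative logic — more time spent in denser crowds means more contacts, and more contacts mean a higher chance of infection — can be sketched. Everything below (function names, parameter values, the independent-per-contact transmission assumption) is an illustrative simplification, not the authors' model.

```python
# Toy sketch of crowding-dependent infection risk on a transit journey.
# Assumption: contacts scale with crowd density, a mixing rate, and exposure
# time; each contact independently transmits with a small fixed probability.

def expected_contacts(density_per_m2: float, mixing_rate: float, minutes: float) -> float:
    """Expected contacts ~ density x mixing rate x exposure time (all hypothetical)."""
    return density_per_m2 * mixing_rate * minutes

def infection_probability(contacts: float, p_per_contact: float) -> float:
    """Independent per-contact transmission: P = 1 - (1 - p)^contacts."""
    return 1.0 - (1.0 - p_per_contact) ** contacts

# A long journey through a crowded interchange vs. a short direct trip.
crowded = infection_probability(expected_contacts(3.0, 0.5, 40), 0.005)
direct = infection_probability(expected_contacts(1.0, 0.5, 20), 0.005)
print(crowded > direct)  # the longer, more crowded journey carries higher risk
```

This mirrors the study's observation that boroughs whose commuters pass through crowded junctions show higher ILI rates than boroughs with fast direct connections.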
prion seeding activity and infectivity in skin samples from patients with sporadic creutzfeldt-jakob disease
christina d. orrú et al. 2017
doi.org/10.1126/scitranslmed.aam7785
are ball pits located in physical therapy clinical settings a source of pathogenic microorganisms?
mary ellen oesterle et al. 2019
doi.org/10.1016/j.ajic.2018.09.031
The popularity of ball pits has increased since mainstream commercial restaurants installed them nationwide for children in the 1980s, and they are often found to be contaminated with visible dirt, vomit, feces, or urine, providing a permissive environment for microbial growth. Similar ball pits are commonly used in pediatric physical therapy to provide stimulation to children with sensory or motor impairments. According to the study, clinics may go days or even weeks between cleanings, allowing time for microorganisms to accumulate and grow to levels capable of causing infection.
Investigators from the University of North Georgia examined six ball pits located in inpatient physical therapy clinics or outpatient clinics in the state of Georgia. Nine to 15 balls were randomly selected from different depths of each sampled ball pit.
The study found considerable microbial colonization in ball pits that were tested, including eight bacteria and one yeast that could cause disease. Bacterial colonization was found to be as high as thousands of cells per ball, clearly demonstrating an increased potential for transmission of these organisms to patients and an increased possibility of infection.
"We found considerable variation in the number of microorganisms between the different ball pit samples," said the study's lead researcher Mary Ellen Oesterle, EdD, PT, Department of Physical Therapy, University of North Georgia, Dahlonega, GA, USA. "This suggests that clinics utilize different protocols for cleaning and maintenance, potentially representing a broader need to clarify and establish standards that reduce the risk of transmission."
Overall, researchers identified 31 bacterial species and one species of yeast. The human-associated bacteria found in the ball pits included:
Enterococcus faecalis, which can cause endocarditis, septicemia, urinary tract infection, and meningitis;
Staphylococcus hominis, a cause of bloodstream infections and reported as a cause of sepsis in a neonatal intensive care unit;
Streptococcus oralis, known to cause endocarditis, adult respiratory distress syndrome, and streptococcal shock;
Acinetobacter lwoffii, which has been reported to cause septicemia, pneumonia, meningitis, and urinary tract and skin infections.
"This research shows that ball pits may pose an infection hazard," said 2019 APIC President Karen Hoffmann, RN, MS, CIC, FSHEA, FAPIC. "Facilities should establish a program for regular cleaning to protect patients and healthcare workers from potential infection risks."
abstract Clinical, therapeutic ball pits commonly used by physical therapists to provide sensory stimulation to children were investigated for microbial colonization. Due to the permissive and hospitable environment provided by these ball pits, microorganisms can accumulate to levels that increase the ease of transmission to exposed individuals. Our study found considerable microbial colonization in ball pits located in clinical settings, including 8 opportunistic pathogenic bacteria and 1 opportunistic pathogenic yeast.
The popularity of ball pits has increased in the general population since commercial restaurant chains installed them nationwide in the 1980s. Ball pits are often contaminated with visible dirt, vomit, feces, or urine, providing an origin and permissive environmental factors for microbial contamination. Numerous types of bacteria have been identified in ball pits located in community settings, including normal human skin bacteria, as well as opportunistic pathogens such as Staphylococcus aureus and various enteric bacteria. Besides human-associated microorganisms, some ball pits have been found to contain zoonotic-associated organisms that have also been identified as causing serious infections in humans, including Pasteurella multocida.
Ball pits are also commonly used in pediatric physical therapy to help provide stimulation to children with sensory and motor impairments. Currently, identification of national standards, or protocols, for cleaning these enclosures and their contents remains elusive. Accordingly, clinics may go days or even weeks between cleanings, which may allow time for microorganisms to accumulate and grow to levels capable of transmission and infection. The risk increases if an individual has skin lesions or abrasions, which provide a portal of entry, and is greater still for immunocompromised individuals.
To ascertain the level of microbial colonization or presence in therapeutic ball pits at any given time, regardless of cleaning procedures, we conducted a study on 6 clinical ball pits located in community clinics. The goal of this study was to determine if there was a difference in the amount of bacteria found between each clinic and to identify the microorganisms found in the ball pits.
multidrug-resistant organisms in hospitals: what is on patient hands and in their rooms?
lona mody et al. 2019
doi.org/10.1093/cid/ciz092
Fourteen percent of 399 hospital patients tested in the study had "superbug" antibiotic-resistant bacteria on their hands or nostrils very early in their hospital stay, the research finds. And nearly a third of tests for such bacteria on objects that patients commonly touch in their rooms, such as the nurse call button, came back positive.
Another six percent of the patients who didn't have multi-drug resistant organisms, or MDROs, on their hands at the start of their hospitalization tested positive for them on their hands later in their stay. One-fifth of the objects tested in their rooms had similar superbugs on them too.
The research team cautions that the presence of MDROs on patients or objects in their rooms does not necessarily mean that patients will get sick with antibiotic-resistant bacteria. And they note that healthcare workers' hands are still the primary mode of microbe transmission to patients.
"Hand hygiene narrative has largely focused on physicians, nurses and other frontline staff, and all the policies and performance measurements have centered on them, and rightfully so," says Lona Mody, M.D., M.Sc., the University of Michigan geriatrician, epidemiologist and patient safety researcher who led the research team. "But our findings make an argument for addressing transmission of MDROs in a way that involves patients, too."
Studying the spread
Mody and her colleagues report in the new paper in Clinical Infectious Diseases that of the six patients in their study who developed an infection with a superbug called MRSA while in the hospital, all had positive tests for MRSA on their hands and hospital room surfaces.
In addition to MRSA, short for methicillin-resistant Staphylococcus aureus, the study looked for superbugs called VRE (vancomycin-resistant enterococcus) and a group called RGNB, for resistant Gram-negative bacteria. Because of overuse of antibiotics, these bacteria have evolved the ability to withstand attempts to treat infections with drugs that once killed them.
Mody notes that many of the MDROs seen on patients were also found in their rooms early in their stay, suggesting that transmission to room surfaces is rapid. She heads the Infection Prevention in Aging research group at the U-M Medical School and VA Ann Arbor Healthcare System.
Additionally, since many patients arrive at the hospital through the emergency room, and may get tests in other areas before reaching their hospital room, it will be important to study the ecology of MDROs in those areas too, she says.
"This study highlights the importance of handwashing and environmental cleaning, especially within a healthcare setting where patients' immune systems are compromised," says infectious disease physician Katherine Reyes, M.D., lead author for Henry Ford Health System researchers involved in the study. "This step is crucial not only for healthcare providers, but also for patients and their families. Germs are on our hands; you do not need to see to believe it. And they travel. When these germs are not washed off, they pass easily from person to person and objects to person and make people sick."
More about the study
The team made more than 700 visits to the rooms of general medicine inpatients at two hospitals, working to enroll them in the study and take samples from their bodies and often-touched surfaces as early as possible in their stay. They were not able to test rooms before the patients arrived, and did not test patients who had had surgery, or were in intensive care or other types of units.
Using genetic fingerprinting techniques, they looked to see if the strains of MRSA bacteria on the patients' hands were the same as the ones in their rooms. They found the two matched in nearly all cases -- suggesting that transfer to and from the patient was happening. The technique is not able to distinguish the direction of transfer, whether it's from patient to objects in the room, or from those objects to patients.
Cleaning procedures for hospital rooms between patients, especially when a patient has been diagnosed with an MDRO infection, have improved over the years, says Mody, and research has shown them to be effective when used consistently. So lingering contamination from past patients may not have been a major factor.
But the question of exactly where patients picked up the MDROs that were found on their bodies, and were transmitted to the surfaces in their rooms, is not addressed by the current study and would be an important next step based on these results.
Why MDROs matter
Also important, says Mody, is the fact that hospital patients don't just stay in their rooms -- current practice encourages them to get up and walk in the halls as part of their recovery from many illnesses, and they may be transported to other areas of the hospital for tests and procedures.
As they travel, they may pick up MDROs from other patients and staff, and leave them on the surfaces they touch.
So even if a relatively healthy person has an MDRO on their skin, and their immune system can fight it off if it gets into their body, a more vulnerable person in the same hospital can catch it and get sick. The researchers are exploring studying MDROs on patients in other types of hospital units who may be more susceptible to infections.
Patients and staff may also get colonized with MDROs in outpatient care settings that have become the site of so much of American health care, including urgent care centers, freestanding imaging and surgery centers, and others.
Mody and colleagues are presenting new data about MDROs in skilled nursing facilities at an infectious disease conference in Europe in coming days. They showed that privacy curtains -- often used to separate patients staying in the same room, or to shield patients from view when dressing or being examined -- are also often colonized with superbugs.
"Infection prevention is everybody's business," says Mody, a professor of internal medicine at the U-M Medical School. "We are all in this together. No matter where you are, in a healthcare environment or not, this study is a good reminder to clean your hands often, using good techniques -- especially before and after preparing food, before eating food, after using a toilet, and before and after caring for someone who is sick -- to protect yourself and others."
abstract The impact of healthcare personnel hand contamination in multidrug-resistant organism (MDRO) transmission is important and well studied; however, the role of patient hand contamination needs to be characterized further.
Methods
Patients from 2 hospitals in southeast Michigan were recruited within 24 hours of arrival to their room and followed prospectively using microbial surveillance of nares, dominant hand, and 6 high-touch environmental surfaces. Sampling was performed on admission, days 3 and 7, and weekly until discharge. Paired samples of methicillin-resistant Staphylococcus aureus (MRSA) isolated from the patients’ hand and room surfaces were evaluated for relatedness using pulsed-field gel electrophoresis and staphylococcal cassette chromosome mec, and Panton-Valentine leukocidin typing.
Results
A total of 399 patients (mean age, 60.8 years; 49% male) were enrolled and followed for 710 visits. Fourteen percent (n = 56/399) of patients were colonized with an MDRO at baseline; 10% (40/399) had an MDRO on their hands. Twenty-nine percent of rooms harbored an MDRO. Six percent (14/225 patients with at least 2 visits) newly acquired an MDRO on their hands during their stay. New MDRO acquisition in patients occurred at a rate of 24.6/1000 patient-days, and in rooms at a rate of 58.6/1000 patient-days. Typing demonstrated a high correlation between MRSA on patient hands and room surfaces.
Conclusions
Our data suggest that patient hand contamination with MDROs is common and correlates with contamination on high-touch room surfaces. Patient hand hygiene protocols should be considered to reduce transmission of pathogens and healthcare-associated infections.
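The abstract's headline figures can be sanity-checked with simple arithmetic: the colonization percentages are straightforward fractions of the enrolled cohort, and an acquisition rate per 1,000 patient-days is events divided by total patient-days of follow-up, scaled by 1,000. The patient-days denominator is not given in this excerpt; the ~570 figure below is a hypothetical value chosen only to show that it would reproduce the reported rate.

```python
# Hedged arithmetic check of the figures reported in the abstract.

def rate_per_1000_patient_days(events: int, patient_days: float) -> float:
    """Rate = events / total patient-days of follow-up, scaled to 1,000."""
    return 1000.0 * events / patient_days

# Reported fractions: 56/399 colonized at baseline, 40/399 with an MDRO on
# their hands, 14/225 (patients with at least 2 visits) new hand acquisitions.
print(round(100 * 56 / 399))  # 14 (percent)
print(round(100 * 40 / 399))  # 10 (percent)
print(round(100 * 14 / 225))  # 6 (percent)

# Hypothetical denominator: ~570 patient-days of follow-up would yield the
# reported hand-acquisition rate of 24.6 per 1,000 patient-days.
print(round(rate_per_1000_patient_days(14, 570), 1))  # 24.6
```

The same helper applied to the room figure (58.6/1,000 patient-days) would imply a different follow-up denominator, since rooms and hands were sampled as separate event streams.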
the hidden truth in the faucets: a quality-improvement project and splash study of hospital sinks
kristen vanderelzen et al. 2019
doi.org/10.1016/j.ajic.2019.04.048
a shallow depth of the sink bowl enabled potentially contaminated water to splash onto patient care items, healthcare worker hands, and into patient care spaces -- at times at a distance of more than four feet from the sink itself.
"The inside of faucets where you can't clean were much dirtier than expected," said study author Kristen VanderElzen, MPH, CIC. "Potentially hazardous germs in and around sinks present a quandary for infection preventionists, since having accessible sinks for hand washing is so integral to everything we promote. Acting on the information we found, we have undertaken a comprehensive faucet replacement program across our hospital."
To gauge how dirty the sinks were, the researchers measured cleanliness using adenosine triphosphate (ATP) monitoring. Visible biofilm was associated with higher ATP readings, and cultures taken over the course of the study grew Pseudomonas aeruginosa, mold, and other environmental organisms.
The research team also found aerators on sinks where they had previously been removed, pointing to an overall inconsistency of equipment protocols across the facility. Included in the design improvement program were sink guards, which were shown to limit splash significantly.
bioelectric regulation of innate immune system function in regenerating and intact xenopus laevis
jean-françois paré, christopher j. martyniuk, michael levin 2017
doi.org/10.1038/s41536-017-0019-y
c. elegans daf-16/foxo interacts with tgf-ß/bmp signaling to induce germline tumor formation via mtorc1 activation
wenjing qi et al. 2017
doi.org/10.1371/journal.pgen.1006801
prebiotic reduces body fat and alters intestinal microbiota in children with overweight or obesity
alissa c. nicolucci et al. 2017
doi.org/10.1053/j.gastro.2017.05.055
adaptive capacity: an evolutionary neuroscience model linking exercise, cognition, and brain health
david a. raichlen, gene e. alexander 2017
doi.org/10.1016/j.tins.2017.05.001
classification of common human diseases derived from shared genetic and environmental determinants
kanix wang, hallie gaitsch, hoifung poon, nancy j cox, andrey rzhetsky 2017
doi.org/10.1038/ng.3931
exercise
26s proteasomes are rapidly activated by diverse hormones and physiological states that raise camp and cause rpn6 phosphorylation
jordan j. s. verplank et al. 2019
doi.org/10.1073/pnas.1809254116
intense exercise, fasting and an array of hormones can activate cells' built-in protein disposal system and enhance their ability to purge defective, toxic or unneeded proteins.
The findings, published Feb. 19 in PNAS, reveal a previously unknown mechanism used by the body to rapidly turn on the molecular machinery responsible for junk-protein removal, allowing cells to adapt their protein content to meet new demands. This mechanism, the study shows, is triggered by fluctuations in hormone levels, which signal changes in physiologic conditions.
"Our findings show that the body has a built-in mechanism for cranking up the molecular machinery responsible for waste-protein removal that is so critical for the cells' ability to adapt to new conditions," said Alfred Goldberg, senior author on the study and professor of cell biology in the Blavatnik Institute at Harvard Medical School.
Cellular house cleaning in disease and health
Malfunctions in the cells' protein-disposal machinery can lead to the accumulation of misfolded proteins, which clog up the cell, interfere with its functions and, over time, precipitate the development of diseases, including neurodegenerative conditions such as ALS and Alzheimer's.
The best-studied biochemical system used by cells to remove junk proteins is the ubiquitin-proteasome pathway. It involves the tagging of defective or unneeded proteins with ubiquitin molecules -- a process known as the "kiss of death," which marks proteins for destruction by the cell's protein-disposal unit, known as 26S proteasome.
Past research by Goldberg's lab has shown that this machinery can be activated by pharmacological agents that boost the levels of a molecule known as cAMP, an intracellular messenger, which in turn switches on the enzyme protein kinase A.
The team's previous work has shown that cAMP-stimulating drugs enhance the destruction of defective or toxic proteins, particularly mutant proteins that can lead to neurodegenerative conditions.
The new findings, however, reveal that this quality-control process is continually regulated independent of drugs by shifts in physiological states and corresponding changes in hormones.
Past research, including work from Goldberg's lab, has focused predominantly on reining in overactive protein breakdown -- a state of excessive protein removal that can cause muscle wasting in cancer patients or give rise to several types of muscle atrophy. In fact, a proteasome inhibitor drug that tamps down the activity of the protein-disposal machinery, developed by Goldberg and team, has been widely used for the treatment of multiple myeloma, a common type of blood cancer, marked by abnormal protein accumulation and overworked proteasomes.
The team's latest work, by contrast, is focused on developing therapies that do the exact opposite -- invigorate the cell's protein-disposal machinery when it is too sluggish. These newest findings open the door -- at least conceptually -- to precisely such treatments.
"We believe our findings set the stage for the development of therapies that harness the cells' natural ability to dispose of proteins and thus enhance the removal of toxic proteins that cause disease," said study lead investigator Jordan VerPlank, a postdoctoral research fellow in cell biology in the Blavatnik Institute at Harvard Medical School.
Such treatments, the team said, may not necessarily involve the design of new molecules but instead stimulate the cell's built-in capacity for quality control.
"This is truly a new way of looking at whether we can turn up the cellular vacuum cleaner," Goldberg said. "We thought this would require the development of new types of molecules, but we hadn't truly appreciated that our cells continually activate this process.
"The beauty and the surprise of it is that such new treatments may involve churning a natural endogenous pathway and harnessing the body's preexisting capacity to perform quality control," he added.
That exercise has many salutary effects is already well known, the researchers said, but the new findings also hint at the possibility that exercise and fasting could help reduce the risk of developing conditions associated with the accumulation of misfolded proteins, such as Alzheimer's and Parkinson's. That possibility, however, remains to be explored in subsequent research, the team noted.
In their experiments, the researchers analyzed the effects of exercise on cells obtained from the thigh muscles of four human volunteers before and after vigorous biking. Following exercise, the proteasomes of these cells showed dramatically more molecular marks of enhanced protein degradation, including greater levels of cAMP, the chemical trigger that initiates the cascade that leads to protein degradation inside cells. The same changes were observed in the muscles of anesthetized rats whose hind legs were stimulated to contract repeatedly.
Fasting -- even for brief periods -- produced a similar effect on the cells' protein-breakdown machinery, increasing proteasome activity in the muscle and liver cells of mice deprived of food for 12 hours -- the equivalent of an overnight fast.
In another round of experiments, the researchers exposed the liver cells of mice to glucagon -- the hormone that stimulates the production of glucose as fuel for cells and tissues during periods of food deprivation or whenever blood sugar levels drop. The researchers observed that glucagon exposure stimulated proteasome activity and enhanced the cells' capacity to destroy misfolded proteins.
Exposure to the fight-or-flight hormone epinephrine produced a similar effect. Epinephrine, or adrenaline in common parlance, is responsible for stimulating the liver and muscle to mobilize energy reserves to boost heart rate and muscle strength during periods of physiologic stress. Liver cells treated with epinephrine showed marked increases in cAMP, as well as enhanced 26S proteasome activity and protein degradation. Epinephrine exposure also boosted proteasome activity -- a marker of protein degradation -- in the working hearts of rats. Similarly, when researchers exposed the kidney cells of mice to vasopressin -- the antidiuretic hormone that helps the body retain water and prevents dehydration -- they observed higher levels of protein degradation as well.
Taken together, these findings demonstrate that the rate of protein degradation can rise and fall swiftly in a variety of tissues in response to shifting conditions and that such changes are mediated by fluctuations in hormone levels. This response was also surprisingly rapid and short-lived, the scientists noted. For example, exposure to the antidiuretic hormone triggered protein breakdown in kidney cells within five minutes and subsided to pre-exposure levels within an hour, the experiments showed.
The findings show that a diverse set of hormones that stimulate the intracellular messenger cAMP appear to share a common mechanism that alters the composition of cells. cAMP-stimulating hormones have long been known to modify gene expression, but this latest research reveals they also play a critical role in cellular "house cleaning" by disposing of proteins that are no longer needed.
A new twist on a classic concept
Even the most mundane of activities -- eating, sleeping, exercise -- require the cells in our body to modulate their composition minute by minute in order to cope with new demands, all in the name of maintaining proper cellular function and averting harm. The new research reveals that some of these protective shifts occur in our cells' protein-disposal system, where misfolded or unneeded proteins are removed promptly and new ones in demand are synthesized swiftly.
The new findings build on observations about the physiologic effects of hormones first made by Harvard Medical School physician Walter Cannon nearly a century ago and elegantly captured in his book The Wisdom of the Body (1932). Some of Cannon's most notable work includes defining the mechanism of action of the hormone epinephrine and its role in the body's fight-or-flight response -- a key survival mechanism marked by a cascade of physiologic changes during times of high stress.
Epinephrine is one of the hormones whose action on the cells' protein-disposal machinery is now illuminated by Goldberg's latest work. In a twist of symbolic coincidence, Goldberg's lab occupies the very space where Cannon made his historic observations on the same hormone a hundred years ago.
"We think ours is truly a neoclassical discovery that builds on findings and observations made right here, in this very building, nearly a century ago," Goldberg said.
significance Most studies of proteolysis by the ubiquitin-proteasome pathway have focused on regulation by ubiquitination. However, we showed that pharmacological agents that raise cAMP and activate protein kinase A, which phosphorylates a proteasome subunit, enhance proteasome activity and the cell’s capacity to selectively degrade misfolded and regulatory proteins. We investigated whether similar adaptations occur under physiological conditions where cAMP rises. Proteasome activity increases by this mechanism in human muscles following intense exercise, in mouse muscles and liver after a brief fast, in hepatocytes after epinephrine or glucagon, and in renal collecting duct cells within 5 minutes of antidiuretic hormone. Thus, hormones and conditions that raise cAMP rapidly enhance proteasome activity and the cells’ capacity to eliminate damaged and preexistent regulatory proteins.
abstract
Pharmacological agents that raise cAMP and activate protein kinase A (PKA) stimulate 26S proteasome activity, phosphorylation of subunit Rpn6, and intracellular degradation of misfolded proteins. We investigated whether a similar proteasome activation occurs in response to hormones and under various physiological conditions that raise cAMP. Treatment of mouse hepatocytes with glucagon, epinephrine, or forskolin stimulated Rpn6 phosphorylation and the 26S proteasomes’ capacity to degrade ubiquitinated proteins and peptides. These agents promoted the selective degradation of short-lived proteins, which are misfolded and regulatory proteins, but not of the bulk of cell proteins, and did not stimulate lysosomal proteolysis. Proteasome activities and Rpn6 phosphorylation increased similarly in working hearts upon epinephrine treatment, in skeletal muscles of exercising humans, and in electrically stimulated rat muscles. In WT mouse kidney cells, but not in cells lacking PKA, treatment with antidiuretic hormone (vasopressin) stimulated proteasomal activity, Rpn6 phosphorylation, and the selective degradation of short-lived cell proteins within 5 minutes. In livers and muscles of mice fasted for 12–48 hours, cAMP levels, Rpn6 phosphorylation, and proteasomal activities increased without any change in proteasomal content. Thus, in vivo cAMP-PKA–mediated proteasome activation is a common cellular response to diverse endocrine stimuli and rapidly enhances the capacity of target tissues to degrade regulatory and misfolded proteins (e.g., proteins damaged upon exercise). The increased destruction of preexistent regulatory proteins may help cells adapt their protein composition to new physiological conditions.
high-intensity workouts won’t work for most people
sciencedaily.com/releases/2017/10/171003124821.htm
mechanical skin injury promotes food anaphylaxis by driving intestinal mast cell expansion
juan-manuel leyva-castillo et al. 2019
doi.org/10.1016/j.immuni.2019.03.023
Atopic dermatitis is a strong risk factor for developing food allergy, but the precise relationship between the two conditions remains unclear. As itching is a major symptom of atopic dermatitis, people with the disease, particularly babies, often scratch their skin. The current study proposes that scratching the skin instigates mast-cell expansion in the intestine.
The researchers found that some cells in the skin respond to scratching -- simulated by applying and removing small strips of tape on the skin of mice -- by producing a cell-signaling protein called IL-33, which enters the bloodstream. When IL-33 reaches the gut, it works in concert with IL-25, a protein secreted by cells in the lining of the intestine, to activate type 2 innate lymphoid cells (ILC2s). Activated ILC2s make two additional cell-signaling proteins, IL-13 and IL-4, which were found to be responsible for the expansion of intestinal mast cells.
The researchers also found that as mast cells expanded, the intestinal lining became more permeable, making it easier for allergens to enter the tissues. Notably, mice that underwent tape stripping had more severe reactions to food allergen than mice that did not. Finally, the researchers found that intestinal biopsies from four children with atopic dermatitis contained more mast cells than those from four children without the condition.
Although additional work is needed to determine the relevance of the findings to humans, the researchers suggest that interventions to limit itching potentially could lessen the severity of food allergy among people with atopic dermatitis.
effects of transcutaneous vagus nerve stimulation in individuals aged 55 years or above: potential benefits of daily stimulation
beatrice bretherton et al. 2019
doi.org/10.18632/aging.102074
a short daily therapy delivered for two weeks led to both physiological and wellbeing improvements, including better quality of life, mood and sleep.
The therapy, called transcutaneous vagus nerve stimulation, delivers a small, painless electrical current to the ear, which sends signals to the body's nervous system through the vagus nerve.
The new research, conducted at the University of Leeds, suggests the therapy may slow down an important effect associated with ageing.
This could help protect people from chronic diseases which we become more prone to as we get older, such as high blood pressure, heart disease and atrial fibrillation. The researchers, who published their findings today in the journal Aging, suggest that the 'tickle' therapy has the potential to help people age more healthily, by recalibrating the body's internal control system.
Lead author Dr Beatrice Bretherton, from the School of Biomedical Sciences at the University of Leeds, said: "The ear is like a gateway through which we can tinker with the body's metabolic balance, without the need for medication or invasive procedures. We believe these results are just the tip of the iceberg.
"We are excited to investigate further into the effects and potential long-term benefits of daily ear stimulation, as we have seen a great response to the treatment so far."
The study was conducted by scientists from the University of Leeds and funded by the Dunhill Medical Trust.
What is the autonomic nervous system?
The autonomic nervous system controls many of the body's functions which don't require conscious thought, such as digestion, breathing, heart rate and blood pressure.
It contains two branches, the sympathetic and the parasympathetic, which work against each other to maintain a healthy balance of activity.
The sympathetic branch helps the body prepare for high intensity 'fight or flight' activity, whilst the parasympathetic is crucial to low intensity 'rest and digest' activity.
As we age, and when we are fighting diseases, the body's balance changes such that the sympathetic branch begins to dominate. This imbalance makes us more susceptible to new diseases and leads to the breakdown of healthy bodily function as we get older.
Clinicians have long been interested in the potential for using electrical currents to influence the nervous system. The vagus nerve, the major nerve of the parasympathetic system, has often been used for electrical stimulation and past research has looked at the possibility of using vagus nerve stimulation to tackle depression, epilepsy, obesity, stroke, tinnitus and heart conditions.
However, this kind of stimulation needs surgery to implant electrodes in the neck region, with associated expense and a small risk of side effects.

Fortunately, there is one small branch of the vagus nerve that can be stimulated without surgery, located in the skin of specific parts of the outer ear.
In Leeds, previous research has shown that applying a small electrical stimulus to the vagus nerve at the ear, which some people perceive as a tickling sensation, improves the balance of the autonomic nervous system in healthy 30-year-olds.
Other researchers worldwide are now investigating if this transcutaneous vagus nerve stimulation (tVNS) could provide a therapy for conditions ranging from heart problems to mental health.
Diane Crossley, aged 70, from Leeds, took part in the study and received the tVNS therapy for two weeks. She said: "I was happy to be a participant in this really interesting study, it helped me with my awareness of my own health.
"It was a fascinating project and I was proud to be part of it."
In their new study, scientists at the University of Leeds wanted to see whether tVNS could benefit over 55-year-olds, who are more likely to have out-of-balance autonomic systems that could contribute to health issues associated with ageing.
They recruited 29 healthy volunteers, aged 55 or above, and gave each of them the tVNS therapy for 15 minutes per day, over a two week period. Participants were taught to self-administer the therapy at home during the study.
The therapy led to an increase in parasympathetic activity and a decrease in sympathetic activity, rebalancing the autonomic function towards that associated with healthy function. In addition, some people reported improvements in measures of mental health and sleeping patterns.
Being able to correct this balance of activity could help us age more healthily, as well as having the potential to help people with a variety of disorders such as heart disease and some mental health issues.
Additionally, improving the balance of the autonomic nervous system lowers an individual's risk of death, as well as the need for medication or hospital visits.
Researchers found that individuals who displayed the greatest imbalance at the start of the study experienced the most pronounced improvements after receiving the therapy.
They suggest that in future it may be possible to identify who is most likely to benefit from the therapy, so it can be offered through a targeted approach.
tVNS therapy has previously been shown to have positive psychological effects for patients with depression, and this study shows it could also have significant physiological benefits.
abstract Ageing is associated with attenuated autonomic function. Transcutaneous vagal nerve stimulation (tVNS) improved autonomic function in healthy young participants. We therefore investigated the effects of a single session of tVNS (studies 1 and 2) and tVNS administered daily for two weeks (study 3) in volunteers aged ≥ 55 years. tVNS was performed using modified surface electrodes on the tragus and connected to a transcutaneous electrical nerve stimulation (TENS) machine. Study 1: participants (n=14) received a single session of tVNS and sham. Study 2: all participants (n=51) underwent a single session of tVNS. Study 3: participants (n=29) received daily tVNS for two weeks. Heart rate variability and baroreflex sensitivity were derived. Quality of life (QoL), mood and sleep were assessed in study 3. tVNS promoted increases in measures of vagal tone and was associated with greater increases in baroreflex sensitivity than sham. Two weeks of daily tVNS improved measures of autonomic function, and some aspects of QoL, mood and sleep. Importantly, findings showed that improvements in measures of autonomic balance were more pronounced in participants with greater baseline sympathetic prevalence. This suggests it may be possible to identify individuals who are likely to encounter significant benefits from tVNS.
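The abstract notes that heart rate variability measures were derived as an index of vagal (parasympathetic) tone. As an illustration only, and not the authors' exact analysis, a standard time-domain vagal-tone proxy (RMSSD) can be computed from a series of RR intervals like this:

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (milliseconds).
    A standard time-domain HRV index commonly used as a proxy for
    parasympathetic (vagal) tone; higher beat-to-beat variability -> higher RMSSD."""
    if len(rr_ms) < 2:
        raise ValueError("need at least two RR intervals")
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A perfectly regular heartbeat gives RMSSD of 0; variable RR intervals give more.
```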
sestrins are evolutionarily conserved mediators of exercise benefits
myungjin kim et al. 2020
doi.org/10.1038/s41467-019-13442-5
“Researchers have previously observed that Sestrin accumulates in muscle following exercise,” said Myungjin Kim, Ph.D., a research assistant professor in the Department of Molecular & Integrative Physiology. Kim, working with professor Jun Hee Lee, Ph.D. and a team of researchers wanted to know more about the protein’s apparent link to exercise. Their first step was to encourage a bunch of flies to work out.
Taking advantage of Drosophila flies’ normal instinct to climb up and out of a test tube, their collaborators Robert Wessells, Ph.D. and Alyson Sujkowski of Wayne State University in Detroit developed a type of fly treadmill. Using it, the team trained the flies for three weeks and compared the running and flying ability of normal flies with that of flies bred to lack the ability to make Sestrin.
“Flies can usually run around four to six hours at this point and the normal flies’ abilities improved over that period,” says Lee. “The flies without Sestrin did not improve with exercise.”
What’s more, when they overexpressed Sestrin in the muscles of normal flies, essentially maxing out their Sestrin levels, they found those flies had abilities above and beyond the trained flies, even without exercise. In fact, flies with overexpressed Sestrin didn’t develop more endurance when exercised.
The beneficial effects of Sestrin include more than just improved endurance. Mice without Sestrin lacked the improved aerobic capacity, improved respiration and fat burning typically associated with exercise.
“We propose that Sestrin can coordinate these biological activities by turning on or off different metabolic pathways,” says Lee. “This kind of combined effect is important for producing exercise’s effects.”
Lee also helped another collaborator, Pura Muñoz-Cánoves, Ph.D., of Pompeu Fabra University in Spain, to demonstrate that muscle-specific Sestrin can also help prevent atrophy in a muscle that’s immobilized, such as the type that occurs when a limb is in a cast for a long period of time. “This independent study again highlights that Sestrin alone is sufficient to produce many benefits of physical movement and exercise,” says Lee.
Could Sestrin supplements be on the horizon? Not quite, says Lee. “Sestrins are not small molecules, but we are working to find small molecule modulators of Sestrin.”
Additionally, adds Kim, scientists still don’t know how exercise produces Sestrin in the body. “This is very critical for future study and could lead to a treatment for people who cannot exercise.”
abstract Exercise is among the most effective interventions for age-associated mobility decline and metabolic dysregulation. Although long-term endurance exercise promotes insulin sensitivity and expands respiratory capacity, genetic components and pathways mediating the metabolic benefits of exercise have remained elusive. Here, we show that Sestrins, a family of evolutionarily conserved exercise-inducible proteins, are critical mediators of exercise benefits. In both fly and mouse models, genetic ablation of Sestrins prevents organisms from acquiring metabolic benefits of exercise and improving their endurance through training. Conversely, Sestrin upregulation mimics both molecular and physiological effects of exercise, suggesting that it could be a major effector of exercise metabolism. Among the various targets modulated by Sestrin in response to exercise, AKT and PGC1α are critical for the Sestrin effects in extending endurance. These results indicate that Sestrin is a key integrating factor that drives the benefits of chronic exercise to metabolism and physical endurance.
the rachitic tooth: the use of radiographs as a screening technique
lori d’ortenzio et al. 2017
doi.org/10.1016/j.ijpp.2017.10.001
tuberculosis mycobacterium exploiting our anti–viral immune system
mycobacterium tuberculosis–induced ifn-β production requires cytosolic dna and rna sensing pathways
yong cheng, jeffrey s. schorey et al. 2018
doi.org/10.1084/jem.20180508
As part of the study the researchers found that mice lacking a key protein required for responding to foreign RNA and therefore required for interferon beta production were better able to control the MTB infection. The discovery was a surprise to the researchers, as interferon beta is essential to controlling several viral infections.
"The results suggest that our immune response to mycobacterial RNA is beneficial for the pathogen and bad for the host. It's the total opposite of viral infections," said Jeff Schorey, George B. Craig Jr. Professor in the Department of Biological Sciences at Notre Dame and co-author of the study. "This study gives us a better understanding of how the mycobacteria causes disease and what makes it the most successful pathogen in human history."
MTB infections cause a battle between the immune response and the ability of the bacteria to circumvent that response -- who wins the battle determines the body's ability to control the infection. Schorey and Yong Cheng, a research assistant professor at Notre Dame, set out to determine how mycobacterial RNA could be affecting the host response. What they found was that by releasing RNA, the bacteria set off a chain reaction inside the macrophage, a cell type of the immune system -- resulting in a mechanism that benefits the survival of MTB through the production of interferon beta.
While researchers have long known that bacteria produce proteins and other compounds to modulate an immune response, such a role for mycobacterial nucleic acids has only recently been defined. In viral infections, as opposed to bacterial infections, the virus releases its nucleic acids as it needs the machinery of the host cell to help make viral proteins and replicate its genome. In contrast, bacteria already have the machinery for these processes in place, suggesting the release of RNA into the host cell is intentional. The authors found that MTB uses its secretion system known as SecA2 to mediate RNA release from the mycobacteria.
"Bacteria have everything they need to make their proteins, so the fact that they were releasing nucleic acids was a surprise," Schorey said. "These bugs are using this RNA-sensing pathway, which has evolved to promote antiviral activity -- so in other words, the bacteria are manipulating our own immune system against us."
MTB is the No. 1 cause of death by an infectious organism, and kills up to 1.8 million people each year. The World Health Organization estimates 200,000 of those deaths are children. Health officials lack an effective vaccine against pulmonary tuberculosis, and antibiotics used to treat the disease must be taken for six to nine months -- a daunting regimen that challenges patient compliance. The disease is prevalent in parts of the world where health care systems lack infrastructure and funding.
Despite those challenges, Schorey, an affiliated faculty member at Notre Dame's Eck Institute for Global Health, said the study's results show potential for the development of immunotherapies to selectively stimulate protective immune responses as a treatment option for MTB and other bacterial infectious diseases.
abstract RNA sensing pathways are key elements in a host immune response to viral pathogens, but little is known of their importance during bacterial infections. We found that Mycobacterium tuberculosis (M.tb) actively releases RNA into the macrophage cytosol using the mycobacterial SecA2 and ESX-1 secretion systems. The cytosolic M.tb RNA induces IFN-β production through the host RIG-I/MAVS/IRF7 RNA sensing pathway. The inducible expression of IRF7 within infected cells requires an autocrine signaling through IFN-β and its receptor, and this early IFN-β production is dependent on STING and IRF3 activation. M.tb infection studies using Mavs−/− mice support a role for RNA sensors in regulating IFN-β production and bacterial replication in vivo. Together, our data indicate that M.tb RNA is actively released during an infection and promotes IFN-β production through a regulatory mechanism involving cross-talk between DNA and RNA sensor pathways, and our data support the hypothesis that bacterial RNA can drive a host immune response.
exposure to magnetic field non-ionizing radiation and the risk of miscarriage: a prospective cohort study
de-kun li et al. 2017
doi.org/10.1038/s41598-017-16623-8
association of modifiable risk factors in young adulthood with racial disparity in incident type 2 diabetes during middle adulthood
michael p. bancks et al. 2017
doi.org/10.1001/jama.2017.19546
destructive disinfection of infected brood prevents systemic disease spread in ant colonies
christopher d pull et al. 2017
doi.org/10.7554/elife.32073
antithyroid drugs and congenital malformations
gi hyeon seo et al. 2018
doi.org/10.7326/m17-1398
note: carbimazole (sold under the brand name vidalta) is converted after absorption to the active form, methimazole.
elevated levels of the reactive metabolite methylglyoxal recapitulate progression of type 2 diabetes
alexandra moraru et al. 2018
doi.org/10.1016/j.cmet.2018.02.003
sodium bicarbonate NaHCO3
oral NaHCO3 activates a splenic anti-inflammatory pathway: evidence that cholinergic signals are transmitted via mesothelial cells
sarah c. ray et al. 2018
doi.org/10.4049/jimmunol.1701605
acid suspends the circadian clock in hypoxia through inhibition of mtor
zandra e. walton et al. 2018
doi.org/10.1016/j.cell.2018.05.009
•Metabolic adaptation to hypoxia elevates acid production
•Low pH suppresses oscillation of the molecular clock and circadian transcriptome
•Acid scatters lysosomes, thereby silencing mTORC1 through separation from RHEB
•mTORC1 inhibition by acid dampens clock network translation and collapses the clock
Recent reports indicate that hypoxia influences the circadian clock through the transcriptional activities of hypoxia-inducible factors (HIFs) at clock genes. Unexpectedly, we uncover a profound disruption of the circadian clock and diurnal transcriptome when hypoxic cells are permitted to acidify to recapitulate the tumor microenvironment. Buffering against acidification or inhibiting lactic acid production fully rescues circadian oscillation. Acidification of several human and murine cell lines, as well as primary murine T cells, suppresses mechanistic target of rapamycin complex 1 (mTORC1) signaling, a key regulator of translation in response to metabolic status. We find that acid drives peripheral redistribution of normally perinuclear lysosomes away from perinuclear RHEB, thereby inhibiting the activity of lysosome-bound mTOR. Restoring mTORC1 signaling and the translation it governs rescues clock oscillation. Our findings thus reveal a model in which acid produced during the cellular metabolic response to hypoxia suppresses the circadian clock through diminished translation of clock constituents.
alzheimer's
corroboration of a major role for herpes simplex virus type 1 in alzheimer’s disease
ruth f. itzhaki 2018
doi.org/10.3389/fnagi.2018.00324
“It is worth stressing though that these data and the preceding evidence for a role of HSV1 in AD, do not preclude a role for bacteria, in particular, Borrelia, Chlamydia pneumoniae, and some oral bacteria, which are probably the microbes most strongly implicated in AD (see review, Miklossy and McGeer, 2016): one or more such microbes might be involved, leading to the disease in the sizeable proportion of AD patients whose illness is not accounted for by HSV1 (in combination with APOE-ε4).”
“As to the effects of treating AD patients long term, ACV causes few side-effects except in renally impaired patients; these should therefore be excluded from relevant trials”
association of chronic low-grade inflammation with risk of alzheimer disease in apoe4 carriers
qiushan tao et al. 2018
doi.org/10.1001/jamanetworkopen.2018.3597
lifespan changes of the human brain in alzheimer’s disease
pierrick coupé et al. 2019
doi.org/10.1038/s41598-019-39809-8
A model which traces brain changes and activity over an entire life span using a massive set of over 4,000 MRI scans processed with the volBrain platform. Prior to this, scientists did not have images covering every period of an Alzheimer patient's life. Researchers suggested modelling the changes generally seen in the volumes of different structures using a vast quantity of samples in order to pinpoint where healthy brains diverged from diseased ones over time.
Based on MRI scans from 2,944 healthy control subjects between the ages of 9 months and 94 years, the team developed a 'normal' model of average brain changes, which they compared to a pathological model based on MRIs from 1,385 Alzheimer's patients aged over 55 and 1,877 young control subjects. Their results show an early divergence between the pathological models and the normal trajectory of ageing of the hippocampus before age 40, and of the amygdala around age 40. Both of these structures suffer atrophy in the presence of Alzheimer's disease. Also evident is an early enlargement, in patients with the disease, of an internal cavity in the brain known as the lateral ventricle. This enlargement is part of the ageing process in normal subjects too, however, thus limiting the pertinence of this measurement in subjects of an advanced age, and reaffirming the usefulness of studying biomarkers across an entire life span.
abstract Brain imaging studies have shown that slow and progressive cerebral atrophy characterized the development of Alzheimer’s Disease (AD). Despite a large number of studies dedicated to AD, key questions about the lifespan evolution of AD biomarkers remain open. When does the AD model diverge from the normal aging model? What is the lifespan trajectory of imaging biomarkers for AD? How do the trajectories of biomarkers in AD differ from normal aging? To answer these questions, we proposed an innovative way by inferring brain structure model across the entire lifespan using a massive number of MRI (N = 4329). We compared the normal model based on 2944 control subjects with the pathological model based on 3262 patients (AD + mild cognitive impairment subjects) older than 55 years and controls younger than 55 years. Our study provides evidence of early divergence of the AD models from the normal aging trajectory before 40 years for the hippocampus, followed by the lateral ventricles and the amygdala around 40 years. Moreover, our lifespan model reveals the evolution of these biomarkers and suggests close abnormality evolution for the hippocampus and the amygdala, whereas trajectory of ventricular enlargement appears to follow an inverted U-shape. Finally, our models indicate that medial temporal lobe atrophy and ventricular enlargement are two mid-life physiopathological events characterizing AD brain.
macrophage migration inhibitory factor is subjected to glucose modification and oxidation in alzheimer’s disease
omar kassaar et al. 2017
doi.org/10.1038/srep42874
spontaneous isomerization of long-lived proteins provides a molecular mechanism for the lysosomal failure observed in alzheimer’s disease
tyler r. lambeth et al. 2019
doi.org/10.1021/acscentsci.9b00369
"The dominant theory based on beta-amyloid buildup has been around for decades, and dozens of clinical trials based on that theory have been attempted, but all have failed," said Ryan R. Julian, a professor of chemistry who led the research team. "In addition to plaques, lysosomal storage is observed in brains of people who have Alzheimer's disease. Neurons -- fragile cells that do not undergo cell division -- are susceptible to lysosomal problems, specifically, lysosomal storage, which we report is a likely cause of Alzheimer's disease."
Study results appear in ACS Central Science, a journal of the American Chemical Society.
An organelle within the cell, the lysosome serves as the cell's trashcan. Old proteins and lipids get sent to the lysosome to be broken down to their building blocks, which are then shipped back out to the cell to be built into new proteins and lipids. To maintain functionality, the synthesis of proteins is balanced by the degradation of proteins.
The lysosome, however, has a weakness: If what enters does not get broken down into little pieces, then those pieces also can't leave the lysosome. The cell decides the lysosome is not working and "stores" it, meaning the cell pushes the lysosome to the side and proceeds to make a new one. If the new lysosome also fails, the process is repeated, resulting in lysosome storage.
"The brains of people who have lysosomal storage disorder, another well-studied disease, and the brains of people who have Alzheimer's disease are similar in terms of lysosomal storage," Julian said. "But lysosomal storage disorder symptoms show up within a few weeks after birth and are often fatal within a couple of years. Alzheimer's disease occurs much later in life. The time frames are, therefore, very different."
Julian's collaborative team of researchers in the Department of Chemistry and the Division of Biomedical Sciences at UC Riverside posits that long-lived proteins can undergo spontaneous modifications that can make them undigestible by the lysosomes.
"Long-lived proteins become more problematic as we age and could account for the lysosomal storage seen in Alzheimer's, an age-related disease," Julian said. "If we are correct, it would open up new avenues for treatment and prevention of this disease."
He explained that the changes occur in the fundamental structure of the amino acids that make up the proteins and are the equivalent of flipping the handedness of the amino acids, with amino acids spontaneously acquiring the mirror images of their original structures.
"Enzymes that ordinarily break down the protein are then not able to do so because they are unable to latch onto the protein," Julian added. "It's like trying to fit a left-handed glove on your right hand. We show in our paper that this structural modification can happen in beta-amyloid and tau, proteins relevant to Alzheimer's disease. These proteins undergo this chemistry that is almost invisible, which may explain why researchers have not paid attention to it."
Julian explained these spontaneous changes in protein structure are a function of time, taking place if the protein hangs around for too long.
"It's been long known that these modifications happen in long-lived proteins, but no one has ever looked at whether these modifications could prevent the lysosomes from being able to break down the proteins," he said. "One way to prevent this would be to recycle the proteins so that they are not sitting around long enough to go through these chemical modifications. Currently, no drugs are available to stimulate this recycling -- a process called autophagy -- for Alzheimer's disease treatment."
The research was done in the lab on living cells provided by Byron D. Ford, a professor of biomedical sciences in the School of Medicine. The findings could have implications for other age-related diseases such as macular degeneration and cardiac diseases linked to lysosomal pathology.
abstract Proteinaceous aggregation is a well-known observable in Alzheimer’s disease (AD), but failure and storage of lysosomal bodies within neurons is equally ubiquitous and actually precedes bulk accumulation of extracellular amyloid plaque. In fact, AD shares many similarities with certain lysosomal storage disorders though establishing a biochemical connection has proven difficult. Herein, we demonstrate that isomerization and epimerization, which are spontaneous chemical modifications that occur in long-lived proteins, prevent digestion by the proteases in the lysosome (namely, the cathepsins). For example, isomerization of aspartic acid into l-isoAsp prevents digestion of the N-terminal portion of Aβ by cathepsin L, one of the most aggressive lysosomal proteases. Similar results were obtained after examination of various target peptides with a full series of cathepsins, including endo-, amino-, and carboxy-peptidases. In all cases peptide fragments too long for transporter recognition or release from the lysosome persisted after treatment, providing a mechanism for eventual lysosomal storage and bridging the gap between AD and lysosomal storage disorders. Additional experiments with microglial cells confirmed that isomerization disrupts proteolysis in active lysosomes. These results are easily rationalized in terms of protease active sites, which are engineered to precisely orient the peptide backbone and cannot accommodate the backbone shift caused by isoaspartic acid or side chain dislocation resulting from epimerization. Although Aβ is known to be isomerized and epimerized in plaques present in AD brains, we further establish that the rates of modification for aspartic acid in positions 1 and 7 are fast and could accrue prior to plaque formation. 
Spontaneous chemistry can therefore provide modified substrates capable of inducing gradual lysosomal failure, which may play an important role in the cascade of events leading to the disrupted proteostasis, amyloid formation, and tauopathies associated with AD.
in vivo detection of cerebral tau pathology in long-term survivors of traumatic brain injury
nikos gorgoraptis et al. 2019
doi.org/10.1126/scitranslmed.aaw1993
In the early-stage study, researchers studied 21 patients who had suffered a moderate to severe head injury at least 18 years earlier (mostly from traffic accidents), as well as 11 healthy individuals who had not experienced a head injury.
The research, from scientists at Imperial's Dementia Research Institute as well as the University of Glasgow, showed some of these patients had clumps of protein in their brain called tau tangles.
The team, who recruited patients from the Institute of Health and Wellbeing at the University of Glasgow and from Imperial College Healthcare NHS Trust, say the research may accelerate the development of treatments that break down tau tangles, by enabling medics to monitor the amount of the protein.
Tau normally helps provide structural support to nerve cells in the brain - acting as a type of scaffolding, but when brain cells become damaged - for instance during a head injury, the protein may form clumps, or tangles.
Tau tangles are found in Alzheimer's disease and other forms of dementia, and associated with progressive nerve damage.
Scientists have known for some time that repeated head injury - such as those sustained in sports such as boxing, rugby and American Football - can lead to neurodegeneration and dementia in later life - with particularly strong links to a type of brain condition called chronic traumatic encephalopathy.
However, this is the first time scientists have seen the protein tangles in living patients who have suffered a single, severe head injury, explains Dr Nikos Gorgoraptis, author of the paper from Imperial's Department of Brain Sciences.
"Scientists increasingly realise that head injuries have a lasting legacy in the brain - and can continue to cause damage decades after the initial injury. However, up until now most of the research has focussed on the people who have sustained multiple head injuries, such as boxers and American Football players. This is the first time we have seen these protein tangles in patients who have sustained a single head injury."
Dr Gorgoraptis adds that although these tangles have been detected in the brains of patients in post-mortem examination - where findings suggest around one in three patients with a single head injury develop protein tangles - they have not before been seen in the brains of living patients.
The study used a type of brain scan, called a PET scan, combined with a substance that binds to tau protein, called flortaucipir, to study the amount of tau protein in the brains of head injury patients.
The results revealed that, collectively, patients with head injury were more likely to have tau tangles. The paper also showed that patients with tau tangles had higher levels of nerve damage, particularly in the white matter of the brain. None of the healthy individuals had tau tangles.
Interestingly, the results revealed patients with higher levels of tau tangles did not necessarily have any reduction in brain function, such as memory problems, compared to patients with fewer tangles.
However, Dr Gorgoraptis adds these tangles can develop years before a person starts to develop symptoms such as memory loss. He explained there are still many questions to answer about the tau tangles and brain damage.
"This research adds a further piece in the puzzle of head injury and the risk of neurodegeneration. Not all patients with head injury develop these protein tangles, and some patients can have them for many years without developing symptoms. While we know tau tangles are associated with Alzheimer's and other forms of dementia, we are only beginning to understand how brain trauma might lead to their formation. What is exciting about this study is this is the first step towards a scan that can give a clear indication of how much tau is in the brain, and where it is located. As treatments develop over the coming years that might target tau tangles, these scans will help doctors select the patients who may benefit and monitor the effectiveness of these treatments."
abstract Traumatic brain injury (TBI) can trigger progressive neurodegeneration, with tau pathology seen years after a single moderate-severe TBI. Identifying this type of posttraumatic pathology in vivo might help to understand the role of tau pathology in TBI pathophysiology. We used flortaucipir positron emission tomography (PET) to investigate whether tau pathology is present many years after a single TBI in humans. We examined PET data in relation to markers of neurodegeneration in the cerebrospinal fluid (CSF), structural magnetic resonance imaging measures, and cognitive performance. Cerebral flortaucipir binding was variable, with many participants with TBI showing increases in cortical and white matter regions. At the group level, flortaucipir binding was increased in the right occipital cortex in TBI when compared to healthy controls. Flortaucipir binding was associated with increased total tau, phosphorylated tau, and ubiquitin carboxyl-terminal hydrolase L1 CSF concentrations, as well as with reduced fractional anisotropy and white matter tissue density in TBI. Apolipoprotein E (APOE) ε4 genotype affected the relationship between flortaucipir binding and time since injury, CSF β amyloid 1–42 (Aβ42) concentration, white matter tissue density, and longitudinal Mini-Mental State Examination scores in TBI. The results demonstrate that tau PET is a promising approach to investigating progressive neurodegeneration associated with tauopathy after TBI.
sounds like a healthy retail atmospheric strategy: effects of ambient music and background noise on food sales
dipayan biswas et al. 2018
doi.org/10.1007/s11747-018-0583-8
laterally confined growth of cells induces nuclear reprogramming in the absence of exogenous biochemical factors
bibhas roy et al. 2018
doi.org/10.1073/pnas.1714770115
gene-by-environment interactions in urban populations modulate risk phenotypes
marie-julie favé et al. 2018
doi.org/10.1038/s41467-018-03202-2
mean air temperature as a risk factor for stroke mortality in são paulo, brazil
priscilla v. ikefuti et al. 2018
doi.org/10.1007/s00484-018-1554-y
airway
childhood immune imprinting to influenza a shapes birth year-specific risk during seasonal h1n1 and h3n2 epidemics
katelyn m. gostic et al. 2020
doi.org/10.1371/journal.ppat.1008109
reported in 2016 that exposure to influenza viruses during childhood gives people partial protection for the rest of their lives against distantly related influenza viruses. Biologists call the idea that past exposure to the flu virus determines a person’s future response to infections “immunological imprinting.”
The 2016 research helped overturn a commonly held belief that previous exposure to a flu virus conferred little or no immunological protection against strains that can jump from animals into humans, such as those causing the strains known as swine flu or bird flu. Those strains, which have caused hundreds of spillover cases of severe illness and death in humans, are of global concern because they could gain mutations that allow them to readily jump not only from animal populations to humans, but also to spread rapidly from person to person.
In the new study, the researchers investigated whether immunological imprinting could explain people’s response to flu strains already circulating in the human population and to what extent it could account for observed discrepancies in how severely the seasonal flu affects people in different age groups.
To track how different strains of the flu virus affect people at different ages, the team analyzed health records that the Arizona Department of Health Services obtains from hospitals and private physicians.
Two subtypes of influenza virus, H3N2 and H1N1, have been responsible for seasonal outbreaks of the flu over the past several decades. H3N2 causes the majority of severe cases in high-risk elderly people and the majority of deaths from the flu. H1N1 is more likely to affect young and middle-aged adults, and causes fewer deaths.
The health record data revealed a pattern: People first exposed to the less severe strain, H1N1, during childhood were less likely to end up hospitalized if they encountered H1N1 again later in life than people who were first exposed to H3N2. And people first exposed to H3N2 received extra protection against H3N2 later in life.
The researchers also analyzed the evolutionary relationships between the flu strains. H1N1 and H3N2, they learned, belong to two separate branches on the influenza “family tree,” said James Lloyd-Smith, a UCLA professor of ecology and evolutionary biology and one of the study’s senior authors. While infection with one does result in the immune system being better prepared to fight a future infection from the other, protection against future infections is much stronger when one is exposed to strains from the same group one has battled before, he said.
The records also revealed another pattern: People whose first childhood exposure was to H2N2, a close cousin of H1N1, did not have a protective advantage when they later encountered H1N1. That phenomenon was much more difficult to explain, because the two subtypes are in the same group, and the researchers’ earlier work showed that exposure to one can, in some cases, grant considerable protection against the other.
“Our immune system often struggles to recognize and defend against closely related strains of seasonal flu, even though these are essentially the genetic sisters and brothers of strains that circulated just a few years ago,” said lead author Katelyn Gostic, who was a UCLA doctoral student in Lloyd-Smith’s laboratory when the study was conducted and is now a postdoctoral fellow at the University of Chicago. “This is perplexing because our research on bird flu shows that deep in our immune memory, we have some ability to recognize and defend against the distantly related, genetic third cousins of the strains we saw as children.
“We hope that by studying differences in immunity against bird flus — where our immune system shows a natural ability to deploy broadly effective protection — and against seasonal flus — where our immune system seems to have bigger blind spots — we can uncover clues useful to universal influenza vaccine development.”
Around the world, influenza remains a major killer. The past two flu seasons have been more severe than expected, said Michael Worobey, a co-author of the study and head of the University of Arizona’s department of ecology and evolutionary biology. In the 2017-18 season, 80,000 people died in the U.S., more than in the swine flu pandemic of 2009, he said.
People who had their first bout of flu as children in 1955 — when H1N1 was circulating but H3N2 was not — were much more likely to be hospitalized with an H3N2 infection than an H1N1 infection last year, when both strains were circulating, Worobey said.
“The second subtype you’re exposed to is not able to create an immune response that is as protective and durable as the first,” he said.
abstract Across decades of co-circulation in humans, influenza A subtypes H1N1 and H3N2 have caused seasonal epidemics characterized by different age distributions of cases and mortality. H3N2 causes the majority of severe, clinically attended cases in high-risk elderly cohorts, and the majority of overall deaths, whereas H1N1 causes fewer deaths overall, and cases shifted towards young and middle-aged adults. These contrasting age profiles may result from differences in childhood imprinting to H1N1 and H3N2 or from differences in evolutionary rate between subtypes. Here we analyze a large epidemiological surveillance dataset to test whether childhood immune imprinting shapes seasonal influenza epidemiology, and if so, whether it acts primarily via homosubtypic immune memory or via broader, heterosubtypic memory. We also test the impact of evolutionary differences between influenza subtypes on age distributions of cases. Likelihood-based model comparison shows that narrow, within-subtype imprinting shapes seasonal influenza risk alongside age-specific risk factors. The data do not support a strong effect of evolutionary rate, or of broadly protective imprinting that acts across subtypes. Our findings emphasize that childhood exposures can imprint a lifelong immunological bias toward particular influenza subtypes, and that these cohort-specific biases shape epidemic age distributions. As a consequence, newer and less “senior” antibody responses acquired later in life do not provide the same strength of protection as responses imprinted in childhood. Finally, we project that the relatively low mortality burden of H1N1 may increase in the coming decades, as cohorts that lack H1N1-specific imprinting eventually reach old age.
protein crystallization promotes type 2 immunity and is reversible by antibody treatment
emma k. persson et al. 2019
doi.org/10.1126/science.aaw4295
In 1853, Jean-Martin Charcot at the renowned Salpêtrière Hospital in Paris reported detailed sketches of bipyramidal crystals that he had observed in the sputum of patients suffering from asthma, an observation also made by Ernst von Leyden in 1872. These crystalline deposits became widely known as Charcot-Leyden crystals (CLCs) in the medical world. Since then, they have been described in widespread chronic allergic and inflammatory diseases such as asthma, bronchitis, allergic rhinitis, and rhinosinusitis. It was, however, only during the last couple of decades that the content of CLCs was confirmed as being made up of the protein galectin-10 (Gal10), finally settling speculations and debates that lasted for nearly a century and a half. Galectin-10 is one of the most abundant proteins in eosinophils, which help to mount an inflammatory response in humans. Surprisingly, Gal10 remains largely soluble in eosinophils and only forms crystals once it has been released as part of an immunological defense. The function of Gal10 also remained elusive.
Do these crystals cause harm?
Spearheaded by Emma Persson, Kenneth Verstraete, and Ines Heyndrickx, the team of researchers set out to test a longstanding unresolved hypothesis: do CLCs stimulate immunity in the lung and contribute to excessive inflammatory responses leading to disease?
Prof. Bart Lambrecht (VIB-UGent): "Every medical doctor learns about Charcot-Leyden Crystals during medical training and everybody associates such crystals with the presence of eosinophils. They are very often found in the sputum of asthma patients, particularly in those patients with severe disease. Yet nobody really knew what these crystals were doing and why they are there in the first place. By analogy with the disease gout -where uric acid crystals cause a very painful attack of joint inflammation- we reasoned that Charcot-Leyden crystals might also cause harm in the lungs of asthma patients"
Crystals versus solution
There were many technical challenges to overcome in testing this idea. The scientists had to find a way to produce millions of Gal10 crystals in the laboratory for research purposes and establish that these were identical to the CLCs found in patients. The researchers used precious patient-derived crystals to determine the three-dimensional structure of Gal10 down to the atomic scale. This provided a 'holy-grail' kind of answer, confirming that experimentally produced CLCs are identical to patient-derived CLCs.
Prof. Savvas Savvides (VIB-UGent): "This is the first time in biochemical and medical history that patient-derived protein crystals are studied at atomic resolution. It is utterly remarkable that such microscopic crystals, which are merely a few micrometers in size (about a thousandth of a millimeter), survived the laborious and harsh experimental path that started in a hospital operation theater and ended at a specialized X-ray beamline of a European synchrotron radiation facility. And to top it off they yielded data that led to a beautiful three-dimensional structure of the protein molecules inside them."
The researchers found that Gal10 induced a fully blown immune response only when it was in the crystalline state. In solution, Gal10 was harmless. Most importantly, crystalline Gal10 in the form of Charcot-Leyden crystals induced key features of asthma, including the production of altered mucus that is a big problem for most asthmatics. Thus, the study already delivered a major breakthrough with crystal clear conclusions.
Looking for a solution
The group then studied whether interfering with CLC formation would be a therapeutic option for asthmatics. This is exactly where argenx, a Ghent-based biotechnology company, stepped in. The combined teams developed antibodies that react specifically against CLCs. Remarkably, the antibodies were able to dissolve CLCs within minutes in a petri dish in the lab and within a few hours in the mucus of patients (also in vitro). Use of these antibodies in mouse models of asthma led to a strong reduction in lung inflammation, lung function alterations and mucus production.
Prof. Savvides: "It was like a 'now you see it, now you don't' display of molecular magic. I have spent 25 years learning and agonizing about how to grow protein crystals for structural biology, and all of a sudden, I was seeing protein crystals dissolve in real time! And to top it off we also got to visualize how these antibodies actually do their magic by determining their crystal structure in complex with their antigen!"
Prof. Bart Lambrecht: "Our research results were unexpected and crystal clear at the same time. I was completely struck by the fact that antibodies can rapidly dissolve CLCs that are so abundantly present in the native mucus of patients. Although more tests are needed, the data in mouse models suggest that use of these antibodies could be a very effective way of reducing excessive inflammation and mucus accumulation in the lungs of patients with asthma. Since there are no drugs currently targeting mucus accumulation in the airways, this could be a game changer for treating this disease."
abstract Charcot-Leyden crystals (CLCs) are formed from the eosinophil granule protein galectin-10 (Gal10) and found in severe eosinophil-associated diseases like asthma and chronic rhinosinusitis. Whether CLCs actively contribute to disease pathogenesis is unknown. Persson et al. found that lab-grown Gal10 crystals are biosimilar to CLCs (see the Perspective by Allen and Sutherland). When given to mice, the crystals acted as a type 2 adjuvant, mimicking many of the features of human asthma. In contrast, a Gal10 mutein unable to crystallize had no effect. Antibodies against epitopes crucial for Gal10 autocrystallization could dissolve both in vitro–generated Gal10 crystals and patient-derived CLCs. Furthermore, these anti-Gal10 antibodies reversed the effects of Gal10 crystals in a humanized mouse model of asthma, suggesting a potential therapeutic approach for crystallopathies more broadly.
anti-influenza activity of elderberry (sambucus nigra)
golnoosh torabian et al. 2019
doi.org/10.1016/j.jff.2019.01.031
compounds from elderberries can directly inhibit the virus's entry and replication in human cells, and can help strengthen a person's immune response to the virus.
Although elderberry's flu-fighting properties have long been observed, the group performed a comprehensive examination of the mechanism by which elderberry phytochemicals (plant compounds that positively affect health) combat influenza infections.
"What our study has shown is that the common elderberry has a potent direct antiviral effect against the flu virus. It inhibits the early stages of an infection by blocking key viral proteins responsible for both the viral attachment and entry into the host cells," said Dr Golnoosh Torabian.
The researchers used commercially farmed elderberries which were turned into a juice serum and were applied to cells before, during and after they had been infected with the influenza virus.
The phytochemicals from the elderberry juice were shown to be effective at stopping the virus from infecting the cells; however, to the surprise of the researchers, they were even more effective at inhibiting viral propagation at later stages of the influenza cycle, when the cells had already been infected with the virus.
"This observation was quite surprising and rather significant because blocking the viral cycle at several stages has a higher chance of inhibiting the viral infection," explained Dr Peter Valtchev.
"In addition to that, we identified that the elderberry solution also stimulated the cells to release certain cytokines, which are chemical messengers that the immune system uses for communication between different cell types to coordinate a more efficient response against the invading pathogen," said Centre Director, Professor Fariba Deghani.
The team also found that the elderberry's antiviral activity can be attributed to its anthocyanidin compounds -- phytonutrients responsible for giving the fruit its vivid purple colouring.
Otherwise known as Sambucus nigra, the elderberry is a small, antioxidant rich fruit common to Europe and North America that is still commonly consumed as a jam or wine.
The influenza virus is one of the leading causes of mortality worldwide, affecting nearly 10 per cent of the world population and contributing to one million deaths annually.
abstract •Elderberry exhibits multiple modes of therapeutic action against influenza infection.
•Elderberry showed a mild inhibitory effect at early stages of the influenza cycle.
•Its impact on the post-infection phase is considerably stronger than the early stages.
•Elderberry possesses immunomodulatory property through stimulation of cytokines.
•Cyn 3-glu had direct effects on influenza but did not stimulate the immune system.
Elderberry extract is effective in the treatment of flu. This study aimed to determine the mechanism of action of elderberry and its primary active compound, cyanidin 3-glucoside (cyn 3-glu), against influenza virus. The direct effect was studied via hemagglutination inhibition assay, plaque reduction assay, and flow cytometry analysis. In addition, to assess the indirect immunomodulatory effect, the modulation of pro-inflammatory cytokines was evaluated. Elderberry showed a mild inhibitory effect at the early stages of the influenza virus cycle, with a considerably stronger effect (therapeutic index of 12 ± 1.3) in the post-infection phase. Our data further support both direct effects of elderberry extract, by blocking viral glycoproteins, and indirect effects, by increased expression of IL-6, IL-8, and TNF. Cyn 3-glu, despite demonstrating a similar direct mechanism of action (IC50 of 0.069 mg/ml) to the elderberry juice, did not affect the expression of pro-inflammatory cytokines. In conclusion, elderberry exhibits multiple modes of therapeutic action against influenza infection.
a missense variant in slc39a8 is associated with severe idiopathic scoliosis
gabe haller et al. 2018
doi.org/10.1038/s41467-018-06705-0
food-safe modification of stainless steel food-processing surfaces to reduce bacterial biofilms
tarek s. awad et al. 2018
doi.org/10.1021/acsami.8b03788
drug synergy slows aging and improves healthspan through igf and srebp lipid signaling
tesfahun dessale admasu et al. 2018
doi.org/10.1016/j.devcel.2018.09.001
subcutaneous fat reduction with injected ice slurry
lilit garibyan et al. 2020
doi.org/10.1097/prs.0000000000006658
injectable ice “slurry,” a sterile solution of normal saline and glycerol (a common food ingredient) containing approximately 20% to 40% small ice particles, similar in texture to slush. The solution can be injected directly into fat deposits, causing the fat cells (adipocytes) to crystallize and die and fat deposits to shrink. The killed adipocytes are gradually eliminated by the body over a period of weeks. “One of the cool things about this is how the injected slurry causes selective effects on fat,” said Rox Anderson, MD, a co-author and leader of the Wellman Center. “Even if the slurry is injected into other tissue such as muscle, there is no significant injury.”
As the investigators report, injection of the ice solution into pigs resulted in a 55% reduction in fat thickness compared to that of pigs injected with the same but melted ice solution. There was no damage to skin or muscle at the injection site, and no systemic side effects or abnormalities seen.
Unlike topical cooling, slurry injection can target and remove fat tissue at essentially any depth and any site that can be accessed by a needle or catheter. Injection of physiological ice slurry could be a transformative method for nonsurgical body contouring.
abstract Cryolipolysis is a non-invasive method for removal of subcutaneous fat for body contouring. Conventional cryolipolysis with topical cooling requires extracting heat from subcutaneous fat by conduction across the skin, thus limiting the amount and the location of the fat removed. We hypothesized that local injection of a physiologic ice slurry directly into target adipose tissue would lead to more efficient and effective cryolipolysis.
Methods:
Injectable slurries containing 20% and 40% ice content were made using common parenteral agents (normal saline and glycerol), then locally injected into the subcutaneous fat of swine. Ultrasound imaging, photography, histological and gross tissue responses were monitored before and periodically up to 8 weeks after injection.
Results:
Fat loss occurred gradually over several weeks following a single ice slurry injection. There was an obvious and significant 55 ± 6% reduction in adipose tissue thickness compared to control sites injected with the same volume of melted slurry (p<0.001, Student’s t test). The amount of fat loss correlated with the total volume of ice injected. There was no scarring or damage to surrounding tissue.
Conclusions:
Physiological ice slurry injection is a promising new strategy for selective and nonsurgical fat removal.
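The 55 ± 6% reduction versus melted-slurry controls was assessed with a Student's t test. As a minimal sketch of that comparison in stdlib Python — the fat-thickness reductions below are invented for illustration, not the study's data:

```python
import math
from statistics import mean, stdev

# Hypothetical per-site fat-thickness reductions (%), slurry-injected
# vs. melted-control sites -- illustrative numbers, not the study's data.
slurry = [52.0, 61.0, 55.0, 58.0, 49.0, 56.0]
control = [2.0, -1.0, 4.0, 0.0, 3.0, 1.0]

def students_t(a, b):
    """Two-sample Student's t statistic with pooled variance."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

t = students_t(slurry, control)
print(f"mean reduction: {mean(slurry):.0f} ± {stdev(slurry):.0f}%  t = {t:.1f}")
```

With 10 degrees of freedom here, any |t| above roughly 2.2 corresponds to p < 0.05, so a t statistic this large would be highly significant.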
effect of weight loss on upper airway anatomy and the apnea hypopnea index: the importance of tongue fat
stephen h. wang et al. 2020
doi.org/10.1164/rccm.201903-0692oc
sleep apnea, a serious health condition in which breathing repeatedly stops and starts, causing patients to wake up randomly throughout their sleep cycles. The condition, which is usually marked by loud snoring, can increase your risk for high blood pressure and stroke. While obesity is the primary risk factor for developing sleep apnea, there are other causes, such as having large tonsils or a recessed jaw. CPAP (continuous positive airway pressure) machines improve sleep apnea in about 75 percent of patients, studies suggest, but for the other 25 percent — those who may have trouble tolerating the machine — alternative treatment options, such as oral appliances or upper airway surgery, are more complicated.
A 2014 study led by Schwab compared obese patients with and without sleep apnea, and found that the participants with the condition had significantly larger tongues and a higher percentage of tongue fat when compared to those without sleep apnea. The researchers' next step was to determine whether reducing tongue fat would improve symptoms, and to further examine cause and effect.
The new study included 67 participants with mild to severe obstructive sleep apnea who were obese — those with a body mass index greater than 30.0. Through diet or weight loss surgery, the patients lost nearly 10 percent of their body weight, on average, over six months. Overall, the participants’ sleep apnea scores improved by 31 percent after the weight loss intervention, as measured by a sleep study.
Before and after the weight loss intervention, the study participants underwent MRI scans of both their pharynx and their abdomen. Then, using a statistical analysis, the research team quantified the associations between overall weight loss and reductions in the volumes of the upper airway structures to determine which structures led to the improvement in sleep apnea. The team found that a reduction in tongue fat volume was the primary link between weight loss and sleep apnea improvement.
The study also found that weight loss resulted in reduced pterygoid (a jaw muscle that controls chewing) and pharyngeal lateral wall (muscles on the sides of the airway) volumes. Both these changes also improved sleep apnea, but not to the same extent as the reduction in tongue fat.
The authors believe that tongue fat is a potential new therapeutic target for improving sleep apnea. They suggest that future studies could be designed to explore whether certain low-fat diets are better than others in reducing tongue fat and whether cold therapies — like those used to reduce stomach fat — might be applied to reducing tongue fat. However, Schwab notes, these types of interventions have not yet been tested.
Schwab’s team is also examining new interventions and other risk factors for sleep apnea, including whether some patients who are not obese but who have “fatty” tongues could be predisposed to sleep apnea, but are less likely to be diagnosed.
In a recent related study, Schwab found that ethnicity may also play a role in sleep apnea severity. His research team compared the upper airway anatomy of Chinese and Icelandic patients with sleep apnea, and found that, compared to Icelandic patients of similar age, gender, and symptoms, Chinese patients had smaller airways and soft tissues, but bigger soft palate volume with more bone restrictions. This means that Asian patients may generally be more at risk for severe sleep apnea symptoms. The bottom line, according to Schwab, is that all patients who suffer from snoring or sleepiness should be screened for sleep apnea, whether or not they appear to fall into the typical “high-risk” obese categories.
“Primary care doctors, and perhaps even dentists, should be asking about snoring and sleepiness in all patients, even those who have a normal body mass index, as, based on our data, they may also be at risk for sleep apnea,” Schwab said.
abstract Obesity is the primary risk factor for obstructive sleep apnea (OSA). Tongue fat is increased in obese persons with OSA, and may explain the relationship between obesity and OSA. Weight loss improves OSA, but the mechanism is unknown. Objectives: To determine the effect of weight loss on upper airway (UA) anatomy in persons with obesity and OSA. We hypothesized that weight loss would decrease soft tissue volumes and tongue fat and these changes would correlate with reductions in apnea-hypopnea index (AHI). Methods: Sixty-seven individuals with obesity and OSA (AHI≥10 events/hour) underwent a sleep study and UA and abdominal magnetic resonance imaging (MRI) before and after a weight loss intervention (intensive lifestyle modification or bariatric surgery). Airway sizes and soft tissue, tongue fat, and abdominal fat volumes were quantified. Associations between weight loss and changes in these structures, and relationships to AHI changes, were examined. Measurements and Main Results: Weight loss was significantly associated with reductions in tongue fat, pterygoid and total lateral wall volumes. Reductions in tongue fat were strongly correlated with reductions in AHI (rho=0.62, p<0.0001); results remained after controlling for weight loss (rho=0.37, p=0.014). Mediation analyses indicated that reduction in tongue fat volume was the primary mediator of the relationship between weight loss and AHI improvement. Conclusions: Weight loss reduced volumes of several UA soft tissues in persons with obesity and OSA. Improved AHI with weight loss was mediated by reductions in tongue fat. New treatments that reduce tongue fat should be considered for patients with OSA.
urinary
d-mannose: a promising support for acute urinary tract infections in women. a pilot study
domenici l et al. 2016
europeanreview.org/article/11121
αklotho and stgfβr2 treatment counteract the osteoarthritic phenotype developed in a rat model
paloma martinez-redondo et al. 2020
doi.org/10.1007/s13238-019-00685-7
forgotten disease: illnesses transformed in chinese medicine
hilary smith 2018 to read next
hydrogen peroxide metabolism in health and disease
margreet vissers et al. 2018 to read next
the beautiful cure: harnessing your body’s natural defences
daniel davis 2018
rigor mortis: how sloppy science creates worthless cures, crushes hope, and wastes billions
richard harris 2017
beyond soap: the real truth about what you are doing to your skin and how to fix it for a beautiful, healthy glow
sandy skotnicki 2018
the secret life of fat: the science behind the body’s least understood organ and what it means for you
sylvia tara 2016
spark: the revolutionary new science of exercise and the brain
john j. ratey, eric hagerman 2008
aroused: the history of hormones and how they control just about everything
randi hutter epstein 2018
the secret language of anatomy: an illustrated guide to anatomical terms
cecilia brassett 2018
why we get sick: the new science of darwinian medicine
randolph m. nesse, george c. williams 1994
skin: an intimate journey across our surface
monty lyman 2019
the rules of contagion: why things spread—and why they stop
adam kucharski 2020
if tuck chin in, impossible to snore?
chin jutting, impossible not to?
rarely have coughing caused by own nasal mucus anymore, because have learned to swallow it before it reaches the lungs.
simple muscle tension experiment: if place flat of palm of hand against conductive bone near ear, or on ear, at night, while head rests gently on palm — pillow useful for placement. if tense either muscles in arm or cheek, can hear the subtle vibrations as muscles act in dynamic tension, lack of vibration when all muscles relaxed.
can't promise that you'll never have a sore throat ever again, but I can promise that this will reduce the severity of the symptoms. Sore throats commonly occur when we are ill, and the symptom is a pain in the throat area. The pain comes because the upper airway, just below the epiglottis, has become drier, and the lining of the airway thus becomes more sensitive. Causes of the dryness could include illness slowing mucus production deeper in the lungs, or very dry air in the house or outside. The temporary solution, however, is easy — you can often relieve a painful sore throat in minutes — all you need to do is to swallow. Swallowing, without actually needing to swallow any food or water, is sufficient to encourage local mucus movement and production right where it is needed most. You may want to experiment with swallowing at different intervals or in different patterns.
Swallowing eases the symptom of a sore throat by encouraging wetter mucus from the lungs to move faster towards the normal exit point of the oesophagus. Thus the dried-out area of the trachea or throat which is hurting is covered by mucus and soothed temporarily.
Seems like common sense once you know it but seems slightly esoteric before you understand why it works
For a longer–term solution, drink water often enough to replenish your internal supplies so you can produce enough mucus to suit the air or illness. The only downside is that you'll no longer be able to complain about having a sore throat when you are ill, so people won't believe your illness is as bad as theirs!
prevent dry–air nosebleeds
wait for a new thin film of wet mucus to form between the dried mucus and the nasal passage before attempting to blow the dried mucus out of the nose. if this means stifling sneezes for a little while, do so.
investigating the case of human nose shape and climate adaptation
arslan a. zaidi, brooke c. mattern, peter claes, brian mcevoy, cris hughes, mark d. shriver 2017
doi.org/10.1371/journal.pgen.1006616
The evolutionary reasons for variation in nose shape across human populations have been subject to continuing debate. An important function of the nose and nasal cavity is to condition inspired air before it reaches the lower respiratory tract. For this reason, it is thought the observed differences in nose shape among populations are not simply the result of genetic drift, but may be adaptations to climate. To address the question of whether local adaptation to climate is responsible for nose shape divergence across populations, we use Qst–Fst comparisons to show that nares width and alar base width are more differentiated across populations than expected under genetic drift alone. To test whether this differentiation is due to climate adaptation, we compared the spatial distribution of these variables with the global distribution of temperature, absolute humidity, and relative humidity. We find that width of the nares is correlated with temperature and absolute humidity, but not with relative humidity. We conclude that some aspects of nose shape may indeed have been driven by local adaptation to climate. However, we think that this is a simplified explanation of a very complex evolutionary history, which possibly also involved other non-neutral forces such as sexual selection.
Author summary
The study of human adaptation is essential to our understanding of disease etiology. Evolutionary investigations into why certain disease phenotypes such as sickle-cell anemia and lactose intolerance occur at different rates in different populations have led to a better understanding of the genetic and environmental risk factors involved. Similarly, research into the geographical distribution of skin pigmentation continues to yield important clues regarding risk of vitamin D deficiency and skin cancer. Here, we investigate whether variation in the shape of the external nose across populations has been driven by regional differences in climate. We find that variation in both nares width and alar base width appears to have experienced accelerated divergence across human populations. We also find that the geospatial distribution of nares width is correlated with temperature and absolute humidity, but not with relative humidity. Our results support the claim that local adaptation to climate may have had a role in the evolution of nose shape differences across human populations.
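The Qst–Fst logic above can be sketched numerically. A minimal sketch, assuming a textbook additive-diploid Qst estimator and a simple two-population Nei-style Fst; the variance and allele-frequency numbers are invented for illustration, and the paper's actual estimators differ in detail:

```python
# toy Qst-Fst comparison; all numbers illustrative, not from the paper
def qst(v_between, v_within):
    """Qst for an additive diploid trait: Vb / (Vb + 2*Vw)."""
    return v_between / (v_between + 2 * v_within)

def fst_two_pop(p1, p2):
    """Nei-style Fst for one biallelic locus in two equal-size populations."""
    p_bar = (p1 + p2) / 2
    h_total = 2 * p_bar * (1 - p_bar)          # expected heterozygosity, pooled
    h_sub = (2*p1*(1-p1) + 2*p2*(1-p2)) / 2    # mean within-population heterozygosity
    return (h_total - h_sub) / h_total

trait_qst = qst(v_between=4.0, v_within=6.0)   # e.g. nares-width variance components
neutral_fst = fst_two_pop(0.3, 0.5)            # a putatively neutral SNP
print(f"Qst = {trait_qst:.3f}, Fst = {neutral_fst:.3f}")  # Qst = 0.250, Fst = 0.042
```

A trait Qst well above the Fst of putatively neutral loci is the signature of divergent selection the authors report for nares and alar base width; the paper itself compares traits against genome-wide Fst distributions rather than a single locus.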
exosome swarms eliminate airway pathogens and provide passive epithelial immunoprotection through nitric oxide
angela l. nocera et al. 2018
doi.org/10.1016/j.jaci.2018.08.046
sneezing causing split lips
if you sneeze suddenly when your lips are very dry, such as after a long night in a dry room, you may split your lips. to prevent this, try to wet your lips before you sneeze.
nose blocked with mucus
first blow out as much as you can, then try to drain by tilting the head back and breathing in through the nose as much as you can. after a while it will clear. this works by allowing the nose to clear and dry out.
low ambient humidity impairs barrier function and innate resistance against influenza infection
eriko kudo et al. 2019
doi.org/10.1073/pnas.1902840116
while cold temperatures and low humidity are known to promote transmission of the flu virus, less is understood about the effect of decreased humidity on the immune system's defenses against flu infection.
The Yale research team, led by Akiko Iwasaki, the Waldemar Von Zedtwitz Professor of Immunobiology, explored the question using mice genetically modified to resist viral infection as humans do. The mice were all housed in chambers at the same temperature, but with either low or normal humidity. They were then exposed to the influenza A virus.
The researchers found that low humidity hindered the immune response of the animals in three ways. It prevented cilia, which are hair-like structures in airway cells, from removing viral particles and mucus. It also reduced the ability of airway cells to repair damage caused by the virus in the lungs. The third mechanism involved interferons, or signaling proteins released by virus-infected cells to alert neighboring cells to the viral threat. In the low-humidity environment, this innate immune defense system failed.
The study offers insight into why the flu is more prevalent when the air is dry. "It's well known that when humidity drops, a spike in flu incidence and mortality occurs. If our findings in mice hold up in humans, our study provides a possible mechanism underlying this seasonal nature of flu disease," said Iwasaki.
While the researchers emphasized that humidity is not the only factor in flu outbreaks, it is an important one that should be considered during the winter season. Increasing water vapor in the air with humidifiers at home, school, work, and even hospital environments is a potential strategy to reduce flu symptoms and speed recovery, they said.
abstract Influenza virus causes seasonal outbreaks in temperate regions, with an increase in disease and mortality in the winter months. Dry air combined with cold temperature is known to enable viral transmission. In this study, we asked whether humidity impacts the host response to influenza virus infections. Exposure of mice to low humidity conditions rendered them more susceptible to influenza disease. Mice housed in dry air had impaired mucociliary clearance, innate antiviral defense, and tissue repair function. Moreover, mice exposed to dry air were more susceptible to disease mediated by inflammasome caspases. Our study provides mechanistic insights for the seasonality of the influenza virus epidemics, whereby inhalation of dry air compromises the host’s ability to restrict influenza virus infection.
In the temperate regions, seasonal influenza virus outbreaks correlate closely with decreases in humidity. While low ambient humidity is known to enhance viral transmission, its impact on host response to influenza virus infection and disease outcome remains unclear. Here, we showed that housing Mx1 congenic mice in low relative humidity makes mice more susceptible to severe disease following respiratory challenge with influenza A virus. We find that inhalation of dry air impairs mucociliary clearance, innate antiviral defense, and tissue repair. Moreover, disease exacerbated by low relative humidity was ameliorated in caspase-1/11–deficient Mx1 mice, independent of viral burden. Single-cell RNA sequencing revealed that induction of IFN-stimulated genes in response to viral infection was diminished in multiple cell types in the lung of mice housed in low humidity condition. These results indicate that exposure to dry air impairs host defense against influenza infection, reduces tissue repair, and inflicts caspase-dependent disease pathology.
influenza virus transmission is dependent on relative humidity and temperature
anice c. lowen et al. 2007
doi.org/10.1371/journal.ppat.0030151
seasonality of respiratory viral infections
miyu moriyama et al. 2020
doi.org/10.1146/annurev-virology-012420-022445
How much spring and summer affect the COVID-19 pandemic may depend not only on the effectiveness of social distancing measures, but also on the environment inside our buildings, according to a review of Yale scientists of their own work and that of colleagues on how respiratory viruses are transmitted.
The cold, dry air of winter clearly helps SARS-CoV-2 — the virus that causes COVID-19 — spread among people, Yale research has shown. But as humidity increases during spring and summer, the risk of transmission of the virus through airborne particles decreases both outside and indoors in places such as offices.
While viruses can still be transmitted through direct contact or through contaminated surfaces as humidity rises, the researchers suggest that, in addition to social distancing and handwashing, the seasonal rise in indoor relative humidity could be an ally in slowing rates of viral transmission.
The review was published online the week of March 23 in the Annual Review of Virology.
“Ninety percent of our lives in the developed world are spent indoors in close proximity to each other,” said Yale immunobiologist and senior author Akiko Iwasaki. “What has not been talked about is the relationship of temperature and humidity in the air indoors and outdoors and aerial transmission of the virus.”
Iwasaki is the Waldemar Von Zedtwitz Professor of Immunobiology and professor of molecular, cellular, and developmental biology at Yale, and an investigator for the Howard Hughes Medical Institute.
Iwasaki said the seasonal nature of respiratory illnesses have been chronicled since the times of the ancient Greeks, who noted such illnesses rose in winter and fell during spring and summer. Modern science has been able to identify cold, dry air as a factor in spread of viruses such as the novel coronavirus causing COVID-19. Research by Iwasaki’s lab and others explains why.
Winter’s cold, dry air makes such viruses a triple threat, Iwasaki said: When cold outdoor air with little moisture is heated indoors, the air’s relative humidity drops to about 20%. This comparatively moisture-free air provides a clear path for airborne particles of viruses such as SARS-CoV-2.
Warm, dry air also dampens the ability of cilia, the hair-like projections on cells lining airways, to expel viral particles. And lastly, the immune system’s ability to respond to pathogens is suppressed in drier environments, Iwasaki has found.
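The roughly 20% figure above is just the physics of heating air without adding moisture, and can be checked with the Magnus approximation for saturation vapor pressure. A minimal sketch; the 0 °C / 80% RH starting point is an illustrative assumption, not a number from the review:

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Magnus approximation (hPa), reasonable for roughly -45..60 C."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def indoor_rh(outdoor_t, outdoor_rh, indoor_t):
    """Relative humidity after heating outdoor air to indoor temperature,
    assuming no moisture is added or removed along the way."""
    vapor_pressure = outdoor_rh * saturation_vapor_pressure(outdoor_t)
    return vapor_pressure / saturation_vapor_pressure(indoor_t)

# freezing outdoor air at 80% RH, reheated to a 20 C room:
print(round(indoor_rh(0, 0.80, 20) * 100))  # → 21 (% RH)
```

The same absolute amount of water vapor that nearly saturates cold air sits far below saturation once the air is warmed, which is why heated winter interiors land near the 20% figure Iwasaki cites.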
Iwasaki was interested in the effects of relative humidity. During the winter, relative humidity remains low in most indoor environments; the cold, dry outside air is simply reheated and circulated throughout homes and offices.
Iwasaki’s review cites experiments that show rodents infected with respiratory viruses can easily transmit viral particles through the air to non-infected neighbors in low-humidity environments.
“That’s why I recommend humidifiers during the winter in buildings,” Iwasaki said.
However, in areas of high relative humidity such as the tropics, airborne infectious droplets fall onto surfaces indoors and can survive for extended periods, she said.
“Many homes and buildings are poorly ventilated and people often live in close proximity, and in these cases, the benefits of higher humidity are mitigated,” Iwasaki said.
There is a sweet spot in relative humidity for indoor environments, the review found. Mice in environments of between 40% and 60% relative humidity show substantially less ability to transmit viruses to non-infected mice than those in environments of low or high relative humidity. Mice kept at 50% relative humidity were also able to clear an inhaled virus and mount robust immune responses, she found.
Iwasaki stresses that these studies only apply to aerosol transmission: the virus still can be shared at any time of year between people in close proximity and through contact with surfaces containing sufficient amounts of virus. That is why people living in warm countries and people working close to each other are still susceptible to infection, she said.
“It doesn’t matter if you live in Singapore, India, or the Arctic, you still need to wash your hands and practice social distancing,” Iwasaki said.
abstract The seasonal cycle of respiratory viral diseases has been widely recognized for thousands of years, as annual epidemics of the common cold and influenza disease hit the human population like clockwork in the winter season in temperate regions. Moreover, epidemics caused by viruses such as severe acute respiratory syndrome coronavirus (SARS-CoV) and the newly emerging SARS-CoV-2 occur during the winter months. The mechanisms underlying the seasonal nature of respiratory viral infections have been examined and debated for many years. The two major contributing factors are the changes in environmental parameters and human behavior. Studies have revealed the effect of temperature and humidity on respiratory virus stability and transmission rates. More recent research highlights the importance of the environmental factors, especially temperature and humidity, in modulating host intrinsic, innate, and adaptive immune responses to viral infections in the respiratory tract. Here we review evidence of how outdoor and indoor climates are linked to the seasonality of viral respiratory infections. We further discuss determinants of host response in the seasonality of respiratory viruses by highlighting recent studies in the field.
an integrative review of the limited evidence on international travel bans as an emerging infectious disease disaster control measure
nicole a. errett et al. 2020
doi.org/10.5055/jem.2020.0446
while travel bans are frequently used to stop the spread of an emerging infectious disease, a new University of Washington and Johns Hopkins University study of published research found that the effectiveness of travel bans is mostly unknown.
However, said lead author Nicole Errett, a lecturer in the UW Department of Environmental & Occupational Health Sciences in the School of Public Health, that’s largely due to the fact that very little research into the effectiveness of travel bans exists.
“Some of the evidence suggests that a travel ban may delay the arrival of an infectious disease in a country by days or weeks. However, there is very little evidence to suggest that a travel ban eliminates the risk of the disease crossing borders in the long term,” said Errett, co-director of the ColLABorative on Extreme Event Resilience, a research lab focused on addressing real-world issues relevant to community resilience.
The researchers combed through thousands of published articles in an effort to identify those that directly addressed travel bans used to reduce the geographic impact of the Ebola virus, SARS (Severe Acute Respiratory Syndrome), MERS (Middle East Respiratory Syndrome) and the Zika virus. They did not include studies of influenza viruses, for which travel bans have already been shown to be ineffective in the long term.
In the end, the researchers were able to identify just six studies that fit their criteria. Those six were based on models or simulations, not data from actual bans after they were implemented, to assess the effectiveness of travel bans in controlling outbreaks. Consequently, to improve research in this area, the study authors recommend that research questions, partnerships and study protocols be established ahead of the next outbreak so empirical data can be collected and assessed quickly.
“Travel bans are one of several legal options that governments have drawn on to mitigate a pandemic,” said co-author Lainie Rutkow, a professor of health policy and management at Johns Hopkins Bloomberg School of Public Health. “As coronavirus spreads, our study raises the importance of understanding the effectiveness of legal and policy responses intended to protect and promote the public’s health.”
“When assessing the need for, and validity of, a travel ban, given the limited evidence, it’s important to ask if it is the least restrictive measure that still protects the public’s health, and even if it is, we should be asking that question repeatedly, and often.”
abstract In our increasingly interconnected world, the potential for emerging infectious diseases (EIDs) to spread globally is of paramount concern. Travel bans—herein defined as the complete restriction of travel from at least one geographic region to at least one other international geographic region—are a potential policy solution to control the global spread of disease. The social, economic, and health-related consequences of travel bans, as well as the available evidence on the effectiveness of travel restrictions in preventing the global spread of influenza, have been previously described. However, the effectiveness of travel bans in reducing the spread of noninfluenza EIDs, characterized by different rates and modes of transmission, is less well understood. This study employs an integrative review approach to summarize the minimal evidence on effectiveness of travel bans to decrease the spread of severe acute respiratory syndrome (SARS), Middle Eastern respiratory syndrome (MERS), Ebola virus disease (EVD), and Zika virus disease (ZVD). We describe and qualify the evidence presented in six modeling studies that assess the effectiveness of travel bans in controlling these noninfluenza EID events. We conclude that there is an urgent need for additional research to inform policy decisions on the use of travel bans and other control measures to control noninfluenza EIDs in advance of the next outbreak.
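The "delay, not prevent" pattern in the reviewed models lends itself to a back-of-the-envelope sketch. This toy importation model is my illustration, not one of the six reviewed studies: the source epidemic grows exponentially, infected travelers arrive as a hazard proportional to prevalence, and a ban scales that hazard down, which postpones but never cancels the expected first importation. All parameters are invented:

```python
import math

def arrival_time(r, imports_per_day, travel_factor=1.0):
    """Days until one expected infected traveler has arrived, for a source
    epidemic I(t) = I0 * exp(r*t) and an initial importation hazard of
    imports_per_day, scaled by travel_factor (1.0 = no ban, 0.1 = 90% cut).
    Solves (lam/r) * (exp(r*T) - 1) = 1 for T."""
    lam = imports_per_day * travel_factor
    return math.log(1 + r / lam) / r

baseline = arrival_time(r=0.2, imports_per_day=0.01)                     # no ban
banned = arrival_time(r=0.2, imports_per_day=0.01, travel_factor=0.1)    # 90% cut
print(f"delay gained by 90% ban: {banned - baseline:.1f} days")  # → 11.3 days
```

With these invented numbers a 90% travel reduction buys about 11 days, consistent with the "days or weeks" delay the reviewed models report; because the delay grows only logarithmically as the travel factor shrinks, even a near-total ban fails to keep the disease out indefinitely.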
factors causing dry airways
lipton pluripotent shift
blinking eyes removes contaminants
tiredness, irritability and dehydration — I wonder whether some portion of tantrums in children and adults is from dehydration…
dry airways, bed type, habitual sleeping position (high pillows, face up) increase snoring and sleep apnea
apply plasters (band-aids) in a chiral form to suit the appendage: for example, on a finger, do not apply the plaster perpendicular or parallel to the length of the finger, but rather in a spiral. this shape allows the appendage to flex in its normal fashion without overflexing the plaster