Photo credit: UT Southwestern Medical Center. Above: fluorescent images of genetically identical yeast cells, marked for some of the biomarkers that UT Southwestern researchers discovered help predict cell fate. Yellow marks the nucleus, blue and green mark nucleus-vacuole junction (NVJ) markers, and red marks the yeast vacuole.

Biomarkers provide clues to the paths cells take after starvation, in findings that may lead to improved drug treatments.

Newswise — DALLAS – Jan. 11, 2021 – A set of biomarkers not traditionally associated with cell fate can accurately predict how genetically identical cells behave differently under stress, according to a UT Southwestern study. The findings, published by Cell Reports as a Dec. 1 cover story, could eventually lead to more predictable responses to pharmaceutical treatments.

Groups of the same types of cells exposed to the same stimuli often display different responses. Some of these responses have been linked to slight genetic differences between individual cells. However, even genetically identical cells can diverge in behavior. One example can be found in budding yeast, or yeast that are actively dividing. When these microorganisms are deprived of glucose – the sugar molecules they use for energy – all cells stop dividing. However, when this nutrient becomes available again, some cells start dividing once more while others remain alive but no longer divide, even in batches of yeast that are genetic clones.

What drives the differences in behavior between these re-dividing “quiescent” cells and never-dividing “senescent” cells has been a mystery, say study leaders N. Ezgi Wood, Ph.D., a postdoctoral fellow at UTSW, and Mike Henne, Ph.D., assistant professor of cell biology and biophysics at UTSW. Previous studies of behavioral differences in genetically identical cells have focused on genes that decide cell fate.
However, Wood, Henne, and their colleagues took a different tack: They looked at the behavior of other biomarkers associated with basic cell maintenance, such as cell cycling, stress response, intracellular communication, and nutrient signaling. The researchers note that the role each of these factors plays in deciding cell fate is not yet clear. Learning more about the factors that prompt cells to act differently could eventually steer researchers in new directions. For example, the knowledge could be useful in helping cells uniformly respond to cancer chemotherapies or antibiotics, areas in which cells often take divergent paths.

“How two identical cells side by side take different paths is a very basic biological question – we see it from bacteria to mammalian cells,” Wood says. “Our results show that factors not traditionally associated with cell fate can, in fact, play an important role in this process, and that gets us closer to answering the question of why this phenomenon takes place and how we might control it.”

To explore these questions, researchers genetically modified yeast cells so that five different protein markers associated with these maintenance tasks glowed with different colors inside the cell when they were present. They then set up an experiment in which these cells lived in a microfluidics chamber that was continuously flushed with liquid media. For two hours, this media was rich with the nutrients that these cells needed to survive and multiply, including glucose. Then, for the next 10 hours, the researchers cut off the glucose supply, starving the cells. At the end of this period, they reintroduced glucose, allowing the cells to recover. During this 16-hour cycle, a camera continuously monitored individual cells, looking for differences between those that became quiescent or senescent when glucose was available again.
When they reviewed the camera footage, researchers quickly saw that although the cells had been growing asynchronously – at different points in their cell cycles – starvation stopped the cell cycle. A closer look showed that a protein inhibitor of the cell cycle known as Whi5 tended to collect in the nuclei of quiescent cells during starvation, while Whi5 in senescent cells disappeared altogether. Similarly, the two populations exhibited differences in the proteins Msn2 and Rtg1, which are associated with stress response. Although these proteins collected in the nuclei of all the cells when they were starved, they had a sustained presence in the nuclei of senescent cells even after glucose returned, yet largely exited the nuclei of quiescent cells when starvation ended.

The researchers found another useful marker for separating these two populations in the nucleus-vacuole junction (NVJ), an interface that connects the nucleus to the vacuole, the small digestive organelle that cells use to sequester waste products. While quiescent cells tended to enlarge their NVJs during starvation, senescent cells did not.

Although each of these findings gave clues to which path cells would take after starvation started, none showed any predictive power before starvation took place. But when the researchers examined Rim15, a protein that plays a key role in nutrient signaling, they found that cells with elevated Rim15 before starvation tended to become quiescent, while those with lower concentrations of this protein were more likely to become senescent.

On its own, no single factor served as an accurate predictor of cell fate. But when Wood, Henne, and their colleagues performed a statistical analysis incorporating all of them, they were able to predict which cells became quiescent and which became senescent with an accuracy of nearly 90 percent before they reintroduced glucose.
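The release does not specify the study's actual statistical model, but the idea of pooling several weak markers into one prediction can be sketched with a simple logistic score. All marker names, weights, and readings below are hypothetical, invented for illustration:

```python
import math

# Hypothetical, illustrative weights for the five markers described above
# (Whi5 nuclear localization, sustained Msn2 and Rtg1 nuclear residence,
# NVJ enlargement, and pre-starvation Rim15 level). The real study fit
# its model from imaging data; these numbers are made up.
WEIGHTS = {
    "whi5_nuclear": 1.2,
    "msn2_sustained": -1.5,
    "rtg1_sustained": -1.1,
    "nvj_growth": 1.4,
    "rim15_prestarvation": 1.8,
}
BIAS = -0.3

def p_quiescent(markers: dict) -> float:
    """Logistic combination of normalized marker readouts (0..1 each).

    Returns the modeled probability that a cell re-enters the cell
    cycle (quiescent) rather than staying arrested (senescent).
    """
    z = BIAS + sum(WEIGHTS[k] * markers[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# A cell with high Rim15 before starvation and an enlarged NVJ:
cell = {
    "whi5_nuclear": 0.8,
    "msn2_sustained": 0.1,
    "rtg1_sustained": 0.2,
    "nvj_growth": 0.9,
    "rim15_prestarvation": 0.9,
}
print("quiescent" if p_quiescent(cell) >= 0.5 else "senescent")  # prints "quiescent"
```

In practice the weights would be fit to the imaging data; the point is that a combination of markers can classify cells that no single marker separates on its own.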
In fact, they say, cells seem to reach a “decision point” about four hours into starvation, after which it’s unlikely that they’ll change direction.

Other UTSW researchers who contributed to this study include Piya Kositangool, Hanaa Hariri, and Ashley J. Marchand. Henne is a W.W. Caruth, Jr. Scholar in Biomedical Research and a member of the Harold C. Simmons Comprehensive Cancer Center. This research was funded by grants from the Cancer Prevention and Research Institute of Texas (RR150058), The Welch Foundation (I-1873), the National Institutes of Health NIGMS (GM119768), the Ara Parseghian Fund (APMRF2020), and the UTSW Endowed Scholars Program.

About UT Southwestern Medical Center
UT Southwestern, one of the premier academic medical centers in the nation, integrates pioneering biomedical research with exceptional clinical care and education. The institution’s faculty has received six Nobel Prizes and includes 23 members of the National Academy of Sciences, 17 members of the National Academy of Medicine, and 13 Howard Hughes Medical Institute Investigators. The full-time faculty of more than 2,500 is responsible for groundbreaking medical advances and is committed to translating science-driven research quickly to new clinical treatments. UT Southwestern physicians provide care in about 80 specialties to more than 105,000 hospitalized patients, handle nearly 370,000 emergency room cases, and oversee approximately 3 million outpatient visits a year.
Newswise — In a study examining the Mediterranean diet in relation to prostate cancer progression in men on active surveillance, researchers from The University of Texas MD Anderson Cancer Center found that men with localized prostate cancer who reported a baseline dietary pattern that more closely follows the key principles of a Mediterranean-style diet fared better over the course of their disease.

“Men with prostate cancer are motivated to find a way to impact the advancement of their disease and improve their quality of life,” said Justin Gregg, M.D., assistant professor of Urology and lead author of the study, published today in Cancer. “A Mediterranean diet is non-invasive, good for overall health and, as shown by this study, has the potential to affect the progression of their cancer.”

After adjusting for factors known to increase the risk of cancer getting worse over time, such as age, prostate-specific antigen (PSA) and tumor volume, men with a diet that contained more fruits, vegetables, legumes, cereals and fish had a reduced risk of their prostate cancer growing or advancing to a point where many would consider active treatment. The researchers also examined the effect of diabetes and statin use and found a similar risk reduction in these patient groups.

The study, in which most participants were white, also found that the effect of a Mediterranean diet was more pronounced in African American participants and others who self-identified as non-white. These findings are significant because the rate of prostate cancer diagnosis is more than 50% higher in African American men, who also have a higher risk of prostate cancer death and disease progression.

“The Mediterranean diet consistently has been linked to lower risk of cancer, cardiovascular disease and mortality.
This study in men with early-stage prostate cancer gets us another step closer to providing evidence-based dietary recommendations to optimize outcomes in cancer patients, who, along with their families, have many questions in this area,” said Carrie Daniel-MacDougall, Ph.D., associate professor of Epidemiology and senior author of the study.

After skin cancer, prostate cancer is the most common cancer in men in the United States. Since most cases are low-risk, localized to the prostate and have favorable outcomes, many men do not need immediate treatment and opt for active surveillance by their doctor. Treatments for prostate cancer can cause changes in quality of life and declines in urinary and sexual function; therefore, there is interest in finding modifiable factors for men managed by active surveillance.

The study followed 410 men on an active surveillance protocol with Gleason grade group 1 or 2 localized prostate cancer. All study participants underwent a confirmatory biopsy at the beginning of the study and were evaluated every six months through clinical exam and laboratory studies of serum PSA and testosterone. Trial participants were 82.9% Caucasian, 8.1% Black and 9% other or unknown. The median age was 64, 15% of the men were diabetic and 44% used statins.

The men completed a 170-item baseline food frequency questionnaire, and a Mediterranean diet score was calculated for each participant across nine energy-adjusted food groups. The participants were then divided into three groups of high, medium and low adherence to the diet. After adjustments for age and clinical characteristics, researchers saw a significant association between a high baseline diet score and a lower risk of cancer grade progression. For every one-point increase in the Mediterranean diet score, researchers observed a more than 10% lower risk of progression. After a median follow-up of 36 months, 76 men saw their cancer progress.
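The release doesn't spell out the scoring rule, but Mediterranean diet scores of this kind are commonly computed Trichopoulou-style: one point per food group for intake on the "healthy" side of the cohort median. A minimal sketch under that assumption, with illustrative group names (the paper's exact nine groups may differ):

```python
from statistics import median

# Common Trichopoulou-style scheme (the study's exact scoring may differ):
# one point for intake at/above the cohort median in beneficial groups,
# one point for intake below the median in detrimental groups.
BENEFICIAL = ["fruits", "vegetables", "legumes", "cereals", "fish", "mufa_sfa_ratio"]
DETRIMENTAL = ["meat", "dairy", "alcohol_excess"]  # grouping here is illustrative

def diet_scores(cohort: list) -> list:
    """Score each participant 0-9 against cohort medians."""
    med = {g: median(p[g] for p in cohort) for g in BENEFICIAL + DETRIMENTAL}
    scores = []
    for p in cohort:
        s = sum(p[g] >= med[g] for g in BENEFICIAL)   # healthy: at/above median
        s += sum(p[g] < med[g] for g in DETRIMENTAL)  # healthy: below median
        scores.append(s)
    return scores
```

Participants can then be split into tertiles of low, medium and high adherence, as the study describes.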
The study was limited by the low number of events in these men with mostly low-risk disease monitored at MD Anderson. Future research is needed to see if the same effects are seen in larger and more diverse patient groups and in men with higher-risk prostate cancer.

“Our findings suggest that consistently following a diet rich in plant foods, fish and a healthy balance of monounsaturated fats may be beneficial for men diagnosed with early-stage prostate cancer,” Gregg said. “We are hopeful that these results, paired with additional research and future validation, will encourage patients to adopt a healthy lifestyle.”

This research was supported by the Department of Defense Prostate Cancer Research Program Early Career Award (W81XWH-18-1-0193), a National Cancer Institute Cancer Center Support Grant to MD Anderson (CCSG 5P30 CA016672-37) and a Research Training Award for Cancer Prevention Post-Graduate Training Program in Integrative Epidemiology from the Cancer Prevention & Research Institute of Texas (RP160097). A full list of co-authors and disclosures can be found with the full paper.
Newswise — Wildfire smoke contains microbes, a fact that’s often ignored, but one that may have important health repercussions. In a perspective essay published in Science, Leda Kobziar and George Thompson call the attention of the scientific community to the health impacts of wildfire smoke’s microbial content.

Smoky skies caused by wildland fires are becoming seasonal norms, especially in some parts of the United States and Australia. In 2020, raging wildfires in the Western U.S. set new records and led to extremely unhealthy or hazardous air quality levels for many weeks in a row. It’s well-documented that exposure to wildfire smoke can damage the heart and lungs. Respiratory allergic and inflammatory diseases, including asthma and bronchitis, are also worsened by smoke exposure.

“The health impact of inhaling wildfire smoke increases dramatically during high-emissions wildfires and with long exposure,” said Kobziar, associate professor of Wildland Fire Science at the University of Idaho. “Yet, the risk of infection to the respiratory tract after this exposure is frequently overlooked.”

What role do microbes in wildfire smoke play in the spread of disease? Wildland fire is a source of bioaerosol: airborne particles made of fungal and bacterial cells and their metabolic byproducts. Once suspended in the air, particles smaller than 5 μm can travel hundreds or even thousands of miles. Their movement depends on the fire behavior and the atmospheric conditions. Eventually, they are deposited or inhaled.

Bacteria and fungi can be transported in these wildland fire smoke emissions. While microbial concentration in smoke is higher near the fire source, these microbes may be active agents spreading infection. For example, Coccidioides – a fungus that becomes airborne when soils are disturbed – causes coccidioidomycosis, also known as Valley fever, a potentially serious infection.
“We don’t know how far and which microbes are carried in smoke,” said Thompson, associate professor of Clinical Medicine at UC Davis. “Some microbes in the soil appear to be tolerant of, and even thrive under, high temperatures following wildfires.” As Kobziar explained, “At the scale of a microbe, fire behavior research has shown that heat flux is highly variable, so it may be that many microbes aren’t even subjected to the high temperatures for very long. They may also be protected in small clusters of particulate matter.”

Kobziar and Thompson proposed a multidisciplinary approach to understanding the nature of the relationship between microbes, wildfire smoke and health. The complexity of the phenomenon calls for the expertise of scientists from different fields such as fire ecology, environmental microbiology, epidemiology, atmospheric sciences, public health and infectious disease. “With longer wildfire seasons and higher severity trends, there is an urgency to work together in studying the behavior of the microbes carried by the smoke and their impact on human health,” Thompson said.

Article: Kobziar & Thompson (2020). Wildfire smoke: A potential infectious agent. Science. DOI: 10.1126/science.abe8116
Newswise — Temperature data collected by wearable devices worn on the finger can be reliably used to detect the onset of fevers, a leading symptom of both COVID-19 and the flu, according to a team of researchers from the University of California San Diego, UC San Francisco and MIT Lincoln Laboratory.

Researchers published their results in a paper titled “Feasibility of continuous fever monitoring using wearable devices” in the Dec. 14 issue of the journal Scientific Reports. They emphasize that the study is a proof-of-concept effort with data from only 50 participants reporting COVID-19.

The Scientific Reports paper is the first published result from TemPredict, a study of more than 65,000 people wearing a ring manufactured by Finnish startup Oura, which records temperature, heart rate, respiratory rate and levels of activity. The goal of the study is to develop an algorithm that can predict the onset of symptoms such as fever, cough and fatigue, which are characteristic of COVID-19. Researchers say they hope to reach that goal by the end of the year. They also hope the algorithms will allow public health officials to act faster to contain the virus’ spread.

“This isn’t just a science problem, it’s a social problem,” said Benjamin Smarr, the paper’s corresponding author and a professor in the Department of Bioengineering and the Halicioglu Data Science Institute at UC San Diego. “With wearable devices that can measure temperature, we can begin to envision a public COVID early alert system.” But users from diverse backgrounds would need to feel safe sharing their data for such efforts to really work, Smarr added. The data is stripped of all personal information, including location, and each subject is known by a random identifying number.

Smarr is TemPredict’s data analytics lead. Ashley Mason, a professor in the Department of Psychiatry and the Osher Center for Integrative Medicine at UC San Francisco, is the principal investigator of the study.
“If wearables allow us to detect COVID-19 early, people can begin physical isolation practices and obtain testing so as to reduce the spread of the virus,” Mason said. “In this way, an ounce of prevention may be worth even more than a pound of cure.”

Wearables such as the Oura ring can collect temperature data continuously throughout the day and night, allowing researchers to measure people’s true temperature baselines and identify fever peaks more accurately. “Temperature varies not only from person to person but also for the same person at different times of the day,” Smarr said. The study, he explains, highlights the importance of collecting data continuously over long periods of time. Incidentally, the lack of continuous data is also why temperature spot checks are not effective for detecting COVID-19. These spot checks are the equivalent of catching a syllable per minute in a conversation, rather than whole sentences, Smarr said.

In the Scientific Reports paper, Smarr and colleagues noticed that fever onset often happened before subjects reported symptoms, and even in those who never reported other symptoms. “It supports the hypothesis that some fever-like events may go unreported or unnoticed without being truly asymptomatic,” the researchers write. “Wearables therefore may contribute to identifying rates of asymptomatic [illness] as opposed to unreported illness, [which is] of special importance in the COVID-19 pandemic.”

The 50 subjects in the study all owned Oura rings and had had COVID-19 before joining TemPredict. They provided symptom summaries for their illnesses and gave researchers access to the data their Oura rings had collected during the period when they were sick. The signal for fever onset was not subtle, Smarr said.
“The chart tracking people who had a fever looked like it was on fire.”

The data collected as part of the subsequent TemPredict study included 65,000 subjects, and these data will be stored at the San Diego Supercomputer Center at UC San Diego, where a team led by Ilkay Altintas, the center’s chief data science officer, is building a portal to enable other researchers to access these data for other analyses. “The data collected has great potential to be linked with other datasets, allowing individual- and societal-scale models to be combined to further understand the disease,” Altintas said. “The easier we can make it to share the data and optimize its use through digital technologies, the quicker other researchers will make use of it in their studies.”

Researchers also are keeping up efforts to recruit a diverse pool of subjects that reflects the U.S. population. “We need to make sure that our algorithms work for everyone,” Smarr said. In the future, researchers plan to expand their early detection methods to other infectious diseases, such as the flu.

Smarr has worked as a consultant with Oura within the last 12 months and received compensation, although not during this research project.

Feasibility of continuous fever monitoring using wearable devices
Benjamin Smarr, UC San Diego Department of Bioengineering and Halicioglu Data Science Institute
Kirstin Aschbacher, UC San Francisco and Oura
Sarah M. Fisher, Anoushka Chowdhary, Kerena Puldon, Adam Rao, Frederick Hecht and Ashley E. Mason, UCSF
Stephan Dilchert, City University of New York and preValio LLC, Minneapolis

Photo Credit: Oura. While not an FDA-registered healthcare device, the Oura ring monitors a range of signals, including continuous temperature, heart rate, respiration rate and activity. Initial analysis suggests that a destabilization of temperature happens a couple of days before coronavirus symptoms manifest. The Oura ring detects this pattern.
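TemPredict's algorithm is still in development and is not described in this article, but the baseline idea Smarr describes, comparing each reading against the wearer's own history rather than a one-size-fits-all cutoff, can be sketched as follows. The window sizes and deviation threshold are made-up values:

```python
from statistics import mean, stdev

# Illustrative fever-onset flag from continuous finger-temperature readings.
# Thresholds and windows are hypothetical; the TemPredict algorithm itself
# is not reproduced here.
def fever_onset(temps: list, baseline_days: int = 7,
                readings_per_day: int = 24, z_thresh: float = 3.0):
    """Return the index of the first reading that deviates strongly
    from the wearer's own baseline, or None if none does."""
    n = baseline_days * readings_per_day
    if len(temps) <= n:
        return None  # not enough history to establish a baseline
    base = temps[:n]
    mu, sigma = mean(base), stdev(base)
    for i in range(n, len(temps)):
        if (temps[i] - mu) / sigma > z_thresh:
            return i
    return None
```

A spot check, by contrast, compares a single reading against a population threshold and misses deviations that are large for the individual but still below that cutoff.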
Newswise — DETROIT – Henry Ford Health System is the first in the country to perform a procedure using the CG-100™ intraluminal device, which is temporarily inserted into the gastrointestinal tract and designed to reduce diverting stoma rates, and the need for an ostomy bag, in patients undergoing gastrointestinal resection procedures due to colorectal cancer treatment.

When a patient has a cancerous part of their intestine surgically removed, the intestinal tract must be reconnected. While the surgically reconnected intestinal tract is healing, patients typically require a stoma, also known as an ostomy, which is a surgical opening on their abdomen that connects to their digestive tract. The stoma diverts digestive waste into an ostomy bag outside the abdomen and keeps it away from the site of reconnection. The ostomy must be reversed when it is no longer needed, which requires additional surgery and recovery time for the patient.

Created by Colospan Ltd., the CG-100 device is a silicone sheath that is introduced into the intestinal tract through the rectum, no surgery required, and covers the site where the intestinal tract has been reconnected. This sheath aims to prevent or reduce the contact of fecal material with the site of reconnection, avoiding the need for a stoma. After 10 days, when the risk for leakage is reduced, the sheath is removed without any additional surgery.

“One of the most serious complications that can happen after part of the gastrointestinal tract is surgically removed is a leak at the site of the resection,” said Craig Reickert, M.D., division head of Colon and Rectal Surgery at Henry Ford Cancer Institute. “While an ostomy procedure can reduce the risk of this type of leak, it is invasive and can be challenging for the patient to live with.
One of the greatest potential benefits of this device is that it not only reduces the need for a stoma, it also does not require additional surgery to implant or remove.”

The first CG-100 procedure, which was performed at Henry Ford Hospital by colon and rectal surgeon Surya Nalamati, M.D., is part of a Food and Drug Administration Investigational Device Exemption clinical trial that is comparing the CG-100 device to a diverting stoma, the current standard of care treatment for colorectal surgery. Henry Ford is the only site in Michigan, and one of just 12 sites nationwide, currently enrolling colorectal cancer patients in this clinical trial.

The CG-100 clinical trial is multi-center and randomized, so patients who meet study requirements and agree to enter the study are randomized either to be treated with the CG-100 intraluminal bypass device or to receive a diverting stoma. The patient’s care team will continue to follow up with them for up to 39 weeks after surgery.

According to the Centers for Disease Control and Prevention, in 2017 – the latest year for which incidence data are available – 141,425 new cases of colorectal cancer were reported in the United States, and 52,547 people in the U.S. died of this cancer. While an ostomy procedure can be lifesaving for colorectal cancer patients, it also carries risks of complications, such as incisional hernia, surgical site infection and anastomotic stenosis.

To learn more about colorectal cancer treatment at Henry Ford Cancer Institute or to request an appointment with a colorectal cancer specialist, visit henryford.com/services/colon-rectal-cancer.

Photo Credit: Colospan Ltd.
A novel form of an Alzheimer’s protein found in the fluid that surrounds the brain and spinal cord indicates what stage of the disease a person is in, and tracks with tangles of tau protein in the brain, according to a study from researchers at Washington University School of Medicine in St. Louis. Tau tangles are thought to be toxic to neurons, and their spread through the brain foretells the death of brain tissue and cognitive decline. Tangles appear as the early, asymptomatic stage of Alzheimer’s develops into the symptomatic stage. The discovery of so-called microtubule binding region tau (MTBR tau) in the cerebrospinal fluid could lead to a way to diagnose people in the earliest stages of Alzheimer’s disease, before they have symptoms or when their symptoms are still mild and easily misdiagnosed. It also could accelerate efforts to find treatments for the devastating disease, by providing a relatively simple way to gauge whether an experimental treatment slows or stops the spread of toxic tangles. The study is published Dec. 7 in the journal Brain. “This MTBR tau fluid biomarker measures tau that makes up tangles and can confirm the stage of Alzheimer’s disease by indicating how much tau pathology is in the brains of Alzheimer’s disease patients,” said senior author Randall J. Bateman, MD, the Charles F. and Joanne Knight Distinguished Professor of Neurology. Bateman treats patients with Alzheimer’s disease on the Washington University Medical Campus. “If we can translate this into the clinic, we’d have a way of knowing whether a person’s symptoms are due to tau pathology in Alzheimer’s disease and where they are in the disease course, without needing to do a brain scan. As a physician, this information is invaluable in informing patient care, and in the future, to guide treatment decisions.” Alzheimer’s begins when a brain protein called amyloid starts forming plaques in the brain. 
During this amyloid stage, which can last two decades or more, people show no signs of cognitive decline. However, soon after tangles of tau begin to spread in the neurons, people start exhibiting confusion and memory loss, and brain scans show increasing atrophy of brain tissue. Tau tangles can be detected by positron emission tomography (PET) brain scans, but brain scans are time-consuming, expensive and not available everywhere. Bateman and colleagues are developing diagnostic blood tests for Alzheimer’s disease based on amyloid or a different form of tau, but neither test can pin down the amount of tau tangles across the stages of disease.

MTBR tau is an insoluble piece of the tau protein, and the primary component of tau tangles. Bateman and first author Kanta Horie, PhD, a visiting scientist in Bateman’s lab, realized that specific MTBR tau species were enriched in the brains of people with Alzheimer’s disease, and that measuring levels of these species in the cerebrospinal fluid that bathes the brain might be a way to gauge how broadly the toxic tangles have spread through the brain. Previous researchers using antibodies against tau had failed to detect MTBR tau in the cerebrospinal fluid. But Horie and colleagues developed a new method based on using chemicals to purify tau out of a solution, followed by mass spectrometry.

Using this technique, Horie, Bateman and colleagues analyzed cerebrospinal fluid from 100 people in their 70s. Thirty had no cognitive impairment and no signs of Alzheimer’s; 58 had amyloid plaques with no cognitive symptoms, or with mild or moderate Alzheimer’s dementia; and 12 had cognitive impairment caused by other conditions. The researchers found that levels of a specific form — MTBR tau 243 — in the cerebrospinal fluid were elevated in the people with Alzheimer’s, and that levels rose as a person’s cognitive impairment and dementia became more advanced.
The researchers verified their results by following 28 members of the original group over two to nine years. Half of the participants had some degree of Alzheimer’s at the start of the study. Over time, levels of MTBR tau 243 significantly increased in the Alzheimer’s disease group, in step with a worsening of scores on tests of cognitive function.

The gold standard for measuring tau in the living brain is a tau-PET brain scan. The amount of tau visible in a brain scan correlates with cognitive impairment. To see how their technique matched up to the gold standard, the researchers compared the amount of tau visible in brain scans of 35 people — 20 with Alzheimer’s and 15 without — with levels of MTBR tau 243 in the cerebrospinal fluid. MTBR tau 243 levels were highly correlated with the amount of tau identified in the brain scan, suggesting that their technique accurately measured how much tau — and therefore damage — had accumulated in the brain.

“Right now there is no biomarker that directly reflects brain tau pathology in cerebrospinal fluid or the blood,” Horie said. “What we’ve found here is that a novel form of tau, MTBR tau 243, increases continuously as tau pathology progresses. This could be a way for us to not only diagnose Alzheimer’s disease but tell where people are in the disease. We also found some specific MTBR tau species in the space between neurons in the brain, which suggests that they may be involved in spreading tau tangles from one neuron to another. That finding opens up new windows for novel therapeutics for Alzheimer’s disease based on targeting MTBR tau to stop the spread of tangles.”

Photo Credit: Tammie Benzinger/Knight ADRC. A “heat map” of the brain of a person with mild Alzheimer’s dementia shows where tau protein has accumulated, with areas of higher density in red and orange, and lower density in green and blue. Researchers at Washington University School of Medicine in St. Louis have found a form of tau in spinal fluid that tracks with tau tangles in the brain and indicates what stage of the disease a person is in.
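The validation against tau-PET described above boils down to a correlation between paired measurements. A minimal sketch with made-up numbers (the study's actual values are in the Brain paper):

```python
from math import sqrt

# Illustrative check of how a fluid biomarker can be validated against
# tau-PET: Pearson correlation between paired measurements per person.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

csf_mtbr = [1.0, 1.4, 2.1, 2.9, 3.8]   # hypothetical CSF MTBR tau 243 levels
pet_tau  = [1.1, 1.3, 2.0, 3.1, 3.7]   # hypothetical tau-PET signal
print(round(pearson(csf_mtbr, pet_tau), 3))
```

A coefficient near 1 would indicate, as the researchers found, that the fluid measurement tracks the amount of tau visible in the scan.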
Newswise — BINGHAMTON, NY -- Bacterial infections have become one of the biggest health problems worldwide, and a recent study shows that COVID-19 patients have a much greater chance of acquiring secondary bacterial infections, which significantly increases the mortality rate. Combatting the infections is no easy task, though. When antibiotics are carelessly and excessively prescribed, that leads to the rapid emergence and spread of antibiotic-resistant genes in bacteria — creating an even larger problem. According to the Centers for Disease Control and Prevention, 2.8 million antibiotic-resistant infections happen in the U.S. each year, and more than 35,000 people die from them.

One factor slowing down the fight against antibiotic-resistant bacteria is the amount of time needed to test for resistance. The conventional method uses bacteria extracted from a patient and compares lab cultures grown with and without antibiotics, but results can take one to two days, increasing the mortality rate, the length of hospital stay and the overall cost of care.

Associate Professor Seokheun “Sean” Choi — a faculty member in the Department of Electrical and Computer Engineering at Binghamton University’s Thomas J. Watson College of Engineering and Applied Science — is researching a faster way to test bacteria for antibiotic resistance. “To effectively treat the infections, we need to select the right antibiotics with the exact dose for the appropriate duration,” he said. “There’s a need to develop an antibiotic-susceptibility testing method and offer effective guidelines to treat these infections.”

In the past few years, Choi has developed several projects that cross “papertronics” with biology, such as one that developed biobatteries using human sweat.
This new research — titled “A simple, inexpensive, and rapid method to assess antibiotic effectiveness against exoelectrogenic bacteria” and published in November’s issue of the journal Biosensors and Bioelectronics — relies on the same principles as the batteries: bacterial electron transfer, a chemical process that certain microorganisms use for growth, overall cell maintenance and information exchange with surrounding microorganisms. “We leverage this biochemical event for a new technique to assess the antibiotic effectiveness against bacteria without monitoring the whole bacterial growth,” Choi said. “As far as I know, we are the first to demonstrate this technique in a rapid and high-throughput manner by using paper as a substrate.” Working with PhD students Yang Gao (who earned his degree in May and is now working as a postdoctoral researcher at the University of Texas at Austin), Jihyun Ryu and Lin Liu, Choi developed a testing device that continuously monitors bacteria’s extracellular electron transfer. A medical team would extract a sample from a patient, inoculate the bacteria with various antibiotics over a few hours and then measure the electron transfer rate. A lower rate would mean that the antibiotics are working. “The hypothesis is that the antibiotic exposure could cause sufficient inhibition to the bacterial electron transfer, so the readout by the device would be sensitive enough to show small variations in the electrical output caused by changes in antibiotic effectiveness,” Choi said. The device could provide results about antibiotic resistance in just five hours, which would serve as an important point-of-care diagnostic tool, especially in areas with limited resources. The prototype — built in part with funding from the National Science Foundation and the U.S.
Office of Naval Research — has eight sensors printed on its paper surface, but that could be extended to 64 or 96 sensors if medical professionals wanted to build other tests into the device. Building on this research, Choi already knows where he and his students would like to go next: “Although many bacteria are energy-producing, some pathogens do not perform extracellular electron transfer and may not be used directly in our platform. However, various chemical compounds can assist the electron transfer from non-electricity-producing bacteria. “For instance, E. coli cannot transfer electrons from the inside of the cell to the outside, but with the addition of some chemical compounds, they can generate electricity. Now we are working on how to make this technique general to all bacterial cells.”
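The readout logic Choi describes (a strongly reduced electron-transfer current under antibiotic exposure signals that the drug is working) can be sketched in a few lines. Note that the function name, the current values, and the 50% inhibition threshold below are all hypothetical illustrations, not the device's actual calibration:

```python
def susceptibility(control_current, treated_current, inhibition_threshold=0.5):
    """Classify antibiotic effectiveness from extracellular electron-transfer
    readouts: a large drop in current under treatment suggests the antibiotic
    is inhibiting the bacteria's metabolism."""
    inhibition = 1.0 - treated_current / control_current
    return "susceptible" if inhibition >= inhibition_threshold else "resistant"

# Hypothetical sensor currents (microamps) after a few hours of incubation:
print(susceptibility(control_current=8.0, treated_current=1.6))  # susceptible
print(susceptibility(control_current=8.0, treated_current=7.2))  # resistant
```

With eight (or 64, or 96) sensors on one paper substrate, the same comparison could run in parallel for several antibiotics and doses at once.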
When the protein TRPC1 is exposed to weak magnetic fields, it stimulates muscle cells to respond as if the body has exercised Newswise — As people age, they progressively lose muscle mass and strength, and this can lead to frailty and other age-related diseases. As the causes for the decline remain largely unknown, promoting muscle health is an area of great research interest. A recent study led by researchers from the National University of Singapore (NUS) has shown how a molecule found in muscles responds to weak magnetic fields to promote muscle health. Led by Associate Professor Alfredo Franco-Obregón from the NUS Institute for Health Innovation and Technology (iHealthtech), the team found that a protein known as TRPC1 responds to weak oscillating magnetic fields. Such a response is normally activated when the body exercises. This responsiveness to magnets could be used to stimulate muscle recovery, which could improve the quality of life for patients with impaired mobility in an increasingly ageing society. “The use of pulsed magnetic fields to simulate some of the effects of exercise will greatly benefit patients with muscle injury, stroke, and frailty as a result of advanced age,” said lead researcher Assoc Prof Franco-Obregón, who is also from the NUS Department of Surgery. The NUS research team collaborated with the Swiss Federal Institute of Technology (ETH) on this study, and their results were first published online in Advanced Biosystems on 2 September 2020. The work was also featured on the cover of the journal’s print edition on 27 November 2020. Magnets and muscle health The magnetic fields that the research team used to stimulate muscle health were only 10 to 15 times stronger than the Earth’s magnetic field, yet still much weaker than a common bar magnet, raising the intriguing possibility that weak magnetism is a stimulus that muscles naturally interact with.
To test this theory, the research team first used a special experimental setup to cancel the effect of all surrounding magnetic fields. The researchers found that the muscle cells indeed grew more slowly when shielded from all environmental magnetic fields. These observations strongly supported the notion that the Earth’s magnetic field naturally interacts with muscles to elicit biological responses. To show that TRPC1 acts as an antenna for natural magnetism to promote muscle health, the researchers genetically engineered mutant muscle cells that were unresponsive to any magnetic field by deleting TRPC1 from their genomes. The researchers were then able to reinstate magnetic sensitivity by selectively delivering TRPC1 to these mutant muscle cells in small vesicles that fused with the mutant cells. In their previous studies, the researchers had shown that responses to such magnetic fields were strongly correlated with the presence of TRPC1, and included the rejuvenation of cartilage, fat burning and insulin sensitivity via positive actions on muscle, as well as indirect regulation of the gut microbiome. The present study provided conclusive evidence that TRPC1 serves as a ubiquitous biological antenna to surrounding magnetic fields to modulate human physiology, particularly when targeted for muscle health. Metabolic changes similar to those achieved with exercise have been observed in previous clinical trials and studies led by Assoc Prof Franco-Obregón. Encouraging benefits of using the magnetic fields to stimulate muscle cells have been found, with as little as 10 minutes of exposure per week. This tantalising possibility, to improve muscle health without exercising, could facilitate the recovery and rehabilitation of patients with muscle dysfunction. Assoc Prof Franco-Obregón shared, “About 40 per cent of an average person’s body is muscle.
Our results demonstrate a metabolic interaction between muscle and magnetism which hopefully can be exploited to improve human health and longevity.” Next steps This study represents a milestone in the understanding of how a key protein may react to magnetic fields. Metabolic health indicators such as weight, blood sugar, insulin and cholesterol levels are strongly influenced by muscle health. As exercise is a strong modulator of metabolic diseases through the working of the muscles, and magnetic fields exert benefits similar to those of exercise, such magnetism may help patients who are unable to exercise because of injury, disease, or frailty. As such, the NUS iHealthtech research team is now working to extend their study to reduce drug dependence for the treatment of diseases such as diabetes. “We hope that our research can help alleviate side effects by reducing the use of drugs for disease treatment, and to improve the quality of life of the patients,” said Assoc Prof Franco-Obregón. This project recently won the Catalyst Award in the inaugural Healthy Longevity Catalyst Awards conferred by the US National Academy of Medicine. The team was recognised for their breakthrough innovation to extend human health and function later in life. Photo Credit: National University of Singapore Associate Professor Alfredo Franco-Obregón and his team from the NUS Institute for Health Innovation and Technology examined how low-amplitude magnetic fields may be used to enhance muscle metabolism. The images on the screen show the cells of two types of muscle: the blue fibres (left) are rapidly fatiguing muscles, the green fibres (right) are slowly fatiguing muscles, and the red fibres are considered transitional fibres.
Researchers conclude that regularly speaking two languages contributes to cognitive reserve and delays the onset of the symptoms associated with cognitive decline and dementia. Newswise — In addition to enabling us to communicate with others, languages are our instrument for conveying our thoughts, identity, knowledge, and how we see and understand the world. Having a command of more than one language enriches us and offers a doorway to other cultures, as discovered by a team of researchers led by scientists at the Open University of Catalonia (UOC) and Pompeu Fabra University (UPF). Using languages actively provides neurological benefits and protects us against cognitive decline associated with ageing. In a study published in the journal Neuropsychologia, the researchers conclude that regularly speaking two languages -and having done so throughout one's life- contributes to cognitive reserve and delays the onset of the symptoms associated with cognitive decline and dementia. "We have seen that the prevalence of dementia in countries where more than one language is spoken is 50% lower than in regions where the population uses only one language to communicate", asserts researcher Marco Calabria, a member of the Speech Production and Bilingualism research group at UPF and of the Cognitive NeuroLab at the UOC, and professor of Health Sciences Studies, also at the UOC. Previous work had already found that the use of two or more languages throughout life could be a key factor in increasing cognitive reserve and delaying the onset of dementia; also, that it entailed advantages in memory and executive functions. "We wanted to find out about the mechanism whereby bilingualism contributes to cognitive reserve with regard to mild cognitive impairment and Alzheimer's, and if there were differences regarding the benefit it confers between the varying degrees of bilingualism, not only between monolingual and bilingual speakers", points out Calabria, who led the study.
Thus, and unlike other studies, the researchers defined a scale of bilingualism: from people who speak one language but are exposed, passively, to another, to individuals who have an excellent command of both and use them interchangeably in their daily lives. To construct this scale, they took several variables into account, such as the age of acquisition of the second language, the use made of each language, and whether the two were alternated in the same context, among others. The researchers focused on the population of Barcelona, where there is strong variability in the use of Catalan and Spanish, with some districts that are predominantly Catalan-speaking and others where Spanish is mainly spoken. "We wanted to make use of this variability and, instead of comparing monolingual and bilingual speakers, we looked at whether within Barcelona, where everyone is bilingual to varying degrees, there was a degree of bilingualism that presented neuroprotective benefits", Calabria explains. Bilingualism and Alzheimer's At four hospitals in the Barcelona metropolitan area, they recruited 63 healthy individuals, 135 patients with mild cognitive impairment, such as memory loss, and 68 people with Alzheimer's, the most prevalent form of dementia. They recorded their proficiency in Catalan and Spanish using a questionnaire and established the degree of bilingualism of each subject. They then correlated this degree with the age at which each subject's neurological diagnosis was made and the onset of symptoms. To better understand the origin of the cognitive advantage, they asked the participants to perform various cognitive tasks, focusing primarily on the executive control system, since previous studies had suggested that this was the source of the advantage. In all, participants performed five tasks over two sessions, including memory and cognitive control tests.
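A degree-of-bilingualism scale combining the variables the researchers mention could be sketched as a toy composite score. The weighting, normalization, and cutoffs below are invented for illustration and are not the study's actual scale:

```python
def bilingualism_score(age_of_acquisition, use_of_l2, switches_in_same_context):
    """Toy composite degree-of-bilingualism score in [0, 1].

    age_of_acquisition       -- age (years) at which the second language was learned
    use_of_l2                -- fraction of daily language use in the L2 (0-1)
    switches_in_same_context -- fraction of conversations mixing both languages (0-1)
    """
    # Earlier acquisition counts for more; the credit tapers to zero by age 20.
    early = max(0.0, 1.0 - min(age_of_acquisition, 20) / 20)
    return (early + use_of_l2 + switches_in_same_context) / 3

# A passively exposed speaker vs. a lifelong active bilingual:
passive = bilingualism_score(age_of_acquisition=18, use_of_l2=0.05, switches_in_same_context=0.0)
active  = bilingualism_score(age_of_acquisition=4,  use_of_l2=0.5,  switches_in_same_context=0.6)
```

Any such score places participants on a continuum, which is what lets the analysis ask whether *degree* of bilingualism, rather than a bilingual/monolingual dichotomy, predicts the age of diagnosis.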
"We saw that people with a higher degree of bilingualism were given a diagnosis of mild cognitive impairment later than people who were passively bilingual", states Calabria, who suggests that speaking two languages and frequently switching between them amounts to lifelong brain training. According to the researcher, this linguistic gymnastics is related to other cognitive functions, such as executive control, which is triggered when we perform several actions simultaneously, such as when driving, to help filter relevant information. The brain's executive control system is related to the control system for the two languages: it must alternate them, making the brain focus on one and then on the other so as not to let one language intrude on the other when speaking. "This system, in the context of neurodegenerative diseases, might offset the symptoms. So, when something does not work properly as a result of the disease, the brain has efficient alternative systems to solve it thanks to being bilingual", says Calabria, who continues: "we have seen that the more you use two languages and the better language skills you have, the greater the neuroprotective advantage. Active bilingualism is, in fact, an important predictor of the delay in the onset of the symptoms of mild cognitive impairment, a preclinical phase of Alzheimer's disease, because it contributes to cognitive reserve". Now, the researchers wish to verify whether bilingualism is also beneficial for other diseases, such as Parkinson's or Huntington's disease.
Newswise — DALLAS, November 16, 2020 -- Adults with the healthiest sleep patterns had a 42% lower risk of heart failure regardless of other risk factors compared to adults with unhealthy sleep patterns, according to new research published today in the American Heart Association's flagship journal Circulation. Healthy sleep patterns include rising in the morning, sleeping 7-8 hours a day, and having no frequent insomnia, snoring or excessive daytime sleepiness. Heart failure affects more than 26 million people, and emerging evidence indicates sleep problems may play a role in the development of heart failure. This observational study examined the relationship between healthy sleep patterns and heart failure and included data on 408,802 UK Biobank participants, ages 37 to 73 at the time of recruitment (2006-2010). Incidence of heart failure was collected until April 1, 2019. Researchers recorded 5,221 cases of heart failure during a median follow-up of 10 years. Researchers analyzed sleep quality as well as overall sleep patterns. The measures of sleep quality included sleep duration, insomnia, snoring, and other sleep-related features, such as whether the participant was an early bird or night owl and if they had any daytime sleepiness (likely to unintentionally doze off or fall asleep during the daytime). "The healthy sleep score we created was based on the scoring of these five sleep behaviors," said Lu Qi, M.D., Ph.D., corresponding author and professor of epidemiology and director of the Obesity Research Center at Tulane University in New Orleans. "Our findings highlight the importance of improving overall sleep patterns to help prevent heart failure." Sleep behaviors were collected through touchscreen questionnaires. Sleep duration was defined into three groups: short, or less than 7 hours a day; recommended, or 7 to 8 hours a day; and prolonged, or 9 hours or more a day.
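A five-behavior score of the kind Qi describes can be sketched as a simple count of healthy behaviors. This is an illustrative sketch based on the factors named in the article; the study's exact scoring rules may differ:

```python
def healthy_sleep_score(morning_chronotype, hours, frequent_insomnia,
                        snores, daytime_sleepiness):
    """Count of healthy sleep behaviors, 0 (worst) to 5 (healthiest):
    early riser, recommended 7-8 h duration, no frequent insomnia,
    no snoring, no daytime sleepiness."""
    score = 0
    score += morning_chronotype       # early bird rather than night owl
    score += 7 <= hours <= 8          # recommended sleep duration
    score += not frequent_insomnia
    score += not snores
    score += not daytime_sleepiness
    return score

best = healthy_sleep_score(True, 7.5, False, False, False)   # 5: healthiest pattern
worst = healthy_sleep_score(False, 6, True, True, True)      # 0: unhealthy pattern
```

Grouping participants by such a score is what allows the 42% risk comparison between the highest- and lowest-scoring sleep patterns.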
After adjusting for diabetes, hypertension, medication use, genetic variations and other covariates, participants with the healthiest sleep pattern had a 42% reduction in the risk of heart failure compared to people with an unhealthy sleep pattern. They also found the risk of heart failure was independently associated with individual sleep behaviors and was: 8% lower in early risers; 12% lower in those who slept 7 to 8 hours daily; 17% lower in those who did not have frequent insomnia; and 34% lower in those reporting no daytime sleepiness. Participant sleep behaviors were self-reported, and information on changes in sleep behaviors during follow-up was not available. The researchers noted other unmeasured or unknown factors may have also influenced the findings. Qi also noted that the study's strengths include its novelty, prospective study design and large sample size. The first author is Xiang Li, Ph.D.; other co-authors are Qiaochu Xue, M.P.H.; Mengying Wang, M.P.H.; Tao Zhou, Ph.D.; Hao Ma, Ph.D.; and Yoriko Heianza, Ph.D. Author disclosures are detailed in the manuscript.
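The "X% lower risk" figures above are simple transformations of hazard ratios from the adjusted model; for instance, the headline 42% reduction corresponds to a hazard ratio of 0.58. The arithmetic, shown here only to make the phrasing concrete:

```python
def percent_risk_reduction(hazard_ratio):
    """Convert a hazard ratio into the 'X% lower risk' phrasing used in
    articles like this one (valid for ratios below 1)."""
    return round((1.0 - hazard_ratio) * 100)

percent_risk_reduction(0.58)  # 42, the headline figure
percent_risk_reduction(0.66)  # 34, matching "no daytime sleepiness"
```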