News

Newswise — A large worldwide study has found that, contrary to popular thought, low-salt diets may not be beneficial and may actually increase the risk of cardiovascular disease (CVD) and death compared to average salt consumption. In fact, the study suggests that the only people who need to worry about reducing sodium in their diet are those with hypertension (high blood pressure) who also have high salt consumption.

The study, involving more than 130,000 people from 49 countries, was led by investigators of the Population Health Research Institute (PHRI) of McMaster University and Hamilton Health Sciences. They looked specifically at whether the relationship between sodium (salt) intake and death, heart disease and stroke differs in people with high blood pressure compared to those with normal blood pressure. The researchers showed that regardless of whether people have high blood pressure, low-sodium intake is associated with more heart attacks, strokes, and deaths compared to average intake.

“These are extremely important findings for those who are suffering from high blood pressure,” said Andrew Mente, lead author of the study, a principal investigator of PHRI and an associate professor of clinical epidemiology and biostatistics at McMaster’s Michael G. DeGroote School of Medicine. “While our data highlights the importance of reducing high salt intake in people with hypertension, it does not support reducing salt intake to low levels. Our findings are important because they show that lowering sodium is best targeted at those with hypertension who also consume high sodium diets.”

Current intake of sodium in Canada is typically between 3.5 and 4 grams per day, and some guidelines have recommended that the entire population lower sodium intake to below 2.3 grams per day, a level that fewer than five per cent of Canadians and people around the world consume.
Previous studies have shown that low-sodium intake, compared to average sodium intake, is related to increased cardiovascular risk and mortality, even though low sodium intake is associated with lower blood pressure. This new study shows that the risks associated with low-sodium intake – less than three grams per day – are consistent regardless of a patient’s hypertension status. Further, the findings show that while there is a limit below which sodium intake may be unsafe, the harm associated with high sodium consumption appears to be confined to only those with hypertension. Only about 10 per cent of the population in the global study had both hypertension and high sodium consumption (greater than 6 grams per day).

Mente said that this suggests that the majority of individuals in Canada and most countries are consuming the right amount of salt. He added that targeted salt reduction in those who are most susceptible because of hypertension and high salt consumption may be preferable to a population-wide approach to reducing sodium intake in most countries, except those where the average sodium intake is very high, such as parts of central Asia or China. He added that what is now generally recommended as a healthy daily ceiling for sodium consumption appears to be set too low, regardless of a person’s blood pressure level.

“Low sodium intake reduces blood pressure modestly, compared to average intake, but low sodium intake also has other effects, including adverse elevations of certain hormones which may outweigh any benefits. The key question is not whether blood pressure is lower with very low salt intake, instead it is whether it improves health,” Mente said.
Martin O’Donnell, a co-author on the study and an associate clinical professor at McMaster University and National University of Ireland Galway, said: “This study adds to our understanding of the relationship between salt intake and health, and questions the appropriateness of current guidelines that recommend low sodium intake in the entire population. An approach that recommends salt in moderation, particularly focused on those with hypertension, appears more in-line with current evidence.”

The study was funded by more than 50 sources, including the PHRI, the Heart and Stroke Foundation of Canada and the Canadian Institutes of Health Research.
Newswise —  During the 2014-15 flu season, the poor match between the virus used to make the world’s vaccine stocks and the circulating seasonal virus yielded a vaccine that was less than 20 percent effective. While this year’s vaccine is a much better match to the circulating seasonal strains of influenza, the shifty nature of the virus and the need to pick the viruses used to make global vaccine stocks well before the onset of the flu season can make vaccine strain selection a shot in the dark. That process — dependent on the careful selection of circulating virus strains and the identification of mutations in the part of the virus that recognizes host cells — could soon be augmented by a new approach. It would more precisely forecast the naturally occurring mutations that help seasonal flu virus dodge the vaccine. Writing this week (May 23, 2016) in the journal Nature Microbiology, a team of researchers led by University of Wisconsin-Madison School of Veterinary Medicine virologist Yoshihiro Kawaoka describes a novel strategy to predict the antigenic evolution of circulating influenza viruses and give science the ability to more precisely anticipate seasonal flu strains. It would foster a closer match for the so-called “vaccine viruses” used to create the world’s vaccine supply. The approach Kawaoka and his colleagues used involved techniques commonly employed in virology for the past 30 years and enabled his group to assemble the 2014 flu virus before the onset of the epidemic. “This is the first demonstration that one can accurately anticipate in the lab future seasonal influenza strains,” explains Kawaoka, a UW-Madison professor of pathobiological sciences who also holds a faculty appointment at the University of Tokyo. “We can identify the mutations that will occur in nature and make those viruses available at the time of vaccine (virus) candidate selection.” Influenza depends on its ability to co-opt the cells of its host to replicate and spread. 
To gain access to host cells, the virus uses a surface protein known as hemagglutinin which, like a key to a lock, opens the cell to infection. Vaccines prevent infection by priming the immune system to create antibodies that effectively block the lock, prompting the virus to reengineer the hemagglutinin key through chance mutation. “Influenza viruses randomly mutate,” notes Kawaoka. “The only way the virus can continue to circulate in humans is by (accumulating) mutations in the hemagglutinin.” To get ahead of the constant pace of mutations in circulating flu viruses, Kawaoka’s group assembled libraries of human H1N1 and H3N2 viruses from clinical isolates that possessed various natural, random mutations in the hemagglutinin protein. The viruses were then mixed with antibodies to weed out only those that had accumulated enough mutations to evade the antibody. Because the sources of the viruses were known, the patterns of mutation could be mapped using “antigenic cartography.” The mapping, says Kawaoka, identifies clusters of viruses featuring novel mutations which, according to the new study, can effectively predict the molecular characteristics of the next seasonal influenza virus. Such a prediction, says Kawaoka, could then be used to more effectively develop the vaccine virus stockpiles the world needs each flu season. Each year the World Health Organization (WHO), comparing genetic sequence and antigenic data, makes recommendations about which circulating strains of influenza will make the best matching vaccine. The method described by Kawaoka and his colleagues is conceptually different in that it mimics the mutations that occur in nature and accelerates their accumulation in the critical hemagglutinin protein. “Our method may therefore improve the current WHO influenza vaccine selection process,” Kawaoka and his group conclude in the Nature Microbiology report. 
“These in vitro selection studies are highly predictive of the antigenic evolution of H1N1 and H3N2 viruses in human populations.”
Newswise —  Chatting on the phone with a “sleep coach” and keeping a nightly sleep diary significantly improve sleep quality and reduce insomnia in women through all stages of menopause, according to a new study published today in JAMA Internal Medicine. The study also found that such phone-based cognitive behavioral therapy significantly reduced the degree to which hot flashes, or vasomotor symptoms, interfered with daily functioning. This is good news for women who do not want to use sleeping pills or hormonal therapies to treat menopause-related insomnia and hot flashes, according to paper co-author Dr. Katherine Guthrie, a member of the Public Health Sciences and Clinical Research divisions at Fred Hutchinson Cancer Research Center. “Most women experience nighttime hot flashes and problems sleeping at some point during the menopause transition. Poor sleep leads to daytime fatigue, negative mood and reduced daytime productivity. When sleep problems become chronic — as they often do — there are also a host of negative physical consequences, including increased risk for weight gain, diabetes and cardiovascular disease,” Guthrie said. “Many women do not want to use sleeping medications or hormonal therapies to treat their sleep problems because of concerns about side-effect risks. For these reasons, having effective, non-pharmacological options to offer them is important.” The research, believed to be the first and the largest study to show that cognitive behavioral therapy for insomnia helps healthy women with hot flashes to sleep better, was conducted via MsFLASH, a research network funded by the National Institute on Aging that conducts randomized clinical trials focused on relieving the most common, bothersome symptoms of menopause. Guthrie serves as principal investigator of the Fred Hutch-based MsFLASH Data Coordinating Center. 
The clinical trial involved more than 100 Seattle-area women (between 40 and 65 years of age) with moderate insomnia who experienced at least two hot flashes a day. All of the women were asked to keep diaries to document their sleep patterns throughout the study and rated the quantity, frequency and severity of their hot flashes at the beginning of the study, at eight weeks and at 24 weeks. Half of the women were selected at random to take part in a cognitive behavioral therapy intervention that involved talking with a sleep coach for less than 30 minutes six times over eight weeks. Importantly, non-sleep specialists (a social worker and a psychologist) delivered the therapy. Before conducting the phone sessions they underwent a day of training in cognitive behavioral therapy techniques. “Since the intervention was delivered by non-sleep specialists over the phone, it potentially could be widely disseminated through primary and women’s health centers to women who do not have good access to behavioral sleep-medicine specialists or clinics,” said the paper’s first and corresponding author Dr. Susan McCurry, a clinical psychologist and research professor at the University of Washington School of Nursing. “Such an intervention would be much less expensive to deliver than traditional, in-person cognitive behavioral therapy protocols, which are typically six to eight sessions that are one hour each,” said McCurry, principal investigator of the randomized trial. The goal of the therapy was to get women to the point where they consistently estimated that they were asleep at least 85 percent of the time they were in bed. To this end, they were given specific sleep/wake schedules and were taught to limit time spent in bed at night, which ultimately helped them fall asleep more quickly and stay asleep. They also were taught “stimulus-control” rules, which are designed to strengthen the association between bed and sleep. 
“For example, the women were asked to not do anything in bed except sleep and have sex,” McCurry said. “So, no reading, watching television, checking email or paying bills in bed.” Stimulus control also emphasizes the importance of getting up at the same time each day and not napping during the day. The women received an educational booklet about menopause and were given information about how sleep normally changes with age. They learned to create bedtime routines and an environment conducive to sleep, such as turning off electronics at least 30 minutes prior to bed, not drinking caffeine or alcohol after dinner, and keeping their bedroom a slightly cool temperature. They also were taught a technique called “constructive worry” to practice when ruminating thoughts kept them awake at night. The other half of the women were assigned to a menopause education control intervention. These study participants also talked to a sleep coach with the same frequency and duration as the cognitive behavioral therapy group. They received information about women’s health, including diet and exercise, and how they related to hot flashes and sleep quality. The coaches reviewed their weekly sleep diaries with them and provided the same educational booklet about menopause that the other group received. The coaches did not, however, teach cognitive strategies such as constructive worry, and they made no recommendations regarding sleep/wake schedules or restricting time in bed. “This intervention was supportive but very nondirective,” McCurry said. The main outcomes of the study were that women in the cognitive behavioral therapy group experienced statistically significant, clinically meaningful, and long-term, sustained improvements in sleep as compared to the women in the menopause education group. The women who received cognitive behavioral therapy also fared better with regard to hot flashes. 
Although the frequency and severity of their hot flashes did not change, the women reported that the vasomotor symptoms interfered less with their daily functioning than prior to receiving such therapy. The researchers said that delivering this therapy by phone — a dissemination model similar to phone-based smoking-cessation programs that have proven to be effective — potentially allows it to be an efficient, cost-effective way to reach large populations of women seeking treatment for midlife sleep problems. They also said that these results support further research, such as testing the effectiveness of phone-based cognitive behavioral therapy for insomnia versus traditional pharmacological approaches. “This study demonstrates that it is possible to significantly improve the sleep of many women going through the menopausal transition without the use of sleeping medications or hormone therapies, even if hot flashes are waking them up at night. This is good news for millions of women who are suffering from poor sleep at this time of life,” Guthrie said. In addition to Guthrie, the MsFLASH research group is led by co-principal investigators Dr. Andrea LaCroix at the University of California, San Diego and Dr. Susan Reed of the University of Washington, both of whom are also co-authors on the paper. Editor’s note: To obtain a copy of the embargoed JAMA Internal Medicine paper, “Telephone-Based Cognitive Behavioral Therapy for Insomnia in Perimenopausal and Postmenopausal Women with Vasomotor Symptoms,” please contact the journal at mediarelations@jamanetwork.org or 312.464.5262. At Fred Hutchinson Cancer Research Center, home to three Nobel laureates, interdisciplinary teams of world-renowned scientists seek new and innovative ways to prevent, diagnose and treat cancer, HIV/AIDS and other life-threatening diseases. 
Fred Hutch’s pioneering work in bone marrow transplantation led to the development of immunotherapy, which harnesses the power of the immune system to treat cancer with minimal side effects. An independent, nonprofit research institute based in Seattle, Fred Hutch houses the nation’s first and largest cancer prevention research program, as well as the clinical coordinating center of the Women’s Health Initiative and the international headquarters of the HIV Vaccine Trials Network. Private contributions are essential for enabling Fred Hutch scientists to explore novel research opportunities that lead to important medical breakthroughs. For more information visit fredhutch.org or follow Fred Hutch on Facebook, Twitter or YouTube.

The UW School of Nursing is one of the nation’s premier nursing schools dedicated to addressing challenges in health care and improving the health of communities locally and globally. For almost 100 years, the UW School of Nursing has been a leader and innovator in nursing science and education. For more information about the #huskynurse community, visit nursing.uw.edu or follow us on Facebook, Twitter or Instagram.
Newswise — Some adults learn a second language better than others, and their secret may involve the rhythms of activity in their brains. New findings by scientists at the University of Washington demonstrate that a five-minute measurement of resting-state brain activity predicted how quickly adults learned a second language. The study, published in the June-July issue of the journal Brain and Language, is the first to use patterns of resting-state brain rhythms to predict subsequent language learning rate.

"We've found that a characteristic of a person's brain at rest predicted 60 percent of the variability in their ability to learn a second language in adulthood," said lead author Chantel Prat, a faculty researcher at the Institute for Learning & Brain Sciences and a UW associate professor of psychology.

At the beginning of the experiment, volunteers — 19 adults aged 18 to 31 years with no previous experience learning French — sat with their eyes closed for five minutes while wearing a commercially available EEG (electroencephalogram) headset. The headset measured naturally occurring patterns of brain activity. The participants came to the lab twice a week for eight weeks for 30-minute French lessons delivered through an immersive, virtual reality computer program.

The U.S. Office of Naval Research, which funded the current study, also funded the development of the language training program. The program, called Operational Language and Cultural Training System (OLCTS), aims to get military personnel functionally proficient in a foreign language with 20 hours of training. The self-paced program guides users through a series of scenes and stories. A voice-recognition component enables users to check their pronunciation. Watch a video demonstration of the language software: https://www.youtu.be/piA6dMkBroQ

To ensure participants were paying attention, the researchers used periodic quizzes that required a minimum score before proceeding to the next lesson.
The quizzes also served as a measure for how quickly each participant moved through the curriculum. At the end of the eight-week language program, participants completed a proficiency test covering however many lessons they had finished. The fastest person learned twice as quickly but just as well as the slower learners. The recordings from the EEG headsets revealed that patterns of brain activity related to language processes were linked the most strongly to the participants' rate of learning. So, should people who don't have this biological predisposition not even try to learn a new language? Prat says no, for two reasons. "First, our results show that 60 percent of the variability in second language learning was related to this brain pattern — that leaves plenty of opportunity for important variables like motivation to influence learning," Prat said. Second, Prat said it's possible to change resting-state brain activity using neurofeedback training — something that she's studying now in her lab. Neurofeedback is a sort of brain training regimen, through which individuals can strengthen the brain activity patterns linked to better cognitive abilities. "We're looking at properties of brain function that are related to being ready to learn well. Our goal is to use this research in combination with technologies such as neurofeedback training to help everyone perform at their best," she said. Ultimately, neurofeedback training could help people who want to learn a second language but lack the desirable brain patterns. They'd do brain training exercises first, and then do the language program. "By studying individual differences in the brain, we're figuring out key constraints on learning and information processing, in hopes of developing ways to improve language learning, and eventually, learning more generally," Prat said.
Newswise — Studying fruit flies, whose sleep is remarkably similar to that in people, Johns Hopkins researchers say they’ve identified brain cells that are responsible for why delaying bedtime creates chronic sleepiness. In a report on the research published online on May 19 in Cell, the scientists say they found a group of brain cells in charge of so-called sleep drive that becomes more active the longer flies are kept awake. The same mechanism, they say, also plays a role in putting the flies to sleep and keeping them that way. The findings may offer insight into human sleep disorders and open up new strategies to promote long-lasting sleep for those with chronic insomnia who don’t respond to available sleep drugs, they say.

“Although fruit flies look very different from people on the surface, they actually share many of the same genes and even behaviors,” says Mark Wu, M.D., Ph.D., associate professor of neurology at the Johns Hopkins University School of Medicine. “And with what we believe is the first identification of a mechanism behind the adjustable nature of sleep drive, researchers can look for the same processes in mammals, including, one day, in humans.”

In their search for sleep-regulating cells, Wu’s team used genetic engineering to turn on small numbers of neurons in more than 500 fruit fly strains. They then measured how these flies slept when these neurons “fired.” Flies from several strains continued to sleep for several hours even after the researchers turned the neurons off, stopping them from firing. This suggested that the researchers had triggered sleep drive in these flies, leading to the persistent sleepiness.

Using fluorescence microscopy, the scientists then examined the fly brains to specifically pinpoint the identity and location of the sleep drive-inducing cells. The firing neurons were genetically engineered to glow green. They were found in a structure called the ellipsoid body (see photo) and are known as the R2 neurons.
To pin down more of what was going on, the researchers blocked the neurons from firing by genetically engineering the R2 neurons to make tetanus toxin, which silences the cells. The flies with the silenced R2 neurons slept on their normal schedule, but when they were deprived of sleep during the night by mechanically shaking their vials, they got about 66 percent less “rebound sleep” compared to control flies, suggesting that they felt less sleepy after sleep deprivation.

Next, the researchers tested how fly R2 neurons behaved on their own in awake, sleeping or sleep-deprived fruit flies. They used tiny electrodes to measure the firing of the R2 neurons in well-rested, awake fruit flies; in fruit flies that were an hour into their sleep cycle; and in fruit flies after 12 hours of sleep deprivation. In the well-rested fruit flies, the neurons fired only about once per second and were the least active. In the sleeping fruit flies, the neurons fired almost four times a second. In the sleep-deprived fruit flies, the neurons were the most active, firing about seven times per second.

“These R2 neurons have higher firing rates the more sleep-deprived the fruit flies are, and firing of these neurons puts flies to sleep, suggesting that we’ve identified the key cells responsible for sleep drive,” says Wu.

Wu says it’s long been thought that getting to sleep requires an increase in sleep-promoting chemicals in specific parts of the brain as night and bedtime approach in the normal 24-hour sleep-wake cycle. However, he says, these chemicals last for only a few minutes at a time, so it has been puzzling how they can account for sleep drive that lasts hours. As an answer to this question, Wu and colleagues used a genetic technique to light up the places on the surface of the R2 neurons where they actively release small chemical neurotransmitters, sending information to neighboring cells.
Compared to well-rested flies, sleep-deprived flies had an increase in the number and size of the places releasing the neurotransmitter, and they appeared much brighter. Wu says these changes in number of neurotransmitter release sites account for how the neurons are able to adjust over time using a system for sleep drive that works over a period of hours, rather than minutes, like the known sleep-promoting chemicals. This flexible system can adjust to times when the flies are sleep-deprived or when they are just nearing their normal bedtime. He adds that the sleep drive process in the R2 neurons works similarly to how memories are encoded in other types of neurons, where changes in the neuron’s information-sending and receiving parts adjust over time. “Figuring out how sleep drive works should help us one day figure out how to treat people who have an overactive sleep drive that causes them to be sleepy all the time and resistant to current therapies,” Wu says. Sha Liu, Qili Liu and Masashi Tabuchi of Johns Hopkins Medicine also contributed to this study. The research was funded by grants from the National Institute of Neurological Disorders and Stroke (R01 NS079584 and R21 NS088521) and a Burroughs-Wellcome Fund Career Award for Medical Scientists.
Newswise — An experimental model uses genetics-guided biomechanics and patient-derived stem cells to predict what type of inherited heart defect a child will develop, according to authors of a new study in the journal Cell. A multi-institutional team developing the technology – led by the Cincinnati Children’s Heart Institute – reports May 19 that it would let doctors intervene earlier to help patients manage their conditions and help inform future pharmacologic treatment options. In laboratory tests, the model accurately predicts whether mouse models and stem cell-derived heart cells from human patients will develop a hypertrophic or dilated cardiomyopathy.

“This technology would make it possible to predict the eventual cardiac phenotype in pediatric patients and help guide their treatment and future monitoring,” said Jeffery Molkentin, PhD, lead author and a researcher in the Division of Molecular Cardiovascular Biology at Cincinnati Children’s and the Howard Hughes Medical Institute. “It could help when counseling patients about athletic endeavors, in which sudden death can occur with hypertrophic cardiomyopathy. Or it could help decide whether certain patients should consider an implantable cardioverter defibrillator to prevent sudden death as they grow into young adulthood.”

Inherited cardiomyopathy is a genetically diverse group of heart muscle diseases affecting about one of every 500 people. There are two primary clinical manifestations: hypertrophic cardiomyopathy (HCM) and dilated cardiomyopathy (DCM). The diseases involve nearly 1,500 different gene mutations in sarcomeres, the part of the heart muscle that generates tension and contraction. In HCM, the heart’s chambers and valves grow asymmetrically.
The dimension of the ventricular chamber is reduced, the interventricular septum thickens, and patients suffer from diastolic dysfunction (in which heart muscle doesn’t relax normally), causing an increased risk of sudden death from arrhythmia. With DCM, people have an enlarged left ventricular chamber accompanied by a lengthening of heart cells (myocytes) that results in reduced systolic function and eventually heart failure. Effective drug regimens to manage the conditions do not exist, although there is research looking for new drugs. The only effective treatment at present is a heart transplant. This leaves an urgent need to develop new technologies to manage, treat, cure or prevent the diseases, according to researchers. In developing the technology, scientists analyzed how sarcomeres generate tension coupled with alterations in calcium cycling, which is critical to heart function. The coupling of tension generation and calcium cycling is altered in patients with sarcomeric gene mutations. The alteration can be measured and then used to predict how the heart will change as disease progresses, Molkentin said. To study the influence of gene mutations on this process, researchers tested an array of genetically altered mice. The mouse models were either normal (wild type) mice or those expressing different gene mutations for various cardiomyopathies. This allowed researchers to examine tension generation and associated calcium cycling rates through heart muscle in a highly defined manner. That information was used to create a mathematical model for disease prediction that integrates the total tension generated by isolated cardiomyocytes. The tension-integrated, algorithm-based model was able to predict if hearts in mouse models would undergo hypertrophic or dilated cardiac growth. The scientists next tested the computational model on human cells from cardiomyopathy patients by using induced pluripotent stem cell (iPSC) technology. 
Reprogrammed and derived from actual patient skin fibroblast cells, iPSCs can become virtually any cell type in the human body and then be used for scientific investigation of disease properties. Molkentin and his colleagues generated patient-specific cardiomyocytes, which under a microscope can actually be seen pulsating rhythmically, similar to a beating heart. Patient-derived iPSCs also carry the same genetic makeup (including mutations) as the person donating the original starter cells. In the study, iPSC heart cells developed the same cardiomyopathy tension deficits as the patients’ own hearts.

Researchers then used their heart defect prediction method to see how accurately it determined the heart defect type of specific cardiomyopathy patients. With collaborators at the Stanford University School of Medicine, Molkentin and his colleagues generated four lines of developing, early-stage patient-specific iPSC heart cells (cardiomyocytes). They report that their technology accurately determined the HCM vs. DCM heart defect of the donor patients.

Researchers continue to develop and test the technology by using it to determine the cardiac disease state of patients with specific mutations in a sarcomere-encoding gene. They caution the technology is years away from potential clinical use, pending further testing and refinement.

The research team includes collaborators from Temple University (Philadelphia), the University of Washington (Seattle), Stanford University School of Medicine (Stanford, Calif.), Harvard Medical School-Brigham and Women’s Hospital (Boston) and the University of Minnesota Medical School (Minneapolis). Funding support for the study came in part from the National Institutes of Health (R37HL60562) and the Howard Hughes Medical Institute.

About Cincinnati Children’s
Cincinnati Children’s Hospital Medical Center ranks third in the nation among all Honor Roll hospitals in U.S. News and World Report’s 2015 Best Children’s Hospitals.
It is also ranked in the top 10 for all 10 pediatric specialties, including a #1 ranking in pulmonology and #2 in cancer and in nephrology. Cincinnati Children’s, a non-profit organization, is one of the top three recipients of pediatric research grants from the National Institutes of Health, and a research and teaching affiliate of the University of Cincinnati’s College of Medicine. The medical center is internationally recognized for improving child health and transforming delivery of care through fully integrated, globally recognized research, education and innovation. Additional information can be found at http://www.cincinnatichildrens.org/default/. Connect on the Cincinnati Children’s blog, via Facebook and on Twitter.
Newswise —  Context plays a big role in our memories, both good and bad. Bruce Springsteen's "Born to Run" on the car radio, for example, may remind you of your first love -- or your first speeding ticket. But a Dartmouth- and Princeton-led brain scanning study shows that people can intentionally forget past experiences by changing how they think about the context of those memories. The findings have a range of potential applications centered on enhancing desired memories, such as developing new educational tools, or diminishing harmful memories, including treatments for post-traumatic stress disorder. The study appears in the journal Psychonomic Bulletin and Review. A PDF is available on request. Since Ancient Greece, memory theorists have known that we use context -- or the situation we're in, including sights, sounds, smells, where we are, who we are with -- to organize and retrieve our memories. But the Dartmouth- and Princeton-led team wanted to know whether and how people can intentionally forget past experiences. They designed a functional magnetic resonance imaging (fMRI) experiment to specifically track thoughts related to memories' contexts, and put a new twist on a centuries-old psychological research technique of having subjects memorize and recall a list of unrelated words. In the new study, researchers showed participants images of outdoor scenes, such as forests, mountains and beaches, as they studied two lists of random words, manipulating whether they were told to forget or remember the first list prior to studying the second list. "Our hope was the scene images would bias the background, or contextual, thoughts that people had as they studied the words to include scene-related thoughts," says lead author Jeremy Manning, an assistant professor of psychological and brain sciences at Dartmouth. "We used fMRI to track how much people were thinking of scene-related things at each moment during our experiment. 
That allowed us to track, on a moment-by-moment basis, how those scene or context representations faded in and out of people's thoughts over time." The study's participants were told to either forget or remember the random words presented to them interspersed between scene images. Right after they were told to forget, the fMRI showed that they "flushed out" the scene-related activity from their brains. "It's like intentionally pushing thoughts of your grandmother's cooking out of your mind if you don't want to think about your grandmother at that moment," Manning says. "We were able to physically measure and quantify that process using brain data." But when the researchers told participants to remember the studied list rather than forget it, this flushing out of scene-related thoughts didn't occur. Further, the amount that people flushed out scene-related thoughts predicted how many of the studied words they would later remember, which shows the process is effective at facilitating forgetting. The study has two important implications. "First, memory studies are often concerned with how we remember rather than how we forget, and forgetting is typically viewed as a 'failure' in some sense, but sometimes forgetting can be beneficial, too," Manning says. "For example, we might want to forget a traumatic event, such as soldiers with PTSD. Or we might want to get old information 'out of our head,' so we can focus on learning new material. Our study identified one mechanism that supports these processes." The second implication is more subtle but also important. "It's very difficult to specifically identify the neural representations of contextual information," Manning says. "If you consider the context you experience something in, we're really referring to the enormously complex, seemingly random thoughts you had during that experience. Those thoughts are presumably idiosyncratic to you as an individual, and they're also potentially unique to that specific moment. 
So, tracking the neural representations of these things is extremely challenging because we only ever have one measurement of a particular context. Therefore, you can't directly train a computer to recognize what context 'looks like' in the brain because context is a continually moving and evolving target. In our study, we sidestepped this issue using a novel experimental manipulation -- we biased people to incorporate those scene images into the thoughts they had when they studied new words. Since those scenes were common across people and over time, we were able to use fMRI to track the associated mental representations from moment to moment." ### Dartmouth Assistant Professor Jeremy Manning is available to comment at Jeremy.R.Manning@dartmouth.edu. The study, which included scientists at Bard College and the University of Illinois at Urbana-Champaign, was supported by the John Templeton Foundation, the National Institutes of Health and the National Science Foundation.
Newswise — "Video game addiction is more prevalent among younger men, and among those not in a current relationship, than others," says Cecilie Schou Andreassen, doctor of psychology and specialist in clinical psychology at the Department of Psychosocial Science, University of Bergen (UiB). Schou Andreassen has carried out a study with more than 20,000 participants who answered questions related to video game addiction. The study is published in the journal Psychology of Addictive Behaviors, of the American Psychological Association.
Escape from psychiatric disorders
The study showed that video game addiction appears to be associated with attention deficit/hyperactivity disorder, obsessive-compulsive disorder, and depression. "Excessively engaging in gaming may function as an escape mechanism for, or a way of coping with, underlying psychiatric disorders in an attempt to alleviate unpleasant feelings and to calm restless bodies," Doctor Andreassen says. According to Doctor Andreassen, the large study shows some clear tendencies as to which people develop addictive use of video games. "The study implies that younger people with some of these characteristics could be targeted in efforts to prevent the development of an unhealthy gaming pattern."
Sex difference in addiction
The study also showed that addiction related to video games and computer activities shows sex differences. 
"Men seem generally more likely to become addicted to online gaming, gambling, and cyber-pornography, while women to social media, texting, and online shopping," Schou Andreassen says.
Seven Warning Signs
The study uses seven criteria to identify video game addiction (developed by Lemmens et al., 2009), where gaming experiences over the last six months are scored on a scale from "Never" to "Very often":
• You think about playing a game all day long
• You spend increasing amounts of time on games
• You play games to forget about real life
• Others have unsuccessfully tried to reduce your game use
• You feel bad when you are unable to play
• You have fights with others (e.g., family, friends) over your time spent on games
• You neglect other important activities (e.g., school, work, sports) to play games
Scoring high on at least four of the seven items may suggest video game addiction associated with impaired health, work, school and/or social relations. "However, most people have a relaxed relationship to video games and fairly good control," Doctor Cecilie Schou Andreassen highlights. ###
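The threshold rule described above (high scores on at least four of the seven criteria) can be sketched in code. This is an illustrative sketch only, not the instrument's published scoring: the 1-to-5 rating scale, the cutoff of 4 for a "high" rating, and the item labels are assumptions made for demonstration.

```python
# Hypothetical sketch of the seven-item screening rule (Lemmens et al., 2009
# criteria as summarized above). ASSUMPTIONS: each item is rated 1 ("never")
# to 5 ("very often"), and a rating of 4 or above counts as "high".

GAME_ADDICTION_ITEMS = [
    "thinks about playing a game all day long",
    "spends increasing amounts of time on games",
    "plays games to forget about real life",
    "others have unsuccessfully tried to reduce game use",
    "feels bad when unable to play",
    "has fights with others over time spent on games",
    "neglects other important activities to play games",
]

HIGH_RATING = 4  # assumed cutoff: 4 ("often") or 5 ("very often") is "high"

def meets_screening_threshold(ratings, min_high_items=4):
    """Return True if at least `min_high_items` of the seven items are rated high."""
    if len(ratings) != len(GAME_ADDICTION_ITEMS):
        raise ValueError("expected one rating per item")
    high = sum(1 for r in ratings if r >= HIGH_RATING)
    return high >= min_high_items

# Example: high ratings on four items meets the threshold
print(meets_screening_threshold([5, 4, 2, 1, 4, 5, 2]))  # True
```

Note that this is a screening heuristic, not a diagnosis; as the researchers stress, most people game without problems.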
HOW TO REMOVE A SPLINTER Newswise — Everyone has been there. No sooner do you or your child touch that old wooden bench than a small sliver of wood slides into the skin, causing a surprising amount of pain. Fortunately, say dermatologists, splinters are easy to remove with the proper tools and technique. "Splinters come in all shapes and sizes, and they can really hurt," said board-certified dermatologist Robert Sidbury, MD, MPH, FAAD, associate professor, department of pediatrics and division chief of dermatology, University of Washington School of Medicine. "To reduce pain and the possibility of an infection, splinters should be removed as quickly as possible." To remove a splinter, Dr. Sidbury recommends the following tips:
1. Wash and dry the area: To prevent infection, wash your hands and the affected area with soap and water and gently pat your skin dry.
2. Inspect the splinter: If the splinter is very small, use a magnifying glass to see how big it is and which direction it entered the skin.
3. Use tweezers to remove the splinter: If part of the splinter is sticking out, you can use tweezers to gently pull out the splinter. First, sterilize the tip of the tweezers using rubbing alcohol. Then, pull out the splinter in the same direction that it entered the skin. Never squeeze out a splinter, as this may cause it to break into smaller pieces that are harder to remove.
4. Use a small needle to remove the splinter: If the entire splinter is embedded under the skin, you can use a small needle to remove it. First, sterilize the needle and a pair of tweezers using rubbing alcohol. Afterwards, look through a magnifying glass and use the needle to gently pierce the surface of the skin at one end of the splinter. This may require help from a friend or family member. Continue to use the needle to carefully push out part of the splinter. Once one end of the splinter is sticking out, use the tweezers to gently pull it out.
5. Clean and apply petroleum jelly: After the splinter has been removed, clean the area with soap and water and apply petroleum jelly. Keep the area covered with a bandage until it heals.
"Most splinters can be safely removed at home, but some may require medical assistance," said Dr. Sidbury. "See your doctor or a board-certified dermatologist if your splinter is very large, deep, located in or near your eye, or if the area becomes infected." These tips are demonstrated in "How to Remove a Splinter," a video posted to the AAD website and YouTube channel. This video is part of the AAD's "Video of the Month" series, which offers tips people can use to properly care for their skin, hair and nails. A new video in the series posts to the AAD website and YouTube channel each month. Headquartered in Schaumburg, Ill., the American Academy of Dermatology, founded in 1938, is the largest, most influential, and most representative of all dermatologic associations. With a membership of more than 18,000 physicians worldwide, the AAD is committed to: advancing the diagnosis and medical, surgical and cosmetic treatment of the skin, hair and nails; advocating high standards in clinical practice, education, and research in dermatology; and supporting and enhancing patient care for a lifetime of healthier skin, hair and nails. For more information, contact the AAD at 1-888-462-DERM (3376) or www.aad.org. Follow the AAD on Facebook (American Academy of Dermatology), Twitter (@AADskin), or YouTube (AcademyofDermatology). 
Newswise — Antibiotic-resistant bacteria most often are associated with hospitals and other health-care settings, but a new study indicates that chicken coops and sewage treatment plants also are hot spots of antibiotic resistance. The research, led by a team at Washington University School of Medicine in St. Louis, is published May 12 in Nature. The scientists surveyed bacteria and their capacity to resist antibiotics in a rural village in El Salvador and a densely populated slum on the outskirts of Lima, Peru. In both communities, the researchers identified areas ripe for bacteria to shuffle and share their resistance genes. These hot spots of potential resistance transmission included chicken coops in the rural village and a modern wastewater treatment plant outside Lima. “Bacteria can do this weird thing that we can’t — exchange DNA directly between unrelated organisms,” said senior author Gautam Dantas, PhD, an associate professor of pathology and immunology. “That means it’s relatively easy for disease-causing bacteria that are treatable with antibiotics to become resistant to those antibiotics quickly. If these bacteria happen to come into contact with other microbes that carry resistance genes, those genes can pop over in one step. We estimate that such gene-transfer events are generally rare, but they are more likely to occur in these hot spots we identified.” While the study was done in developing parts of the world, Dantas suggested ways the data could be relevant for the U.S. and other industrialized countries. If the chicken coops of subsistence farmers are hot spots of resistance gene transfer, he speculated that bacteria present in industrial farming operations — where chickens regularly receive antibiotics — would see even more pressure to share resistance genes. Dantas expressed concern about such bacteria getting into the food system. 
Further, the wastewater treatment facility the investigators studied in Lima is a modern design that uses technologies typical of such facilities around the world, including those in the U.S., suggesting these plants may be hot spots of antibiotic resistance transmission regardless of their locations. The study is the first to survey the landscape of bacteria and the genetics of their resistance across multiple aspects of an environment, including the people, their animals, the water supply, the surrounding soil, and samples from the sanitation facilities. While the densely populated slum surrounding Lima has a districtwide sewage system and modern wastewater treatment plant, the village in El Salvador has composting latrines. Rural villagers who rely on subsistence farming, and residents of densely populated, low-income communities surrounding cities make up a majority of the global population; yet their microbiomes are largely unstudied. Most similar studies to date have focused on heavily industrialized populations in the United States and Europe and on rare and so-called pristine communities of people living a traditional hunter-gatherer lifestyle. “Not only do the communities in our study serve as models for how most people live, they also represent areas of highest antibiotic use,” Dantas said. “Access to these drugs is over-the-counter in many low-income countries. Since no prescription is required, we expect antibiotic use in these areas to be high, putting similarly high pressure on bacteria to develop resistance to these drugs.” In general, Dantas and his colleagues found that resistance genes are similar among bacteria living in similar environments, with more genetic similarity seen between bacteria in the human gut and animal guts than between the human gut and the soil, for example. 
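The similarity pattern described above, with resistance-gene content more alike between human and animal guts than between the human gut and soil, can be illustrated with a toy computation. This is a hypothetical sketch, not the study's actual analysis pipeline: it compares sets of detected resistance-gene identifiers using Jaccard similarity, and the per-environment gene sets are made up for demonstration.

```python
# Illustrative sketch (NOT the study's method): quantify how similar two
# environments' resistance-gene repertoires are via Jaccard similarity,
# i.e. |A intersect B| / |A union B| over sets of detected gene IDs.

def jaccard(a, b):
    """Jaccard similarity of two sets; 0.0 for two empty sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical sets of resistance genes "detected" in each environment.
resistomes = {
    "human_gut":  {"tetQ", "ermB", "blaTEM", "sul2"},
    "animal_gut": {"tetQ", "ermB", "blaTEM", "aph3"},
    "soil":       {"blaOXA", "vanA", "aph3", "sul2"},
}

# Gut-to-gut overlap is high; gut-to-soil overlap is low, matching the
# qualitative trend reported in the study.
print(jaccard(resistomes["human_gut"], resistomes["animal_gut"]))  # 0.6
print(jaccard(resistomes["human_gut"], resistomes["soil"]))        # about 0.14
```

Set-based overlap is a deliberately simple stand-in here; real resistome comparisons would also weigh gene abundance and sequence identity.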
The researchers also found that bacteria closely related to one another have similar resistance genes, which might be expected as bacteria pass their genes from one generation to the next. "The general trends we found are consistent with our previous work," Dantas said. "We were not terribly surprised by the resistance genes that track with bacterial family trees. On the other hand, the genes we found that break the hereditary trend are quite worrisome. Genes that are the exceptions to the rule — that are not similar to the surrounding DNA — are the ones that are most likely to have undergone a gene-transfer event. And they are the resistance genes at highest risk of future transmission into unrelated bacteria." Of the locations sampled in the study, resistance genes that are most likely to be mobile and able to jump from one bacterial strain to another were found in the highest numbers in the chicken coops of villagers in El Salvador and in the outgoing "gray" water from the sewage treatment plant outside Lima. Not suitable for drinking, most of this water is released into the Pacific Ocean, and some is used to irrigate city parks, the researchers said. "Soils in the chicken coops we studied appear to be hot spots for the exchange of resistance genes," Dantas said. "This means disease-causing bacteria in chickens are at risk of sickening humans and transferring their resistance genes in the process. Our study demonstrates the importance of public health guidelines that advise keeping animals out of cooking spaces." As for the wastewater treatment plant, Dantas called it the perfect storm for transmitting antibiotic resistance genes. Such facilities are excellent at removing bacteria that are well-known for causing disease and can be grown in a petri dish, such as E. coli. But that leaves room for other types of bacteria to grow and flourish. 
“The system is not designed to do anything about environmental microbes that don’t make people sick,” Dantas said. “But some of these bacteria carry resistance genes that are known to cause problems in the clinic. We are inadvertently enriching this water with bacteria that carry resistance genes and then exposing people to these bacteria because the water is used to irrigate urban parks.” Dantas and his colleagues suspect that the antibiotic resistance they measured in microbes that survive the plant’s treatment process is driven by the presence of over-the-counter antibiotics in the sewage being treated. The researchers measured antibiotic levels before and after treatment, and while most of these drug residues are removed during the process, the fact that they’re present at the beginning favors the survival of bacteria that are resistant to them. “All the antibiotics we detected in the pre-treated water were among the top 20 sold in Peru,” Dantas said. “These findings have implications for public health, perhaps in designing future wastewater treatment plants and in making policy decisions about whether antibiotics should be available without a prescription.” ### This work was supported in part by the Edward Mallinckrodt Jr. Foundation; the Children’s Discovery Institute, grant number MD-II-2011-117; the National Institute of General Medical Sciences of the National Institutes of Health (NIH), grant number R01-GM099538; the National Science Foundation, grant number DBI-0521250; and the Department of Defense (DoD) through the National Defense Science and Engineering Graduate Fellowship. Pehrsson EC, Tsukayama P, Patel S, Mejia-Bautista M, Sosa-Soto G, Navarrete KM, Calderon M, Cabrera L, Hoyos-Arango W, Bertoli MT, Berg DE, Gilman RH, Dantas G. Interconnected microbiomes and resistomes in low-income human habitats. Nature. May 12, 2016. 
Washington University School of Medicine's 2,100 employed and volunteer faculty physicians also are the medical staff of Barnes-Jewish and St. Louis Children's hospitals. The School of Medicine is one of the leading medical research, teaching and patient-care institutions in the nation, currently ranked sixth in the nation by U.S. News & World Report. Through its affiliations with Barnes-Jewish and St. Louis Children's hospitals, the School of Medicine is linked to BJC HealthCare.