Newswise — ORLANDO — Insufficient sleep, a common problem that has been linked to chronic disease risk, might also be an unrecognized risk factor for bone loss. Results of a new study will be presented Saturday at the Endocrine Society’s 99th annual meeting in Orlando, Fla.

The study investigators found that healthy men had reduced levels of a marker of bone formation in their blood after three weeks of cumulative sleep restriction and circadian disruption, similar to that seen in jet lag or shift work, while a biological marker of bone resorption, or breakdown, was unchanged.

“This altered bone balance creates a potential bone loss window that could lead to osteoporosis and bone fractures,” lead investigator Christine Swanson, M.D., an assistant professor at the University of Colorado in Aurora, Colo., said. Swanson completed the research while she was a fellow at Oregon Health & Science University in Portland, Ore., with Drs. Eric S. Orwoll and Steven A. Shea.

“If chronic sleep disturbance is identified as a new risk factor for osteoporosis, it could help explain why there is no clear cause for osteoporosis in the approximately 50 percent of the estimated 54 million Americans with low bone mass or osteoporosis,” Swanson said. Inadequate sleep is also prevalent, affecting more than 25 percent of the U.S. population occasionally and 10 percent frequently, the Centers for Disease Control and Prevention report.

The 10 men in this study were part of a larger study that some of Swanson’s co-authors conducted in 2012 at Brigham and Women’s Hospital in Boston, Mass. That study evaluated health consequences of sleep restriction combined with circadian disruption.
Swanson defined circadian disruption as “a mismatch between your internal body clock and the environment caused by living on a shorter or longer day than 24 hours.” Study subjects stayed in a lab, where for three weeks they went to sleep each day four hours later than the prior day, resulting in a 28-hour “day.” Swanson likened this change to “flying four time zones west every day for three weeks.” The men were allowed to sleep only 5.6 hours per 24-hour period, since short sleep is also common for night and shift workers. While awake, the men ate the same amounts of calories and nutrients throughout the study.

Blood samples were obtained at baseline and again after the three weeks of sleep manipulation for measurement of bone biomarkers. Six of the men were ages 20 to 27, and the other four were ages 55 to 65. Limited funding prevented the examination of serum from the women in this study initially, but the group plans to investigate sex differences in the sleep-bone relationship in subsequent studies.

After three weeks, all men had significantly reduced levels of a bone formation marker called P1NP compared with baseline, the researchers reported. This decline was greater for the younger men than the older men: a 27 percent versus 18 percent decrease. Swanson added that levels of the bone resorption marker CTX remained unchanged, an indication that old bone could break down without new bone being formed.

“These data suggest that sleep disruption may be most detrimental to bone metabolism earlier in life, when bone growth and accrual are crucial for long-term skeletal health,” she said. “Further studies are needed to confirm these findings and to explore if there are differences in women.”

This study received funding from the National Institute of Arthritis and Musculoskeletal and Skin Diseases, the National Institute on Aging and the Medical Research Foundation of Oregon.

# # #
Newswise — Genetic testing of tumor and blood fluid samples from people with and without one of the most aggressive forms of skin cancer has shown that two new blood tests can reliably detect previously unidentifiable forms of the disease.

Researchers at NYU Langone Medical Center and its Perlmutter Cancer Center, who led the study, say having quick and accurate monitoring tools for all types of metastatic melanoma, the medical term for the disease, may make it easier for physicians to detect early signs of cancer recurrence. The new blood tests, which take only 48 hours, were developed in conjunction with Bio-Rad Laboratories in Hercules, Calif. Currently, the tests are only available for research purposes.

The new tools are the first, say the study authors, to identify melanoma DNA in the blood of patients whose cancer is spreading and who lack defects in either the BRAF or NRAS genes, already known to drive cancer growth. Together, BRAF and NRAS mutations account for over half of the 50,000 cases of melanoma diagnosed each year in the United States, and each can be found by existing tests. But the research team estimates that when the new tests become available for use in clinics, the vast majority of all melanomas will be detectable.

“Our goal is to use these tests to make more informed treatment decisions and, specifically, to identify as early as possible when a treatment has stopped working, cancer growth has resumed, and the patient needs to switch therapy,” says senior study investigator and dermatologist David Polsky, MD, PhD. Polsky presents his team’s latest findings at the annual meeting of the American Association for Cancer Research on April 2 in Washington, D.C.

The new tests, says Polsky, the Alfred W. Kopf, MD, Professor of Dermatologic Oncology at NYU Langone and director of its pigmented lesion section in the Ronald O.
Perelman Department of Dermatology, monitor blood levels of DNA fragments, known as circulating tumor DNA (ctDNA), that are released into the blood when tumor cells die and break apart. Specifically, the test detects evidence of changes in the chemical building blocks (or mutations) of a gene that controls telomerase reverse transcriptase (TERT), a protein that helps cancer cells maintain the physical structure of their chromosomes.

Polsky says the detected changes occur in mutant building blocks, in which a cytidine molecule in the on-off switch for the TERT gene is replaced by another building block, called thymidine. Either mutation, C228T or C250T, results in the switch being stuck in the “on” position, helping tumor cells to multiply.

According to Polsky, the blood tests may have advantages over current methods for monitoring the disease because the tests avoid the radiation exposure that comes with CT scans, and the tests can be performed more easily and more often. The Bio-Rad tests, once clinically validated, are also likely to gain widespread use quickly, he says, because his previous research had shown that similar blood tests for BRAF and NRAS mutations worked better in identifying new tumor growth than existing blood tests for the protein lactate dehydrogenase. Lactate dehydrogenase levels may spike during aggressive tumor growth, but can also rise as a result of other diseases and biological functions.

As part of the ongoing study, researchers checked results from the new tests against 10 tumor samples taken from NYU Langone patients with and without metastatic melanoma. They also tested four blood plasma samples (the liquid portion of blood) from NYU Langone patients with and without the disease. Blood test results matched correctly in all cases known to be either positive or negative for metastatic melanoma.
Successful detection occurred, they say, for samples with as little as 1 percent of mutated ctDNA in a typical blood plasma sample of 5 milliliters. Meanwhile, TERT mutations were absent in tests of normal blood plasma and tonsil tissue.

Polsky says further study of the new blood tests is planned to gauge their use in monitoring progression of the aggressive cancer, and to more quickly determine when switching to an alternative therapy is warranted, as well as whether the tests can be used to detect other types of cancer, such as brain tumors, that also have TERT mutations.

Funding support for the study was provided by National Cancer Institute grant R21 CA198495, with in-kind support from Bio-Rad, which provided chemical supplies. Besides Polsky, other NYU Langone/Perlmutter researchers involved in the study were lead study investigators Broderick Corless, BS; and Gregory Chang, MBA; and study co-investigators Mahrukh Sayeda, MS; and Iman Osman, MD. Additional research support was provided by study co-investigators Samantha Cooper, PhD; and George Karlin-Neumann, PhD, at Bio-Rad Laboratories.

Media Inquiries: David March, 212-404-3528, david.march@nyumc.org
Newswise — ORLANDO — Mothers who binge drink before they become pregnant may be more likely to have children with high blood sugar and other changes in glucose function that increase their risk of developing diabetes as adults, according to a new study conducted in rats. The results will be presented Sunday at the Endocrine Society’s 99th annual meeting in Orlando, Fla.

“The effects of alcohol use during pregnancy on an unborn child are well known, including possible birth defects and learning and behavior problems. However, it is not known whether a mother’s alcohol use before conception also could have negative effects on her child’s health and disease susceptibility during adulthood,” said principal investigator Dipak Sarkar, Ph.D., DPhil, a distinguished professor at Rutgers University in New Brunswick, N.J., and director of its endocrine research program.

Binge drinking is common in the United States. Among alcohol users 18 to 44 years old, 15 percent of nonpregnant women and 1.4 percent of pregnant women report that they binge drank in the past month, according to a 2012 phone survey from the U.S. Centers for Disease Control and Prevention (CDC). For women, binge drinking is the equivalent of four or more drinks in about two hours.

To assess the effects of preconception alcohol use, Sarkar, with doctoral candidate Ali Al-Yasari, MS, and their colleagues, conducted a study, funded by the National Institutes of Health, in rats, whose basic processes of glucose function are similar to those in humans, Sarkar said. For four weeks, they gave female rats a diet containing 6.7 percent alcohol, which raised their blood alcohol levels to those of binge drinking in humans. Alcohol was then removed from the rats’ diet, and they were bred three weeks later, equal to several months in humans. Adult offspring of these rats were compared with control offspring: the offspring of rats that did not receive alcohol before conception.
(One control group received regular rat chow and water, and the other received a nonalcoholic liquid diet equal in calories to the alcohol feedings.)

After the rats’ offspring reached adulthood, the researchers used standard laboratory techniques to monitor their levels of blood glucose and insulin and two other important hormones, glucagon and leptin. Glucagon stimulates the liver to convert glycogen (stored glucose) into glucose to move to the blood, making blood glucose levels higher. Although the main function of leptin is inhibiting appetite, it also reduces glucose-stimulated insulin production by the pancreas.

The research team found that, compared with both groups of control offspring, the offspring of rats exposed to alcohol before conception had several signs of abnormal glucose homeostasis (function). Altered glucose homeostasis included increased blood glucose levels; decreased insulin levels in the blood and pancreatic tissue; glucagon levels that were reduced in the blood but increased in pancreatic tissue; and raised blood levels of leptin.

Additionally, the researchers said they found evidence that preconception alcohol exposure increased the expression of some inflammatory markers in pancreatic tissue. Al-Yasari said this might lower insulin production and its action on the liver, increasing blood glucose levels. The overexpression of inflammatory markers may be how pre-pregnancy alcohol use altered normal glucose homeostasis in the offspring, he stated.

“These findings suggest that [the effects of] a mother’s alcohol misuse before conception may be passed on to her offspring,” Al-Yasari said. “These changes could have lifelong effects on the offspring’s glucose homeostasis and possibly increase their susceptibility to diabetes.”

# # #
Newswise — NEW YORK — Knee replacement surgery for patients with osteoarthritis, as currently used, provides minimal improvements in quality of life and is economically unattractive, according to a study led by Mount Sinai researchers and published today in The BMJ. However, if the procedure were offered only to patients with more severe symptoms, its effectiveness would rise, and its use would become economically more attractive as well, the researchers said.

“Given its limited effectiveness in individuals with less severely affected physical function, performance of total knee replacement in these patients seems to be economically unjustifiable,” said Bart Ferket, MD, PhD, Assistant Professor, Department of Population Health Science and Policy at the Icahn School of Medicine at Mount Sinai and lead author on the study. “Considerable cost savings could be made by limiting eligibility to patients with more symptomatic knee osteoarthritis. Our findings emphasize the need for more research comparing total knee replacement with less expensive, more conservative interventions, particularly in patients with less severe symptoms.”

About 12 percent of adults in the United States are affected by osteoarthritis of the knee. The annual rate of total knee replacement has doubled since 2000, mainly due to expanding eligibility to patients with less severe physical symptoms. The number of procedures performed each year now exceeds 640,000, at a total annual cost of about $10.2 billion, yet health benefits are higher in those with more severe symptoms before surgery.

A team of researchers from the Icahn School of Medicine at Mount Sinai and Erasmus University Medical Center in Rotterdam, the Netherlands, set out to evaluate the impact of total knee replacement on quality of life in people with knee osteoarthritis.
They also wanted to estimate differences in lifetime costs and quality-adjusted life years, or QALYs (a measure of years lived and health during those years), according to level of symptoms.

They analyzed data from two U.S. cohort studies: one with 4,498 participants aged 45-79 with or at high risk for knee osteoarthritis from the Osteoarthritis Initiative (OAI), and the other involving 2,907 patients from the Multicenter Osteoarthritis Study (MOST). OAI participants were followed up for nine years and MOST patients for two years. Quality of life was measured using a recognized score of physical and mental function, known as SF-12, and using several osteoarthritis-specific quality of life scores.

They found that quality of life outcomes generally improved after knee replacement surgery, although the effect was small. Improvements were greater when patients with lower physical scores before surgery were operated on. In a cost-effectiveness analysis, current practice was more expensive and in some cases seemed even less effective than scenarios in which total knee replacement was performed only in patients with lower physical function.

“Our findings show opportunity for optimizing delivery of total knee replacement in a cost-effective way: finding the patients who will benefit the most, delivering the treatment at the correct point in their disease progression, and optimizing the cost so we can deliver the benefit to all who need it,” said Madhu Mazumdar, PhD, Director of the Institute for Healthcare Delivery Science at the Mount Sinai Health System, Professor of Biostatistics, Department of Population Health Science and Policy at the Icahn School of Medicine at Mount Sinai, and co-author of the study.

Funding for the cohort studies used in the analysis was provided by the National Institutes of Health, Merck Research Laboratories, Novartis Pharmaceuticals Corporation, GlaxoSmithKline, and Pfizer. Dr.
Ferket is supported by the American Heart Association. The researchers have no competing interests to disclose.

About the Mount Sinai Health System

The Mount Sinai Health System is an integrated health system committed to providing distinguished care, conducting transformative research, and advancing biomedical education. Structured around seven hospital campuses and a single medical school, the Health System has an extensive ambulatory network and a range of inpatient and outpatient services—from community-based facilities to tertiary and quaternary care. The System includes approximately 7,100 primary and specialty care physicians; 12 joint-venture ambulatory surgery centers; more than 140 ambulatory practices throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and 31 affiliated community health centers. Physicians are affiliated with the renowned Icahn School of Medicine at Mount Sinai, which is ranked among the highest in the nation in National Institutes of Health funding per investigator. The Mount Sinai Hospital is on the “Honor Roll” of best hospitals in America, ranked No. 15 nationally in the 2016-2017 “Best Hospitals” issue of U.S. News & World Report. The Mount Sinai Hospital is also ranked as one of the nation’s top 20 hospitals in Geriatrics, Gastroenterology/GI Surgery, Cardiology/Heart Surgery, Diabetes/Endocrinology, Nephrology, Neurology/Neurosurgery, and Ear, Nose & Throat, and is in the top 50 in four other specialties. New York Eye and Ear Infirmary of Mount Sinai is ranked No. 10 nationally for Ophthalmology, while Mount Sinai Beth Israel, Mount Sinai St. Luke's, and Mount Sinai West are ranked regionally. Mount Sinai’s Kravis Children’s Hospital is ranked in seven out of ten pediatric specialties by U.S. News & World Report in "Best Children's Hospitals." For more information, visit http://www.mountsinai.org/, or find Mount Sinai on Facebook, Twitter and YouTube.

# # #
Newswise — CLEVELAND — Bill Kochevar grabbed a mug of water, drew it to his lips and drank through the straw. His motions were slow and deliberate, but then Kochevar hadn’t moved his right arm or hand for eight years. And it took some practice to reach and grasp just by thinking about it.

Kochevar, who was paralyzed below his shoulders in a bicycling accident, is believed to be the first person with quadriplegia in the world to have arm and hand movements restored with the help of two temporarily implanted technologies. A brain-computer interface with recording electrodes under his skull, and a functional electrical stimulation (FES) system* activating his arm and hand, reconnect his brain to paralyzed muscles.

Holding a makeshift handle pierced through a dry sponge, Kochevar scratched the side of his nose with the sponge. He scooped forkfuls of mashed potatoes from a bowl—perhaps his top goal—and savored each mouthful. “For somebody who’s been injured eight years and couldn’t move, being able to move just that little bit is awesome to me,” said Kochevar, 56, of Cleveland. “It’s better than I thought it would be.” A video of Kochevar can be found at: https://youtu.be/OHsFkqSM7-A

Kochevar is the focal point of research led by Case Western Reserve University, the Cleveland Functional Electrical Stimulation (FES) Center at the Louis Stokes Cleveland VA Medical Center and University Hospitals Cleveland Medical Center (UH). A study of the work will be published in The Lancet March 28 at 6:30 p.m. U.S. Eastern time.

“He’s really breaking ground for the spinal cord injury community,” said Bob Kirsch, chair of Case Western Reserve’s Department of Biomedical Engineering, executive director of the FES Center and principal investigator (PI) and senior author of the research.
“This is a major step toward restoring some independence.” When asked, people with quadriplegia say their first priority is to scratch an itch, feed themselves or perform other simple functions with their arm and hand, instead of relying on caregivers.

“By taking the brain signals generated when Bill attempts to move, and using them to control the stimulation of his arm and hand, he was able to perform personal functions that were important to him,” said Bolu Ajiboye, assistant professor of biomedical engineering and lead study author.

Technology and training

The research with Kochevar is part of the ongoing BrainGate2* pilot clinical trial being conducted by a consortium of academic and VA institutions assessing the safety and feasibility of the implanted brain-computer interface (BCI) system in people with paralysis. Other investigational BrainGate research has shown that people with paralysis can control a cursor on a computer screen or a robotic arm (www.braingate.org).

“Every day, most of us take for granted that when we will to move, we can move any part of our body with precision and control in multiple directions, and those with traumatic spinal cord injury or any other form of paralysis cannot,” said Benjamin Walter, associate professor of Neurology at Case Western Reserve School of Medicine, Clinical PI of the Cleveland BrainGate2 trial and medical director of the Deep Brain Stimulation Program at UH Cleveland Medical Center.

“The ultimate hope of any of these individuals is to restore this function,” Walter said.
“By restoring the communication of the will to move from the brain directly to the body, this work will hopefully begin to restore the hope of millions of paralyzed individuals that someday they will be able to move freely again.”

Jonathan Miller, assistant professor of neurosurgery at Case Western Reserve School of Medicine and director of the Functional and Restorative Neurosurgery Center at UH, led a team of surgeons who implanted two 96-channel electrode arrays—each about the size of a baby aspirin—in Kochevar’s motor cortex, on the surface of the brain. The arrays record brain signals created when Kochevar imagines movement of his own arm and hand. The brain-computer interface extracts information from the brain signals about what movements he intends to make, then passes the information to command the electrical stimulation system.

To prepare him to use his arm again, Kochevar first learned how to use his brain signals to move a virtual-reality arm on a computer screen. “He was able to do it within a few minutes,” Kirsch said. “The code was still in his brain.” As Kochevar’s ability to move the virtual arm improved through four months of training, the researchers believed he would be capable of controlling his own arm and hand.

Miller then led a team that implanted the FES system’s 36 electrodes that animate muscles in the upper and lower arm. The BCI decodes the recorded brain signals into the intended movement command, which is then converted by the FES system into patterns of electrical pulses. The pulses sent through the FES electrodes trigger the muscles controlling Kochevar’s hand, wrist, arm, elbow and shoulder. To overcome gravity that would otherwise prevent him from raising his arm and reaching, Kochevar uses a mobile arm support, which is also under his brain’s control.

New Capabilities

Eight years of muscle atrophy required rehabilitation. The researchers exercised Kochevar’s arm and hand with cyclical electrical stimulation patterns.
Over 45 weeks, his strength, range of motion and endurance improved. As he practiced movements, the researchers adjusted stimulation patterns to further his abilities. Kochevar can make each joint in his right arm move individually. Or, just by thinking about a task such as feeding himself or getting a drink, the muscles are activated in a coordinated fashion. When asked to describe how he commanded the arm movements, Kochevar told investigators, “I’m making it move without having to really concentrate hard at it…I just think ‘out’…and it goes.”

Kochevar is fitted with temporarily implanted FES technology that has a track record of reliable use in people. The BCI and FES system together represent an early feasibility demonstration that gives the research team insights into the potential future benefit of the combined system.

Advances needed to make the combined technology usable outside of a lab are not far from reality, the researchers say. Work is underway to make the brain implant wireless, and the investigators are improving decoding and stimulation patterns needed to make movements more precise. Fully implantable FES systems have already been developed and are also being tested in separate clinical research.

Kochevar welcomes new technology—even if it requires more surgery—that will enable him to move better. “This won’t replace caregivers,” he said. “But, in the long term, people will be able, in a limited way, to do more for themselves.”

The investigational BrainGate technology was initially developed in the Brown University laboratory of John Donoghue, now the founding director of the Wyss Center for Bio and Neuroengineering in Geneva, Switzerland. The implanted recording electrodes are known as the Utah array, originally designed by Richard Normann, Emeritus Distinguished Professor of Bioengineering at the University of Utah.

The report in today’s Lancet is the result of a long-running collaboration between Kirsch, Ajiboye and the multi-institutional BrainGate consortium.
Leigh Hochberg, MD, PhD, a neurologist and neuroengineer at Massachusetts General Hospital, Brown University and the VA RR&D Center for Neurorestoration and Neurotechnology in Providence, Rhode Island, directs the pilot clinical trial of the BrainGate system and is a study co-author. “It’s been so inspiring to watch Mr. Kochevar move his own arm and hand just by thinking about it,” Hochberg said. “As an extraordinary participant in this research, he’s teaching us how to design a new generation of neurotechnologies that we all hope will one day restore mobility and independence for people with paralysis.”

Other researchers involved with the study include: Francis R. Willett, Daniel Young, William Memberg, Brian Murphy, PhD, and P. Hunter Peckham, PhD, from Case Western Reserve; Jennifer Sweet, MD, from UH; Harry Hoyen, MD, and Michael Keith, MD, from MetroHealth Medical Center and CWRU School of Medicine; and John Simeral, PhD, from Brown University and Providence VA Medical Center.

*CAUTION: Investigational Device. Limited by Federal Law to Investigational Use.

Media contacts:
Bill Lubinger, Case Western Reserve University: 216-368-4443; william.lubinger@case.edu
George Stamatis, University Hospitals: 216-346-9323; george.stamatis@uhhospitals.org
Mary Buckett, Cleveland FES Center: 216-231-3257 or 440-667-5367; mbuckett@FEScenter.org
Newswise — BIRMINGHAM, Ala. – A new study from the University of Alabama at Birmingham suggests that women at risk of preterm delivery, from as early as 23 weeks of pregnancy, should receive corticosteroids due to strong associations with a lower rate of death and serious illness for their babies. The study, published in the British Medical Journal, says that very premature babies seem to benefit the most from the steroids, even those born at 23 weeks of gestation.

“Indeed, the benefits of antenatal corticosteroids were substantially larger for infants born at the lowest gestations, including less than 28-week infants, for which data from randomized controlled trials are most limited,” said Wally Carlo, senior investigator of the study and director of the UAB Division of Neonatology.

Compared to babies born at term, premature babies carry a greater risk of death or serious complications after birth, with problems tending to be more severe the earlier a baby is born. Infants exposed to antenatal corticosteroids had lower mortality and lower rates of brain bleeding.

Colm Travers, M.D., third-year fellow in the UAB Division of Neonatology, and a team of researchers analyzed data for 117,941 infants born between 23 and 34 weeks of gestation from 2009 to 2013 at 300 neonatal intensive care units across the United States. Death or major illness was analyzed by gestational age and exposure to antenatal corticosteroids, adjusting for factors such as birth weight, sex, mode of delivery and multiple births. The researchers found that exposure to antenatal corticosteroids was associated with a significantly lower rate of death before discharge from hospital at each gestation compared with infants without exposure.
They also found that the number of infants needed to treat with antenatal corticosteroids to prevent one death before discharge increased from six at 23 and 24 weeks of gestation to 798 at 34 weeks of gestation, suggesting that infants born at the lowest gestational ages benefit most, even those born at 23 weeks. The rate of survival without major illness while in hospital was also higher among infants exposed to antenatal corticosteroids at the lowest gestations.

“Among infants born from 23 to 34 weeks’ gestation, antenatal exposure to corticosteroids compared with no exposure was associated with lower mortality and morbidity at most gestations,” said Travers, principal investigator of the study. “This study highlights for the first time that infants at the lowest gestations seem to benefit the most from exposure to antenatal corticosteroids.”

The authors point out that this is an observational study, so no firm conclusions can be drawn about cause and effect, and they outline some limitations that could have introduced bias. Current guidelines recommend giving corticosteroids to at-risk women from 23 to 34 weeks of pregnancy. However, the benefits for lower mortality and morbidity for infants born at less than 28 weeks had been less clear. “This study supports the administration of antenatal corticosteroids in women with threatened preterm labor from 23 to 34 weeks’ gestation, particularly at the earliest gestations in this range,” Carlo said.

About UAB

Known for its innovative and interdisciplinary approach to education at both the graduate and undergraduate levels, the University of Alabama at Birmingham is the state of Alabama’s largest employer and an internationally renowned research university and academic medical center; its professional schools and specialty patient-care programs are consistently ranked among the nation’s top 50.
UAB’s Center for Clinical and Translational Science is advancing innovative discoveries for better health as a two-time recipient of the prestigious Clinical and Translational Science Award. Find more information at www.uab.edu and www.uabmedicine.org.

EDITOR’S NOTE: The University of Alabama at Birmingham is a separate, independent institution from the University of Alabama, which is located in Tuscaloosa. Please use University of Alabama at Birmingham on first reference and UAB on all subsequent references.

VIDEO: www.youtube.com/uabnews
TEXT: www.uab.edu/news
TWEETS: www.twitter.com/uabnews
Newswise — More than 35 million Americans are trying to quit smoking. Smoking cigarettes causes 480,000 premature deaths each year, due mainly to a two-fold increased risk of cardiovascular disease and a 20-fold increased risk of lung cancer.

In a commentary published in the current issue of the American Journal of Medicine, researchers from the Charles E. Schmidt College of Medicine at Florida Atlantic University reassure clinicians and their patients that varenicline, whose brand name is Chantix, is a safe and effective way to achieve smoking cessation, and that failure to use this drug has caused preventable heart attacks and deaths from cardiovascular disease.

In 2006, varenicline was approved as a safe and effective means to quit smoking and achieved permanent quit rates of approximately 25 percent. In 2009, however, varenicline received a black box warning from the FDA based on adverse event reports of neuropsychiatric symptoms like depression and thoughts of suicide. There were plausible alternative explanations, including that nearly half of the subjects had psychiatric histories, 42 percent were taking psychotropic drugs and about 42 percent were suffering from depression. Nonetheless, since then, there has been a 76 percent decline in the number of prescriptions dispensed, from a peak of about 2 million in the last quarter of 2007 to about 531,000 in the first quarter of 2014.

In their commentary, the FAU researchers emphasize that, until recently, the totality of randomized evidence on varenicline had been restricted to eight small trials, which did not demonstrate a hazard. The researchers caution that the reliable detection of small to moderate risks and benefits of drug therapies requires cogent data from large-scale randomized trials designed a priori to test the hypothesis. Such a large randomized trial was recently completed that included both apparently healthy individuals as well as those with severe mental illness.
The trial was conducted for 12 weeks on about 8,000 long-term smokers and included equal subgroups of those with and without psychiatric disorders. In subjects with psychiatric disorders, varenicline did not increase neuropsychiatric symptoms, and in subjects without psychiatric disorders there were likewise no increases in these symptoms. Both groups of participants assigned at random to varenicline achieved significantly higher abstinence rates at 12 weeks than those assigned to placebo, nicotine patch or bupropion. Just a few months ago, the FDA removed the black box warning from varenicline. The commentary was coauthored by Dianna Gaballa, a fourth-year medical student; Joanna Drowos, D.O., M.P.H., an associate professor of integrated medical science and associate chair of the Department of Integrated Medical Science; and Charles H. Hennekens, M.D., Dr.P.H., the first Sir Richard Doll Professor and senior academic advisor to the dean, all in FAU’s Charles E. Schmidt College of Medicine. “The existing totality of evidence suggests an urgent need to increase the use of varenicline in the general population as well as in those with serious mental illness who on average die about 20 years earlier than the general population, in part, because their smoking rates may be as high as 75 percent,” said Hennekens. Quitting smoking significantly reduces risks of cardiovascular disease beginning within a matter of months, with risk approaching that of non-smokers within a few years, even among older adults. For lung and other cancers, however, reductions do not even begin to emerge for years after quitting, and even after 10 years, quitters achieve death rates only about midway between those of continuing smokers and non-smokers. “For reducing risks of cardiovascular disease it’s never too late to quit, but to reduce risks of cancer, it’s never too early,” said Hennekens. 
The authors speculate that if use of varenicline had not plummeted by 76 percent following the black box warning in 2009, perhaps 17,000 premature deaths from cardiovascular disease could have been avoided each year during the last few years. Public health efforts and effective cessation treatments, including behavioral counseling and medication, have resulted in a 14 percent decrease in smoking in the U.S., while rates are markedly increasing in developing countries. According to the U.S. Centers for Disease Control and Prevention, heart disease is the leading killer among men and women, causing approximately 600,000 deaths each year. Hennekens’s numerous honors and recognitions include the Ochsner Award for reducing premature deaths from cigarettes in 2014. From 1995 to 2005, Science Watch ranked him as the third most widely cited medical researcher in the world, and five of the top 20 were his former trainees and/or fellows. In 2012, Science Heroes ranked Hennekens No. 81 in the history of the world for having saved more than 1.1 million lives. In 2016, he was ranked the No. 14 “Top Scientist in the World” with an H-index of 173. -FAU- About the Charles E. Schmidt College of Medicine: FAU’s Charles E. Schmidt College of Medicine is one of 145 accredited medical schools in the U.S. The college was launched in 2010, when the Florida Board of Governors made a landmark decision authorizing FAU to award the M.D. degree. After receiving approval from the Florida legislature and the governor, it became the 134th allopathic medical school in North America. With more than 70 full and part-time faculty and more than 1,300 affiliate faculty, the college matriculates 64 medical students each year and has been nationally recognized for its innovative curriculum. 
To further FAU’s commitment to increase much needed medical residency positions in Palm Beach County and to ensure that the region will continue to have an adequate and well-trained physician workforce, the FAU Charles E. Schmidt College of Medicine Consortium for Graduate Medical Education (GME) was formed in fall 2011 with five leading hospitals in Palm Beach County. In June 2014, FAU’s College of Medicine welcomed its inaugural class of 36 residents in its first University-sponsored residency in internal medicine. About Florida Atlantic University: Florida Atlantic University, established in 1961, officially opened its doors in 1964 as the fifth public university in Florida. Today, the University, with an annual economic impact of $6.3 billion, serves more than 30,000 undergraduate and graduate students at sites throughout its six-county service region in southeast Florida. FAU’s world-class teaching and research faculty serves students through 10 colleges: the Dorothy F. Schmidt College of Arts and Letters, the College of Business, the College for Design and Social Inquiry, the College of Education, the College of Engineering and Computer Science, the Graduate College, the Harriet L. Wilkes Honors College, the Charles E. Schmidt College of Medicine, the Christine E. Lynn College of Nursing and the Charles E. Schmidt College of Science. FAU is ranked as a High Research Activity institution by the Carnegie Foundation for the Advancement of Teaching. The University is placing special focus on the rapid development of critical areas that form the basis of its strategic plan: healthy aging, biotech, coastal and marine issues, neuroscience, regenerative medicine, informatics, lifespan and the environment. These areas provide opportunities for faculty and students to build upon FAU’s existing strengths in research and scholarship. For more information, visit www.fau.edu.
Newswise — Four to 12 percent of people undergoing cataract surgery to replace a cloudy lens with a clear artificial one develop posterior capsule opacification (PCO). Also referred to as secondary cataract, PCO is the most common vision-compromising complication of cataract surgery. In a new study, scientists find that the growth factor TGF-beta may play a role in the formation of secondary cataract, suggesting a direction for research into strategies to prevent it. The study appears in Molecular Biology of the Cell and was funded by the National Eye Institute, part of the National Institutes of Health. “Cataract surgery is the most common ocular surgery performed. Preventing posterior capsule opacification would spare thousands of people from needing laser capsulotomy, a procedure with potential risks that costs the U.S. more than $150 million annually,” said Houmam Araj, Ph.D., lens and cataract program director at NEI. “In addition to shedding light on how secondary cataracts form, these findings present some exciting insights into the growth factor TGF-beta that have implications far beyond the eye,” he said. During cataract surgery, a hole is made in the front of the lens capsule, the outer layer that surrounds the lens. A surgeon inserts an ultrasonic probe that emulsifies the lens, which is then drawn out through the hole. A new, artificial lens is then placed in the lens capsule. Cataract surgery typically improves vision with long-lasting results. However, some patients develop secondary cataract in the following weeks and months. This happens when residual lens epithelial cells left behind from the old, emulsified lens accumulate at the back of the lens capsule and transform into one of two types of cells that cause problems: lens fiber cells and myofibroblasts. Lens fiber cells become bloated with proteins called crystallins, which causes them to scatter light as it passes through the lens. 
Myofibroblasts generate large amounts of molecules outside the cell, creating a matrix that causes the lens capsule to wrinkle, which obstructs vision. To better understand the factors that cause residual lens epithelial cells to differentiate into these two distinct cell types, the study’s lead investigator, Linda S. Musil, Ph.D., of Oregon Health & Science University, Portland, and her team cultured purified embryonic chick lens cells to determine the effects of various growth factors on cell differentiation. Previous studies suggest that the growth factor TGF-beta becomes highly active in response to surgery-induced injury. “Though it occurs on a small scale, cataract surgery creates a huge wound response,” she said, referring to the cascade of molecularly-driven processes that orchestrate tissue repair and prevent infections following an injury. Previous studies have shown that when active TGF-beta is overexpressed in the intact lens, myofibroblasts form. By contrast, little is known about the factors that promote the formation of lens fiber cells. Musil’s team found that adding active TGF-beta to their culture system induced the lens epithelial cells to differentiate into either myofibroblasts or lens fiber cells, while adding a TGF-beta inhibitor reduced differentiation. Other growth factors failed to induce similar cell differentiation. Both cell types were found in the same cultures, often right next to each other, as has been reported in people with secondary cataract. In addition, the presence of TGF-beta was associated with the migration of lens cells, a requirement for accumulation at the back of the lens capsule and secondary cataract. Next, they assessed whether TGF-beta had a direct effect on lens epithelial cell differentiation, or if it was indirectly enabled by molecular signaling downstream of TGF-beta activation. 
Using their lens culture system, they found that blocking the downstream p38 kinase pathway with an inhibitor prevented myofibroblast formation. Similarly, inhibiting the mTOR pathway prevented the formation of lens fiber cells. Musil said that future studies are needed to assess the contribution of other factors to cell fate. One possibility is cell density: within the cultures, lens fiber cells tended to congregate in densely packed, multilayered islands, while myofibroblasts formed a single-layer fringe surrounding those islands. From a basic science perspective, the findings shed new light on TGF-beta. In cancer, TGF-beta is known to first suppress and then later promote tumor growth as disease progresses, a phenomenon known as the TGF-beta paradox. This latest study suggests that TGF-beta plays yet another dual role in a very different type of disorder. “Our studies show that TGF-beta is capable of concurrently directing two disparate cell fates in non-transformed lens epithelial cells,” Musil said. In the interest of finding a treatment to prevent secondary cataract, the team used their lens culture system to test drugs for their ability to block PCO. They found that the multikinase inhibitor rebastinib, a drug currently in phase I clinical trials for cancer, prevented TGF-beta from inducing cell migration as well as the formation of myofibroblasts in both chick lens cells and rat lens explants. Work is in progress to find agents that inhibit both myofibroblast and lens fiber cell formation. Such agents could be included in artificial lenses to prevent the formation of secondary cataract, Musil said. The study was supported by grants from NEI (R01EY022113 to Linda Musil and R01EY017146 to Judith West-Mays). Reference: Boswell BA, Korol A, West-Mays JA, Musil LS. Dual function of TGF-beta in lens epithelial cell fate: implications for secondary cataract. Mol Biol Cell. Feb 16 2017. doi:10.1091/mbc.E16-12-0865. ###
Newswise — In a research effort that merged genetics, physics and information theory, a team at the schools of medicine and engineering at The Johns Hopkins University has added significantly to evidence that large regions of the human genome have built-in variability in reversible epigenetic modifications made to their DNA. In a report on the research published March 27 in Nature Genetics, the team says the findings also suggest that such epigenetic variability is a major factor in the ability of cancer cells to proliferate, adapt and metastasize. "These results suggest that biology is not as deterministic as many scientists think," says Andrew Feinberg, M.D., M.P.H., the King Fahd Professor of Medicine, Oncology, and Molecular Biology and Genetics at the Johns Hopkins University School of Medicine and director of the Center for Epigenetics in the Institute for Basic Biomedical Sciences. "If so, they could have major implications for how we treat cancer and other aging-related diseases." Epigenetic modifications, achieved along the genome by the chemical attachment of methyl molecules, or tags, to DNA, are reversible changes that alter which genes are turned on or off in a given cell without actually altering the DNA sequence of the cell. Such changes enable a complex organism, like a human, to have a wide range of different tissues that all still have the exact same genetic template. However, in some studies with laboratory mice, Feinberg had observed that these epigenetic tags varied considerably among the mice, even when comparing the same type of tissue in animals that had been living in the exact same conditions. "These weren't minor differences, and some very important genes were involved," Feinberg says. 
Feinberg, who is also a Bloomberg Distinguished Professor of Engineering and Public Health at The Johns Hopkins University, suspected that this variation might be an adaptive feature by which built-in epigenetic randomness would give some cells an advantage in rapidly changing environments. To find out if that was the case, he teamed up with John Goutsias, Ph.D., professor of electrical and computer engineering at the Johns Hopkins Whiting School of Engineering, to find a way to measure this controlled type of randomness, scientifically termed epigenetic stochasticity, by using the information-theoretic concept of Shannon entropy. Using a mathematical model known as the Ising model, invented to describe phase transitions in statistical physics, such as how a substance changes from liquid to gas, the Johns Hopkins researchers calculated the probability distribution of methylation along the genome in several different human cell types, including normal and cancerous colon, lung and liver cells, as well as brain, skin, blood and embryonic stem cells. As Goutsias explains, this distribution reflects the chance that a particular region of a genome will be methylated in a population of similar cells. In areas of low randomness, this probability would mostly be 0 or 100 percent, but in areas of high randomness, the numbers would be 50-50 or thereabouts. The analysis revealed that the human genome is organized into large pieces of low or high epigenetic stochasticity, and that these regions correspond to areas of chromosomes that are structurally different in the cell nucleus. Feinberg thinks that a main function of a cell's nucleus might be to partition the genome to make sure that regions of low or high stochasticity are well-defined. 
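The role of Shannon entropy here can be illustrated with a toy calculation. The sketch below is a simplified stand-in for the study's actual Ising-model analysis, not the researchers' code: it computes the binary Shannon entropy of a hypothetical per-region methylation probability, showing that entropy is near zero when methylation is almost always on or off, and maximal when the chance is 50-50.

```python
import math

def methylation_entropy(p):
    """Shannon entropy (in bits) of a binary methylation state.

    p is the hypothetical probability that a genomic region is
    methylated across a population of similar cells. Entropy is 0
    when p is 0 or 1 (deterministic) and maximal (1 bit) at p = 0.5.
    """
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Low-stochasticity region: methylation nearly always on.
print(methylation_entropy(0.99))  # close to 0 bits
# High-stochasticity region: a 50-50 chance of methylation.
print(methylation_entropy(0.5))   # 1.0 bit, maximum uncertainty
```

In this simplified picture, regions whose methylation probability sits near 0 or 100 percent score as low stochasticity, while regions near 50 percent score as maximally stochastic, matching the "0 or 100 percent" versus "50-50" distinction Goutsias describes.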
The other significant finding of the study, says Garrett Jenkinson, Ph.D., assistant research scientist at the Johns Hopkins Whiting School of Engineering who carried out much of the analyses, was that this variability goes haywire in cancer cells, which may display significant regional differences in methylation stochasticity compared to normal cells. Based on the evolutionary idea that targeted epigenetic stochasticity can improve adaptation, these observations could explain how cancer cells are good at evading chemotherapy treatments and spreading from one part of the body to another, he adds. "Researchers have understood the importance of epigenetics in driving cancer growth, but the focus has been trying to reverse epigenetic changes to specific genes," Feinberg says. "We need to readjust and think more broadly about the epigenetic process as a whole." Looking at ways to reverse aberrant changes in variability to make cancer cells more epigenetically controlled should be a target for therapy, he adds. Earlier this year, Feinberg led a study that considered this view of epigenetics in metastatic pancreatic cancer cells. Using an experimental drug called 6-aminonicotinamide, his group reversed the large-scale epigenetic changes that enabled the tumor cells in mice to metastasize, and slowed the growth of further tumors. In addition to Goutsias, Feinberg, and Jenkinson, Elisabet Pujadas, a graduate student in the Center for Epigenetics at Johns Hopkins University School of Medicine, contributed to this study. This study was supported by grants from the National Institutes of Health (R01CA054358, DP1ES022579, and AG021334) and National Science Foundation (CCF-1217213 and CCF-1656201).
Newswise — BETHESDA, MD – Cancer prevention advocates and researchers have designated March 22nd as National Lynch Syndrome Awareness Day. Carol A. Burke, MD, FACG, President of the American College of Gastroenterology (ACG) and a gastroenterologist specializing in hereditary colorectal cancer syndromes at the Cleveland Clinic in Cleveland, OH, offered commentary and guidance for GI physicians on Lynch Syndrome in an ACG communication to her colleagues. According to Dr. Burke, approximately 1 million Americans are estimated to be living with Lynch Syndrome, and it is believed that as many as 500,000 Lynch Syndrome carriers have no idea they are at risk of disease. Lynch Syndrome is the most common inherited colorectal cancer syndrome. It accounts for three to five percent of all cases of colorectal cancer and 10 to 15 percent of colorectal cancers diagnosed in patients younger than age 50. Lynch Syndrome is also the most common inherited cause of endometrial cancer, causing up to three percent of all endometrial cancers. Individuals with Lynch Syndrome have a substantially increased lifetime cumulative risk of colorectal cancer and of other Lynch Syndrome-associated cancers, including endometrial, ovarian, gastric, small bowel and urothelial tract cancers. “Physicians should be reminded on this day to redouble our efforts to create opportunities in our practice to make a proactive diagnosis of Lynch Syndrome by utilizing simple and directed family cancer history taking, ensuring that all colorectal cancers in our patients are tested for evidence of microsatellite instability, a hallmark of Lynch Syndrome, seeing that appropriate patients and families are offered genetic testing, and providing them with the education and management schema to prevent death from Lynch Syndrome-related cancers,” Dr. Burke advised. 
“Prevention of the occurrence and death from some of these cancers due to Lynch Syndrome is possible with recognition of the syndrome and preventive strategies, such as colonoscopy and prophylactic surgery,” she added. About Lynch Syndrome: Lynch Syndrome, named after Dr. Henry T. Lynch, who characterized the syndrome in 1966, is an autosomal dominant hereditary cancer syndrome. It is caused by a germline mutation in one of the DNA mismatch repair (MMR) or EPCAM genes. Impact of Lynch Syndrome on Families: Lynch Syndrome affects families. Once it is diagnosed by genetic testing in one relative, other at-risk family members should undergo genetic testing and be offered appropriate management to decrease their risk of cancer. Importance of Molecular and Genetic Testing: Data show that family history alone is not accurate enough to select which patients should be tested for Lynch Syndrome. Some studies have shown that up to 40 percent of patients with Lynch Syndrome do not meet clinical criteria for the syndrome. In her commentary, Dr. Burke emphasizes that molecular and genetic testing is required to make the diagnosis of Lynch Syndrome. Currently, the national recommendation is that all colorectal cancer tumors be tested for evidence of MMR deficiency to reveal Lynch Syndrome. This approach is called “universal testing” and is endorsed by many organizations and agencies. Many commercial laboratories offer genetic testing for Lynch Syndrome, and genetic counselors are invaluable in the process of risk assessment and genetic testing. The National Society of Genetic Counselors can provide information on genetic counselors in your area (www.nsgc.org). Dr. Burke offers a warning to her fellow gastroenterologists: “Be sure your Lynch Syndrome patients know that this is a ‘family business’ and provide them with recommendations to share with their family members, a lifesaving gesture!” Read Dr. 
Burke’s guidance for GI clinicians on Lynch Syndrome via the ACG Blog (acgblog.org/2017/03/20/acg-presidents-blog-lynch-syndrome-awareness-day). Access patient education on Lynch Syndrome from the ACG patient website (patients.gi.org/topics/lynch-syndrome). Access resources from Lynch Syndrome International on its website (lynchcancers.com) and Facebook page (facebook.com/LynchSyndromeInternational). About the American College of Gastroenterology: Founded in 1932, the American College of Gastroenterology (ACG) is an organization with an international membership of more than 14,000 individuals from 85 countries. The College's vision is to be the pre-eminent professional organization that champions the evolving needs of clinicians in the delivery of high-quality, evidence-based and compassionate health care to gastroenterology patients. The mission of the College is to advance world-class care for patients with gastrointestinal disorders through excellence, innovation and advocacy in the areas of scientific investigation, education, prevention and treatment. www.gi.org. For more information on colorectal cancer screening, gi.org/ColonCancer. Follow ACG on Twitter @AmCollegeGastro.