Follow our Breaking News blog for the most current news on neurologic diseases and research. We want your input! Leave your comments at the end of each article.
Wednesday, October 01, 2014
by Rebecca Hiscott
The National Institutes of Health (NIH) announced the first recipients of its $46 million investment in the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative on Tuesday. The funds will support a total of 58 projects from more than 100 researchers, all sharing the goal of developing new tools and technologies for a better understanding of neural circuit function and brain activity in healthy and diseased brains.
Ultimately, the NIH hopes these projects will lead to a better understanding of the way information travels through the brain, and will catalyze new treatment options and cures for neurologic illnesses such as Alzheimer’s disease, Parkinson’s disease, autism, epilepsy, and traumatic brain injury, NIH director Francis Collins, MD, PhD, said at a press conference in Washington, D.C., on Tuesday.
A focus of the initial years of the BRAIN Initiative is the development of next-generation tools for exploring how dynamic patterns of neural activity in the brain control thoughts, feelings and movements. Image courtesy of NIH.
The projects funded by the NIH’s initial investment are aimed at “classifying the myriad cell types in the brain, producing tools and techniques for analyzing brain cells and circuits, creating next-generation human brain imaging technology, developing methods for large-scale recordings of brain activity, and integrating experiments with theories and models to understand the functions of specific brain circuits,” Dr. Collins said.
These include a wearable positron emission tomography scanner for measuring brain activity during daily activities, a laser that could non-invasively turn clusters of brain cells on or off, a form of magnetic resonance imaging that would measure neuronal activity by detecting changes in calcium activity in the brain, and micro-electrodes that could measure neuronal activity and deliver small doses of a drug directly into the brain.
“Some of the projects here have the ability to transform how we study the brain, and new technologies and industries will likely be spawned. Possibly most importantly, the new insight we will ultimately gain from utilizing these new tools and technologies will enable researchers to develop new treatments, and even cures, for devastating disorders and diseases of the brain and nervous system,” Dr. Collins said.
Scientists funded by the NIH BRAIN Initiative will develop tools to simultaneously watch the unique firing patterns of many neurons in hopes of classifying them based on physical characteristics, such as size and shape, and functional characteristics, such as patterns of electrical activity. Image courtesy of Vincent Pieribone, Ph.D., John B. Pierce Laboratory, Inc.
Cori Bargmann, PhD, Torsten N. Wiesel Professor at the Rockefeller University and a member of the BRAIN Multi-Council Working Group, added that the initiative – which has been compared to the Human Genome Project in its ambition and scope – represents a crucial effort to understand the brain in a holistic manner and at the most fundamental levels.
“Instead of just studying the brain one disease at a time, the NIH will support a sustained effort to examine the whole system and the features that emerge from the brain as a whole,” she said. “There are projects here to explore deep parts of the brain that are effectively terra incognita, that are crucial for processes like action and emotion and motivation. At the same time, some of the highest parts of the brain involved in complex, fluid operations like working memory are going to be addressed in detail.”
President Obama launched the BRAIN Initiative in April 2013, and in addition to the NIH, the National Science Foundation, the US Food and Drug Administration, and the Defense Advanced Research Projects Agency pledged funds to the effort. The four agencies have contributed a combined $110 million for fiscal year 2014.
In addition, the Obama administration announced Tuesday that private institutions such as Google, General Electric, the Simons Foundation, and several universities will align $270 million of their neuroscience research dollars with the goals of the BRAIN Initiative.
An NIH report published in June of this year called for an even more ambitious investment: a 12-year, $4.5 billion program that would aim to meet the BRAIN Initiative’s goals by 2025. “At NIH, we agree that $4.5 billion is a realistic estimate of what will be required for this kind of a moonshot,” Dr. Collins said.
“We feel a little bit like Galileo looking at the skies for the first time with the telescope when we are looking at the brain with new tools,” Dr. Bargmann said of this first wave of investments. “Already, the emergence of tools for looking at circuits and networks is showing patterns of activity in the brain we never knew existed.”
To read more Neurology Today coverage of the BRAIN Initiative, browse our archives here: http://bit.ly/1rBEU8u. To see the full list of projects funded by the NIH BRAIN Initiative, visit their website here: http://1.usa.gov/1pEpgVV.
Tuesday, September 30, 2014
by Rebecca Hiscott
Adults who notice their memory slipping may be at an increased risk of developing dementia later in life, even if they show no clinical signs of cognitive impairment, according to a study published in the Sept. 24 online issue of Neurology.
Image via amenclinicsphotosac on Flickr
The longitudinal study followed 531 participants from the ongoing Biologically Resilient Adults in Neurological Studies (BRAiNS) cohort from the University of Kentucky’s Alzheimer’s Disease Center, which looks at initially cognitively normal adults over the age of 60 who have agreed to donate their brains for analysis after death.
The participants underwent annual neurocognitive assessments, and were asked to report whether they had noticed any significant memory problems since their last visit. Those who reported problems with memory but were not formally diagnosed with dementia or mild cognitive impairment (MCI) were classified as having “subjective memory complaints.” Approximately half of the cohort (243 participants) died during the study, and researchers examined their brains at autopsy for pathological signs of Alzheimer’s disease or dementia.
Of the 296 participants who reported subjective memory complaints over the course of the study (55.7 percent), 72 progressed to MCI and 42 progressed to dementia, while 127 died. Those who reported cognitive complaints had a nearly three-fold risk of being diagnosed with cognitive impairment or dementia later in life (p<0.0001), the researchers found. Nearly 69 percent of those diagnosed with MCI had reported cognitive complaints beforehand, while 79.6 percent of those who progressed to dementia had done the same.
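As a quick sanity check, the headline proportion can be reproduced from the counts the article reports (a minimal illustrative calculation, not the study’s actual analysis code):

```python
# Figures reported in the article (illustrative check only, not study code)
cohort_size = 531   # BRAiNS participants followed
complainers = 296   # participants who reported subjective memory complaints

share = 100 * complainers / cohort_size
print(f"{share:.1f}% reported subjective memory complaints")  # 55.7%
```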
Those who complained of memory problems were also more likely to have the hallmark plaques and tangles of Alzheimer’s disease in their brains upon autopsy, even when no formal diagnosis was made.
“Our study adds strong evidence to the idea that memory complaints are common among older adults and are sometimes indicators of future memory and thinking problems. Doctors should not minimize these complaints and should take them seriously,” study author Richard J. Kryscio, PhD, a professor of statistics and chair of the department of biostatistics at the University of Kentucky in Lexington, KY, said in a news release.
“What’s notable about our study is the time it took for this transition to dementia or clinical impairment to occur – about 12 years for dementia and nine years for clinical impairment – after the memory complaints began,” he added.
The average time between the first reported memory complaint and a clinical diagnosis of MCI was 9.2 years, while those who progressed directly to dementia did so roughly 6.1 years after first reporting a complaint. Women and people with hypertension were more likely to progress directly to dementia. In addition, women, smokers, and those with hypertension tended to progress to dementia more rapidly, whereas women taking hormone replacement therapy progressed at a slower pace.
Other known risk factors for Alzheimer’s disease and dementia also affected the time between reporting a memory complaint and progressing to dementia. Adults who carried the APOE4 (apolipoprotein E4) allele, which has been linked to Alzheimer’s disease, and those with a history of smoking progressed to dementia more quickly than adults without these risk factors. Carriers of the APOE4 allele had double the odds of progressing to cognitive impairment (p=0.036) later in life.
In cases where modifiable risk factors for Alzheimer’s disease are present, such as a history of smoking, “these findings suggest that there may be a window for intervention before a diagnosable problem shows up,” Dr. Kryscio said. But, he added, “unfortunately we do not yet have preventive therapies for Alzheimer’s disease and other illnesses that cause memory problems.”
For more coverage of dementia and Alzheimer’s disease research, browse our archives here: http://bit.ly/NT-Dementia.
Monday, September 29, 2014
by Susan Fitzgerald
A study using sophisticated imaging technology to detect neurochemical changes in the brains of newly infected HIV patients found that signs of brain inflammation were present very early on, but patients who started antiretroviral therapy showed no worsening of inflammatory markers in follow-up scans.
Scanning electron micrograph of HIV particles infecting a human H9 T cell, colorized in blue, turquoise, and yellow. Image via NIAID on Flickr.
The findings, published in the Sept. 26 online edition of Neurology, will likely lend support to the argument that HIV patients should start antiretroviral therapy sooner rather than later.
“Our findings suggest that inflammation and gliosis, the presumed substrates of later neurologic injury, initially manifest during the first year of HIV infection and worsen during early infection in the absence of treatment,” the researchers concluded. “Additionally, our findings suggest that the introduction of ART [antiretroviral therapy] can alter this trajectory such that markers of inflammatory changes are no longer increasing, possibly limiting the extent of neurological injury.”
The study was conducted using proton magnetic resonance spectroscopy (1H-MRS) to detect levels of several cerebral metabolites that are associated with brain inflammation and neuronal health — glutamate (Glu), N-acetylaspartate (NAA), myo-inositol (MI), and choline-containing metabolites (Cho). Each was measured as a ratio to creatine (Cr), a reference metabolite.
Fifty-three men who had been infected with HIV within the previous 12 months — on average, they were infected for 3.5 months — were given baseline brain scans and then followed longitudinally with a total of 154 repeat scans over a median time of six months. During that period, 23 of the 53 HIV patients began taking antiretroviral drugs, independent of the study.
The research team, which included researchers from Yale University, the University of California, San Francisco, and Indiana University, found that even within several months of HIV infection, the study participants showed increased signs of brain inflammation and other biochemical processes associated with neuronal injury. But for the 23 participants who began antiretroviral therapy, the signs of inflammation generally did not progress as they did for the untreated participants.
“Our observations consistently show longitudinal increases of the inflammatory brain metabolite markers Cho/Cr and MI/Cr that suggest worsening inflammation during early untreated HIV,” the researchers reported. “ART initiation during the first year of infection attenuated the increase of these inflammatory cerebral markers, but it did not appear to reverse them within the follow-up period.”
Serena Spudich, MD, an associate professor of neurology at Yale and principal investigator for the study, told Neurology Today that “what we saw was a flattening of the curve. The level of inflammation did not continue to increase. What this is implying is that if you start antiretroviral therapy earlier, you may be able to attenuate some of the processes underlying neurologic damage from HIV.”
The data in the current study were collected as part of a cohort study that recruited volunteers at UCSF from 2005 to 2011. Andrew Young, a fourth-year Yale medical student who was the lead author for the study, said the project was unusual because it involved newly diagnosed HIV patients with recent transmission who were then tracked over time.
Young said that while the study found that antiretroviral therapy did not lower the levels of inflammatory markers — suggesting “stabilization without reversibility” — it would take a longer-term study to determine whether continued ART may eventually reduce inflammation and the risk of developing HIV-associated neurocognitive disorders.
Look for the full story and discussion in the Oct. 16 issue of Neurology Today. For more coverage of HIV’s effect on the brain, browse our archives here: http://bit.ly/1xtNbQG.
Friday, September 26, 2014
by Tom Valeo
An off-the-shelf drug used to treat cancer also reverses the effects of a mutation that causes a type of muscular dystrophy, according to Swiss researchers reporting their results in the Aug. 20 online edition of Science Translational Medicine.
The mutation disrupts the production of dysferlin, a large repair protein that reseals routine damage in muscle cell membranes. Depending on the location of the mutation, the protein itself could remain functional, but the proteasome detects the aberrant amino acid and destroys the entire protein.
Dystrophin analysis in Duchenne and Becker muscular dystrophies. Immunofluorescence stain for dystrophin. The sections illustrate a normal subject (N), a patient with Duchenne muscular dystrophy (D) and one with Becker muscular dystrophy (B).
“Usually a missense mutation in one amino acid leaves a full-length protein, but we saw many dysferlinopathy patients with missense mutations who were left with no dysferlin at all, which implied that the cell’s quality control system was degrading the protein,” said lead author Michael Sinnreich, MD, PhD, a professor of neurology and head of the Neuromuscular Center at University Hospital in Basel, Switzerland. “Our idea was that if we could inhibit that degradation, we could salvage missense-mutated dysferlin, which would then still be functionally active. Maybe imperfect dysferlin would be good enough.”
After tracing the degradation process to the proteasome, the researchers decided to test bortezomib (Velcade), a proteasome inhibitor already approved by the US Food and Drug Administration (FDA) and its Swiss counterpart, Swissmedic, to treat multiple myeloma and mantle cell lymphoma. They applied it to myoblasts from dysferlinopathy patients with missense mutations, and then injured the plasma membranes with a laser. Left alone, the membranes did not reseal, but the muscle cells that received the proteasome inhibitor produced large amounts of dysferlin, enabling the membranes to reseal normally.
With the proteasome inhibitor already approved, the researchers were able to skip animal experiments and perform a proof-of-principle study on three muscular dystrophy patients who carry a dysferlin missense mutation. Within 36 hours after a single dose, a muscle biopsy revealed levels of dysferlin expression that reached 30-40 percent of normal.
The study opens the door to novel treatment possibilities, Dr. Sinnreich said.
“With a small molecule like a proteasome inhibitor we can access the entire musculature of the body and can get re-expression of a hitherto absent protein,” he said. “It’s amazingly simple when compared to treatment strategies such as viral-mediated gene transfer or stem cell therapy.”
Dysferlin is a very large protein, comprising 2,080 amino acids. The gene contains no mutational “hot spots,” according to Dr. Sinnreich, so a mutation can occur at any one of hundreds of sites. Treatment with proteasome inhibitors works only with certain mutations, however.
“If you salvage a protein with a missense mutation at an active site, the protein won't be functional,” Dr. Sinnreich said. Therefore, he and his colleagues plan to collect myoblast cultures from muscular dystrophy patients around the world to map the mutations responsive to treatment with proteasome inhibitors.
This technique may work with other genetic diseases as well, Dr. Sinnreich added. Based on in vitro or mouse data, additional disease candidates include muscular dystrophy due to lamin A/C or sarcoglycan mutations, glycogen storage disorders such as Pompe disease, Charcot-Marie-Tooth disease, cystic fibrosis, cystathionine beta synthase deficiency (homocystinuria), and familial cancers due to mutations in DNA mismatch repair proteins.
The treatment strategy faces a serious obstacle — proteasome inhibitors appear to cause, or at least promote, peripheral neuropathy. The researchers hope to counteract this side-effect by reducing dosing to once a week — multiple myeloma patients receive the drug twice a week — and by giving it subcutaneously instead of intravenously. Either route produces the same amount of dysferlin, but intravenous administration is associated with more peripheral neuropathy.
In addition, new proteasome inhibitors with fewer side-effects have been developed. The researchers showed that the FDA-approved proteasome inhibitor carfilzomib, from Onyx (recently acquired by Amgen), and ixazomib, which is being developed for oral application by Takeda, were able to reseal membranes of patient-derived muscle cells in vitro. Both these proteasome inhibitors are reported to have a better side-effect profile than bortezomib.
Look for the full discussion in the Oct. 2 issue of Neurology Today. For more coverage of muscular dystrophy research, browse our archives here: http://bit.ly/NT-MuscularDystrophy.
Thursday, September 25, 2014
by Rebecca Hiscott
Adults who follow the DASH (Dietary Approaches to Stop Hypertension) or Mediterranean diet – both of which emphasize the consumption of fruits, vegetables, whole grains, fish and legumes, and limit foods high in saturated fat and sugar – may have slower cognitive decline as they age, a study published online Sept. 17 in Neurology found.
Since both the DASH and Mediterranean diets have been shown in past research to protect against hypertension, obesity, cardiovascular disease, and diabetes – conditions that have been linked to increased risk of dementia and cognitive decline – researchers from the Rush University Medical Center in Chicago, IL, looked directly at the link between diet and cognitive function in aging adults.
The Mediterranean diet emphasizes olive oil as the primary source of fat, as well as high consumption of fish and moderate consumption of wine with meals, the study authors noted. The DASH diet includes a higher consumption of dairy products. Image via MonaLMtz on Flickr.
The study cohort consisted of 826 participants from the Rush Memory and Aging Project, an ongoing study of neurological conditions in aging adults. Participants completed a 144-item food frequency questionnaire at baseline, and their diets were scored for adherence to each diet’s principles, on a scale of 0-10 for the DASH diet and 0-55 for the Mediterranean diet.
Participants also received annual cognitive assessments over an average of 4.1 years; these consisted of 19 cognitive tests evaluating episodic memory, semantic memory, working memory, perceptual speed, and visuospatial ability.
Although cognition did decline over time in these participants, as expected, the researchers found that each 1-point increase in DASH diet score was associated with a slower rate of global cognitive decline of 0.007 standardized units (p=0.03), and each 1-unit increase in Mediterranean diet score with a slower rate of 0.002 standardized units (p=0.01).
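To put those coefficients in concrete terms, the following back-of-the-envelope sketch projects the cumulative difference over the study’s average follow-up. It assumes the coefficients describe annual rates of decline, as is typical for such longitudinal models; the 3-point score difference is a hypothetical example, not a figure from the study:

```python
# Back-of-the-envelope interpretation of the reported coefficients
# (assumes they describe annual rates of decline; not the study's model)
dash_coef = 0.007     # standardized units/year slower decline per DASH point
followup_years = 4.1  # average follow-up in the study

score_diff = 3        # hypothetical: a participant scoring 3 DASH points higher
slower_per_year = score_diff * dash_coef        # 0.021 units/year
cumulative = slower_per_year * followup_years   # ~0.086 units less decline
print(f"{slower_per_year:.3f} units/yr slower; "
      f"{cumulative:.3f} units less decline over follow-up")
```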
For both diets, the slower decline in global cognition was driven chiefly by slower decline in episodic memory, which stores autobiographical events such as times, dates, and places, and semantic memory, which stores facts, meanings, and general knowledge.
“These findings were not wholly unexpected because many of the food components that constitute the selected diet scores are those we have previously observed to be related to cognitive change,” the authors wrote. These food components include whole grains, vegetables, nuts, and legumes, which are staples of both the DASH and Mediterranean diets and which have been shown in past studies to reduce cognitive decline, as well as cardiovascular events, stroke, and heart disease.
The authors noted that their findings were based on an observational study, and warned that the association between diet and cognitive function could not yet be interpreted as a cause-effect relationship. The current study, they wrote, “attest[s] to the need for more research to determine whether a food-based approach can attenuate neurocognitive decline in older populations at risk of Alzheimer disease or vascular cognitive impairment.”
For more coverage of the link between diet and cognitive decline, browse our archives here: http://bit.ly/ZN2ENI.