About this blog
Is modern living leading to a ‘hidden epidemic’ of neurological disease?
Modern living could be responsible for an ‘almost epidemic’ increase in neurological brain disease, according to new research from Bournemouth University.
Published in the US journal Surgical Neurology International, the study compared 21 Western countries between 1989 and 2010 and found that dementias in adults are starting a decade earlier than they used to.
Furthermore, deaths caused by neurological disease have risen significantly in adults aged 55-74, and for adults aged 75 and over the rate has virtually doubled in every Western country in just the last 20 years.
In the US, the problem is particularly acute: neurological deaths in men over 75 have nearly trebled, and in women of the same age the rate has risen more than five-fold.
For the first time since records began, more elderly US women died of brain disease than cancer.
Professor Colin Pritchard of Bournemouth University, who led the study, said: “The rate of increase in such a short time suggests a silent or even a ‘hidden’ epidemic, in which environmental factors must play a major part, not just ageing. Modern living produces multi-interactional environmental pollution, and the changes in human morbidity, including neurological disease, are remarkable and point to environmental influences.”
Professor Pritchard continued: “Then there are the practical implications for families trying to cope as front-line services are being swamped. Consider, for example, the remarkable increase in Motor Neurone Disease in the UK, as well as the earlier dementias, exemplified by the new charity Young Dementia UK, which reports that many of its clients are in their late 40s and early 50s, something unthinkable twenty years ago.”
“In part, some of the results are explained by more effective treatments for cancer and heart disease, with advances in medicine making such physical illnesses easier to treat, whilst there have been fewer advances in the treatment of neurological conditions.”
“Crucially, it is not just that people are living longer to get diseases they previously would not have lived long enough to develop; older people are developing neurological disease more than ever before.
The environmental changes in the last 20 years have seen increases in the human environment of petro-chemicals, air transport, a quadrupling of motor vehicles, insecticides and rises in background electro-magnetic fields, and so on.”
“These results will not be welcome news as there are many with short-term vested interests that will want to ignore them. It is not that we want to stop the modern world but rather make it safer.
Essentially, it is time for us to wake up and realise that a major problem we now face is unprecedented levels of neurological disease, not just the earlier dementias, and, thinking of the USA, ‘when America sneezes, Europe catches a cold a decade later’.”
How stress can tweak the brain to sabotage self-control
A challenging morning meeting or an interaction with an upset client at work may affect whether we go for that extra chocolate bar at lunch. In a study appearing August 5 in Neuron, researchers placed human volunteers in a similar food-choice scenario to explore how stress can alter the brain to impair self-control when we’re confronted with a choice.
“Our findings provide an important step towards understanding the interactions between stress and self-control in the human brain, with the effects of stress operating through multiple neural pathways,” says lead author Silvia Maier, of the University of Zurich’s Laboratory for Social and Neural Systems Research. “Self-control abilities are sensitive to perturbations at several points within this network, and optimal self-control requires a precise balance of input from multiple brain regions rather than a simple on/off switch.” She emphasized that much work still remains, however, to fully understand the mechanisms involved.
In the study, 29 participants underwent a treatment known to induce moderate stress in the laboratory before they were asked to choose between two food options. An additional 22 participants did not undergo the treatment, which involved being observed and evaluated by the experimenter while immersing a hand in an ice water bath for 3 minutes, before choosing between the food options.
All of the participants who were selected for the study were making an effort to maintain a healthy lifestyle, so the study presented them with a conflict between eating a very tasty but unhealthy item and one that is healthy but less tasty.
The scientists found that when individuals chose between different food options after experiencing the stressful ice-bath treatment, they placed more weight on a food’s taste and were more likely to choose an unhealthy food than people who were not stressed.
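The idea of taste being "overweighted" under stress can be illustrated with a simple attribute-weighting choice model, a common way of formalising this kind of decision. The weights and item ratings below are hypothetical, chosen for illustration; they are not the study's fitted values.

```python
# Illustrative sketch only: a weighted-attribute model of food choice.
# All numbers here are hypothetical, not taken from the study.

def choice_value(taste, health, w_taste, w_health):
    """Combine an item's taste and health ratings into one decision value."""
    return w_taste * taste + w_health * health

def pick(item_a, item_b, w_taste, w_health):
    """Return whichever item has the higher weighted value."""
    va = choice_value(*item_a, w_taste, w_health)
    vb = choice_value(*item_b, w_taste, w_health)
    return "A" if va >= vb else "B"

chocolate = (0.9, 0.2)   # (taste, health): very tasty but unhealthy
salad = (0.4, 0.9)       # healthy but less tasty

# Unstressed: taste and health weighted evenly -> the healthy item wins.
print(pick(chocolate, salad, w_taste=0.5, w_health=0.5))   # B
# Stressed: taste is overweighted -> the unhealthy item wins.
print(pick(chocolate, salad, w_taste=0.8, w_health=0.2))   # A
```

The point of the sketch is that nothing about the options changes; shifting the weight placed on taste alone is enough to flip the choice.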
The effects of stress were also visible in the brain. Stressed participants’ brains exhibited altered patterns of connectivity between regions including the amygdala, striatum, and the dorsolateral and ventromedial prefrontal cortex, essentially reducing individuals’ ability to exercise self-control over food choices. Only some of these changes were associated with cortisol, a hormone commonly linked to stress.
The investigators say that their study indicates that even moderate levels of stress can impair self-control. “This is important because moderate stressors are more common than extreme events and will thus influence self-control choices more frequently and for a larger portion of the population,” says senior author Todd Hare. “One interesting avenue for future research will be to determine whether some of the factors shown to protect against structural brain changes following severe stress, such as exercise and social support, can also buffer the effects of moderate stress on decision making,” he adds.
There was also a good deal of variation in the degree to which stress affected individuals in the study, so it will be important to investigate why some people are more resilient than others.
Could Body Posture During Sleep Affect How Your Brain Clears Waste?
Sleeping in the lateral, or side position, as compared to sleeping on one’s back or stomach, may more effectively remove brain waste and prove to be an important practice to help reduce the chances of developing Alzheimer’s, Parkinson’s and other neurological diseases, according to researchers at Stony Brook University.
By using dynamic contrast magnetic resonance imaging (MRI) to image the brain’s glymphatic pathway, a complex system that clears wastes and other harmful chemical solutes from the brain, Stony Brook University researchers Hedok Lee, PhD, Helene Benveniste, MD, PhD, and colleagues discovered that a lateral sleeping position is the most efficient at removing waste from the brain. In humans and many animals the lateral sleeping position is the most common one. The buildup of brain waste chemicals may contribute to the development of Alzheimer’s disease and other neurological conditions. Their finding is published in the Journal of Neuroscience.
Dr. Benveniste, Principal Investigator and a Professor in the Departments of Anesthesiology and Radiology at Stony Brook University School of Medicine, has used dynamic contrast MRI for several years to examine the glymphatic pathway in rodent models. The method enables researchers to identify and define the glymphatic pathway, where cerebrospinal fluid (CSF) filters through the brain and exchanges with interstitial fluid (ISF) to clear waste, similar to the way the body’s lymphatic system clears waste from organs. It is during sleep that the glymphatic pathway is most efficient. Brain waste includes amyloid β (amyloid) and tau proteins, chemicals that negatively affect brain processes if they build up.
In the paper, “The Effect of Body Posture on Brain Glymphatic Transport,” Dr. Benveniste and colleagues used a dynamic contrast MRI method along with kinetic modeling to quantify the CSF-ISF exchange rates in anesthetized rodents’ brains in three positions – lateral (side), prone (down), and supine (up).
“The analysis showed us consistently that glymphatic transport was most efficient in the lateral position when compared to the supine or prone positions,” said Dr. Benveniste. “Because of this finding, we propose that the body posture and sleep quality should be considered when standardizing future diagnostic imaging procedures to assess CSF-ISF transport in humans and therefore the assessment of the clearance of damaging brain proteins that may contribute to or cause brain diseases.”
Dr. Benveniste and first-author Dr. Hedok Lee, Assistant Professor in the Departments of Anesthesiology and Radiology at Stony Brook developed the safe posture positions for the experiments. Their colleagues at the University of Rochester, including Lulu Xie, Rashid Deane and Maiken Nedergaard, PhD, used fluorescence microscopy and radioactive tracers to validate the MRI data and to assess the influence of body posture on the clearance of amyloid from the brains.
“It is interesting that the lateral sleep position is already the most popular in humans and most animals, even in the wild, and it appears that we have adapted the lateral sleep position to most efficiently clear our brain of the metabolic waste products that build up while we are awake,” says Dr. Nedergaard. “The study therefore adds further support to the concept that sleep serves a distinct biological function: to ‘clean up’ the mess that accumulates while we are awake. Many types of dementia are linked to sleep disturbances, including difficulties in falling asleep. It is increasingly acknowledged that these sleep disturbances may accelerate memory loss in Alzheimer’s disease. Our finding brings new insight into this topic by showing that it also matters what position you sleep in,” she explained.
Dr. Benveniste cautioned that while the research team speculates that the human glymphatic pathway will clear brain waste most efficiently when sleeping in the lateral position as compared to other positions, testing with MRI or other imaging methods in humans is a necessary first step.
Brain infection starts in gut
The study reveals how the proteins - called prions - spread from the gut to the brain after a person or animal has eaten contaminated meat.
Scientists say their findings could aid the earlier diagnosis of prion diseases - which include variant Creutzfeldt-Jakob disease (vCJD) in people and bovine spongiform encephalopathy (BSE) in cows.
In people, the disease remains very rare: 229 people have died from vCJD since it was first identified almost 20 years ago, of whom 177 were from the UK.
Whether all individuals with evidence of prion infection in their gut go on to develop neurological disease is not known. We need a greater understanding of what factors enhance our susceptibility to prion diseases so that we can put in place safeguards to prevent these conditions from spreading in people and farmed animals. -Professor Neil Mabbott (The Roslin Institute, University of Edinburgh)
Prions are infectious proteins with abnormal shapes that can be passed between people and animals by eating contaminated meat.
Until now, it was not known how prions spread from the gut to the brain after consuming infected meat.
Researchers at the University of Edinburgh’s Roslin Institute studied the course of prion infection in mice.
They found that prions must first build up in specialised structures in the lining of the small intestine before they are able to spread throughout the body to the brain.
The structures - called Peyer’s patches - are part of the body’s immune system and form the first line of defence against contaminated food.
The study suggests prions hijack Peyer’s patches to cause infection.
Prions did not build up in similar patches in the large intestine until a later stage of infection, the team found.
At this stage, prions were also detected in the spleen and lymph nodes.
As many as one in 2000 people in the UK could be carrying infectious prions without showing any symptoms of disease, according to recent estimates.
These are based on analysis of tissue taken during routine appendix removal operations.
The researchers say that these estimates may fail to identify individuals in the earliest stages of infection, where prions have not yet spread beyond the small intestine.
When prions get into the brain, they destroy nerve cells. This can lead to major neurological symptoms including memory impairment, personality changes, and difficulties with movement.
Other prion diseases include scrapie in sheep and chronic wasting disease in deer.
High salt intake could be a risk factor for multiple sclerosis
Here’s another reason to put the salt shaker down: New research in mice shows that diets high in sodium may be a novel risk factor in the development of multiple sclerosis (MS) by influencing immune cells that cause the disease. Although this research does implicate salt intake as a risk factor, it is important to note that dietary salt is likely just one of the many environmental factors contributing to this complex disease, and very much influenced by one’s genetic background. This finding was published in the August 2015 issue of The FASEB Journal.
“We hope to provide a comprehensive understanding of how and why environmental factors interact with individuals’ unique genetic make-up to influence autoimmune diseases such as MS,” said Dimitry N. Krementsov, Ph.D., a researcher involved in the work from the Department of Medicine, Immunobiology Program at the University of Vermont in Burlington, Vermont.
To make this discovery, Krementsov and colleagues fed a high salt diet or a control diet to three genetically different groups of mice. Researchers then induced a disease in these mice that mimics human MS. In one genetic group, both males and females fed a high salt diet showed worse clinical signs of the disease. In the other genetic group, only females showed a negative response to salt. In the third genetic group, there was no response to salt. Genetics were the critical factor. In the mice that did respond to salt, there were no direct changes in the function of their immune cells, but they showed signs of a weakened blood-brain barrier.
“As is the case with other things, you need to get enough salt so your body functions properly, but not too much or things start to go haywire,” said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal. “This report helps shed light on what can go wrong in individuals with genes that make one susceptible to autoimmune disease. It also helps us understand how much salt is just right for any given individual.”
The permanence of memories has long been thought to be mediated solely by the production of new proteins. However, new research from the University of Alberta has shown that the electrical activity of the brain may be a more primary factor in memory solidification.
“It’s not just protein synthesis, long the dominant biological model, but also ‘offline’ memory rehearsal in the brain that leads to memory solidification,” says Clayton Dickson, psychology professor at the U of A and one of the authors of the new study. “Although the protein synthesis idea is entrenched in the field, we and others have been closely examining the older data that supports this and have found some perplexing inconsistencies.” For this study, Dickson worked with his undergraduate psychology honours students Jonathan Dubue and Ty McKinney as well as his departmental colleague Dallas Treit, all at the U of A and all part of the Neuroscience and Mental Health Institute.
Online learning, offline rehearsal
“Learning is thought to occur ‘online’ by creating new or strengthened synaptic connections,” says Dickson. “However, we also know that the period directly following learning—when the brain is ‘offline’—is critical for solidifying that information.” Although agents that block protein synthesis can block future retrieval of this information at this stage, Dickson has long been convinced that this might be caused by disruption of electrical activity. He equates this stage to a mental rehearsal of the preceding events, activity patterns that likely help set the memory in the brain.
The stage when a brain is actively engaged in a new experience can be described as “online” activity. On the flip side of this neurological process, “offline” activity, or neural replay, is the process by which the brain rehearses what has been learned in order to strengthen the most important memories. What Dickson and his collaborators have shown is that protein synthesis inhibitors disrupt this neural activity and can disrupt “online” processing as well.
Treating memory disorders
“Memory permanence is a critical element of our day-to-day lives,” notes Dickson. “Understanding how our brains solidify memories is essential for treating memory disorders and, in the case of post-traumatic stress disorder, for potentially ridding oneself of bothersome memories. The more we understand about the process, the more likely we can find a way for people to improve their good memories and eliminate the bad.”
Dickson’s lab at the U of A is one of only a handful worldwide that critically assess the role of protein synthesis inhibition in memory and synaptic plasticity. “We are interested in what kind of neural activity patterns (i.e. brain waves) might specifically be involved in memory consolidation. We are currently trying to directly manipulate these patterns by using simple electrical methods.”
In a paper recently published in the European Journal of Neuroscience, researchers at McGill University demonstrated that the hippocampus (associated with memory) and the nucleus accumbens (associated with pleasure) work together in making critical decisions of this type, where time plays a role. The researchers showed that when these two structures were effectively ‘disconnected’ in the brain, decisions related to delayed gratification were disrupted.
It is a discovery which has implications not only for a range of neuropsychiatric disorders such as ADHD, eating disorders and anxiety disorders, but also for more common problems involving maladaptive daily decisions about drug or alcohol use, gambling or credit card binges.
How the work was done
The researchers discovered the importance of this connection by working with rats trained to make choices between stimuli that would result in their receiving different amounts of rewards, after varying periods of time. The rats were asked to choose between two identical visual shapes by pressing their nose against one of them on a touchscreen (similar to an iPad), in exchange for rewards in the form of sugar pellets. Like most humans, rats have a sweet tooth.
With time, rats learned to negotiate a trade-off between a small reward (1 sugar pellet) delivered immediately and a large reward (4 sugar pellets) delivered after a delay. The researchers discovered that the average rat, like the average human, is willing to wait a bit for a larger reward, but only for a certain period of time, and only if the reward is large enough.
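This small-now vs. large-later trade-off is commonly formalised as temporal discounting, where a delayed reward's subjective value shrinks with the wait. The sketch below uses a standard hyperbolic discounting model; the discount rate, reward sizes and delays are hypothetical illustrations, not values from the study.

```python
# Illustrative sketch only: hyperbolic temporal discounting.
# k, reward sizes and delays below are hypothetical, not the study's values.

def discounted_value(reward, delay, k=0.2):
    """Subjective value of a reward received after `delay` seconds."""
    return reward / (1.0 + k * delay)

def choose(small, large, delay, k=0.2):
    """Which option does a hyperbolic discounter prefer?"""
    if discounted_value(large, delay, k) > small:
        return "large-later"
    return "small-now"

# A short wait: 4 pellets in 5 s are still worth more than 1 pellet now.
print(choose(small=1, large=4, delay=5))    # large-later
# A long wait: the delayed reward's subjective value collapses below 1 pellet.
print(choose(small=1, large=4, delay=30))   # small-now
```

In these terms, the disconnection result reads as the lesioned rats behaving as if their discount rate had become extremely steep: the delayed reward's value drops below the immediate one after even a few seconds.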
However, following disruption of the circuit connecting the hippocampus and nucleus accumbens, the rats became impatient and unwilling to wait, even for a few seconds. They always selected the immediate reward despite its smaller size. Importantly, lesions to other parts of the brain, including the prefrontal cortex, known to be involved in certain aspects of decision-making, did not cause this behavioural change.
Implications and next steps
“This is a type of decision-making that many of us grapple with in daily life, particularly the very young, the very old, and those with brain disease,” said Prof. Yogita Chudasama, of McGill’s Psychology Department and the lead researcher on the paper. “In some ways this relationship makes sense; the hippocampus is thought to have a role in future planning, and the nucleus accumbens is a “reward” center and a major recipient of dopamine, a chemical responsible for transmitting signals related to pleasure and reward, but we couldn’t have imagined that the results would be so clear. In addition to providing a deeper understanding of decision-making, our results highlight the potential of this circuit, involving the hippocampus and nucleus accumbens, to be a therapeutic target in human patient groups.”
‘Brain training’ app may improve memory and daily functioning in schizophrenia
Schizophrenia is a long-term mental health condition that causes a range of psychological symptoms, from changes in behaviour through to hallucinations and delusions. Psychotic symptoms are reasonably well treated by current medications; however, patients are still left with debilitating cognitive impairments, including in their memory, and so are frequently unable to return to university or work.
There are as yet no licensed pharmaceutical treatments to improve cognitive functions for people with schizophrenia. However, there is increasing evidence that computer-assisted training and rehabilitation can help people with schizophrenia overcome some of their symptoms, leading to better outcomes in daily functioning and quality of life.
Schizophrenia is estimated to cost £13.1 billion per year in total in the UK, so even small improvements in cognitive functions could help patients make the transition to independent living and working and could therefore substantially reduce direct and indirect costs, besides improving the wellbeing and health of patients.
In a study published today in Philosophical Transactions of the Royal Society B, a team of researchers led by Professor Barbara Sahakian from the Department of Psychiatry at Cambridge describe how they developed and tested Wizard, an iPad game aimed at improving an individual’s episodic memory. Episodic memory is the type of memory required when you have to remember where you parked your car in a multi-storey car park after shopping for several hours, or where you left your keys at home, for example. It is one of the facets of cognitive functioning affected in patients with schizophrenia.
The game, Wizard, was the result of a nine-month collaboration between psychologists, neuroscientists, a professional game-developer and people with schizophrenia. It was intended to be fun, attention-grabbing, motivating and easy to understand, whilst at the same time improving the player’s episodic memory. The memory task was woven into a narrative in which the player was allowed to choose their own character and name; the game rewarded progress with additional in-game activities to provide the user with a sense of progression independent of the cognitive training process.
The researchers assigned twenty-two participants, who had been given a diagnosis of schizophrenia, to either the cognitive training group or a control group at random. Participants in the training group played the memory game for a total of eight hours over a four-week period; participants in the control group continued their treatment as usual. At the end of the four weeks, the researchers tested all participants’ episodic memory using the Cambridge Neuropsychological Test Automated Battery (CANTAB) PAL, as well as their level of enjoyment and motivation, and their score on the Global Assessment of Functioning (GAF) scale, which doctors use to rate the social, occupational, and psychological functioning of adults.
Professor Sahakian and colleagues found that the patients who had played the memory game made significantly fewer errors and needed significantly fewer attempts to remember the location of different patterns in the CANTAB PAL test relative to the control group. In addition, patients in the cognitive training group saw an increase in their score on the GAF scale.
Participants in the cognitive training group indicated that they enjoyed the game and were motivated to continue playing across the eight hours of cognitive training. In fact, the researchers found that those who were most motivated also performed best at the game. This is important, as lack of motivation is another common facet of schizophrenia.
Professor Sahakian says: “We need a way of treating the cognitive symptoms of schizophrenia, such as problems with episodic memory, but slow progress is being made towards developing a drug treatment. So this proof-of-concept study is important because it demonstrates that the memory game can help where drugs have so far failed. Because the game is interesting, even those patients with a general lack of motivation are spurred on to continue the training.”
Professor Peter Jones adds: “These are promising results and suggest that there may be the potential to use game apps to not only improve a patient’s episodic memory, but also their functioning in activities of daily living. We will need to carry out further studies with larger sample sizes to confirm the current findings, but we hope that, used in conjunction with medication and current psychological therapies, this could help people with schizophrenia minimise the impact of their illness on everyday life.”
It is not clear exactly how the app improved the patients’ daily functioning, but the researchers suggest it may be because improvements in memory had a direct impact on global functions, or because the cognitive training had an indirect impact on functionality by improving general motivation and restoring self-esteem. Indeed, both explanations may have played a role.
In April 2015, Professor Sahakian and colleagues began a collaboration with the team behind the popular brain training app Peak to produce scientifically-tested cognitive training modules. The collaboration has resulted in the launch today of the Cambridge University & Peak Advanced Training Plan, a memory game available within Peak’s iOS app, designed to train visual and episodic memory while promoting learning.
The training module is based on the Wizard memory game, developed by Professor Sahakian and colleague Tom Piercy at the Department of Psychiatry at the University of Cambridge. Rights to the Wizard game were licensed to Peak by Cambridge Enterprise, the University’s commercialisation company.
“This new app will allow the Wizard memory game to become widely available, inexpensively. State-of-the-art neuroscience at the University of Cambridge, combined with the innovative approach at Peak, will help bring the games industry to a new level and promote the benefits of cognitive enhancement,” says Professor Sahakian.
Positive reinforcement plays key role in cognitive task performance in ADHD kids
A little recognition for a job well done means a lot to children with Attention Deficit/Hyperactivity Disorder (ADHD) – more so than it would for typically developing kids.
That praise, or other possible reward, improves the performance of children with ADHD on certain cognitive tasks, but until a recent study led by researchers from the University at Buffalo, it wasn’t clear if that result was due to heightened motivation inspired by positive reinforcement or because those with ADHD simply had greater room for improvement at certain tasks relative to their peers without such a diagnosis.
“Our results suggest that the motivation piece is critical,” says Whitney Fosco, a graduate student in the Department of Psychology in the UB College of Arts and Sciences. “Kids with ADHD showed more improvement because they are more motivated by the opportunity to gain rewards, not because they simply did worse from the beginning.”
The findings come out of a novel study published in the journal Behavioral and Brain Functions that collectively examined two leading theories on ADHD, combining what previous work had mostly looked at separately.
One of those theories suggests that lower-than-average cognitive abilities contribute to symptoms associated with ADHD, such as inattentiveness. The other theory favors motivation over ability, focusing on whether kids with ADHD have an increased sensitivity to reward.
“When asking whether the performance difference we see is the result of ability or motivation, this research has more of an answer than any study that comes before it,” says UB psychologist Larry Hawk, the paper’s principal investigator.
The results of the research conducted by Hawk, Fosco, UB graduate student Michelle Bubnik and Keri Rosch of the Kennedy Krieger Institute in Baltimore, Maryland, have clinical parallels as well.
Behavioral therapy, which uses positive consequences to increase the likelihood of achieving certain behaviors, is among the leading psychosocial interventions for children with an ADHD diagnosis.
The authors point out that the benefits of reward are not specific to children with ADHD.
“The major difference is that typically developing kids usually perform well even when simply asked to do their best,” says Fosco. “But kids with ADHD typically need an external or an additional reinforcement to perform their best.”
It’s a tricky area of research, according to Hawk, since some of the subjects are being tested on tasks on which they have a demonstrated history of poor performance.
There is also a degree of variability between the two groups. The authors say that having a diagnosis of ADHD doesn’t necessarily mean that a child will perform poorly on any given task, and neither does the absence of a diagnosis mean that the child will perform well on any given task.
“You can’t say kids with ADHD respond more to reinforcement because they were doing poorly to begin with,” says Hawk. “We showed that was not true. It was greater motivation to obtain external rewards that drove the effects we observed.”
Depressed females have over-active glutamate receptor gene
Numerous genes that regulate the activity of a neurotransmitter in the brain have been found to be expressed at abnormally high levels in the brain tissue of depressed females. This could be an underlying cause of the higher incidence of attempted suicide among women, according to research at the University of Illinois at Chicago.
Studying postmortem tissue from brains of psychiatric patients, Monsheel Sodhi, assistant professor of pharmacy practice at UIC, noted that female patients with depression had abnormally high expression levels of many genes that regulate the glutamate system, which is widely distributed in the brain.
Glutamate is the major excitatory neurotransmitter in the brain. Schizophrenia, epilepsy, autism and Alzheimer’s disease have all been linked to abnormalities of the glutamate system.
Gender plays a role in depression and suicide, Sodhi said. Women are two to three times more likely to attempt suicide, but men are four times more likely to die by suicide. The risk of suicide is associated with changes in several neurotransmitter systems.
Sodhi and her colleagues were intrigued by recent studies that found that a low dose of the drug ketamine, which alters glutamate system activity, can rapidly eliminate depression in two-thirds of patients who do not respond to conventional antidepressants. Conventional antidepressants target the monoamine systems, which secrete the neurotransmitters dopamine, serotonin or norepinephrine.
In the new study, published in the journal Molecular Psychiatry, Sodhi and her coworkers analyzed brain tissue from people who had suffered from depression. Both females and males were compared to subjects who had never experienced psychiatric illness. Many of the depressed patients, she said, had died by suicide.
Females with depression, Sodhi discovered, had the highest levels of expression of several glutamate receptor genes, perhaps making them more prone to depression. In addition, three of these genes were found to be elevated in both male and female patients who had died by suicide.
“Our data indicate that females with major depression who are at high risk of suicide may have the greatest antidepressant benefit from drugs that act on the glutamate system, such as ketamine,” Sodhi said. The study also suggests new glutamate receptor targets for development of treatments for depression and identifies biochemical markers that could be used to assess suicide risk, she said.
More than 41,000 people die by suicide each year in the United States, according to the Centers for Disease Control and Prevention. It is the second-leading cause of death in people aged 15 to 34 years. Suicide claims a life every 14 minutes in the U.S. and the frequency is escalating. Over 90 percent of the people who take their lives suffer from mental illness, predominantly depression.
Only one-third of patients receiving conventional treatments achieve substantial remission of their depression, which may take several weeks or longer, Sodhi said. This time lag in response to treatment is a problem, she said, due to the high risk of suicide.