Brain Energy Boost Mechanism Discovered

Summary: Researchers identified a key mechanism that detects when the brain needs an energy boost, involving astrocytes and the molecule adenosine. This discovery could lead to new therapies for maintaining brain health and longevity, particularly in combating cognitive decline and neurodegenerative diseases.

The study found that astrocytes monitor neuronal activity and activate energy supply pathways, ensuring efficient brain function. This breakthrough offers potential treatments for conditions like Alzheimer’s disease.

Key Facts:

Astrocytes play a crucial role in supplying energy to neurons during high-demand activities.

The molecule adenosine is essential for activating astrocyte glucose metabolism.

Disruption of this energy boost mechanism impairs brain function, memory, and sleep.

Source: UCL

A key mechanism which detects when the brain needs an additional energy boost to support its activity has been identified in a study in mice and cells led by UCL scientists.

The scientists say their findings, published in Nature, could inform new therapies to maintain brain health and longevity, as other studies have found that brain energy metabolism can become impaired late in life and contribute to cognitive decline and the development of neurodegenerative disease.

Lead author Professor Alexander Gourine (UCL Neuroscience, Physiology & Pharmacology) said: “Our brains are made up of billions of nerve cells, which work together coordinating numerous functions and performing complex tasks like control of movement, learning and forming memories. All of this computation is very energy-demanding and requires an uninterrupted supply of nutrients and oxygen.

“When our brain is more active, such as when we’re performing a mentally taxing task, our brain needs an immediate boost of energy, but the exact mechanisms that ensure on-demand local supply of metabolic energy to active brain regions are not fully understood.”

Prior research has shown that numerous brain cells called astrocytes appear to play a role in providing neurons with the energy they need. Astrocytes, shaped like stars, are a type of glial cell, which are non-neuronal cells found in the central nervous system.

When neighbouring neurons need an increase in energy supply, astrocytes jump into action by rapidly activating their own glucose stores and metabolism, leading to the increased production and release of lactate. Lactate supplements the pool of energy that is readily available for use by neurons in the brain.

Professor Gourine explained: “In our study, we have figured out how exactly astrocytes are able to monitor the energy use by their neighbouring nerve cells, and kick-start this process that delivers additional chemical energy to busy brain regions.”

In a series of experiments using mouse models and cell samples, the researchers identified a set of specific receptors in astrocytes that can detect and monitor neuronal activity, and trigger a signalling pathway involving an essential molecule called adenosine.

The researchers found that the metabolic signalling pathway activated by adenosine in astrocytes is exactly the same as the pathway that recruits energy stores in the muscle and the liver, for example when we exercise.

Adenosine activates astrocyte glucose metabolism and supply of energy to neurons to ensure that synaptic function (neurotransmitters passing communication signals between cells) continues apace under conditions of high energy demand or reduced energy supply.

The researchers found that when they deactivated the key astrocyte receptors in mice, the animals’ brain activity was impaired, with significant deficits in global brain metabolism and memory as well as disrupted sleep, demonstrating that the signalling pathway they identified is vital for processes such as learning, memory and sleep.

First and co-corresponding author Dr Shefeeq Theparambil, who began the study at UCL before moving to Lancaster University, said: “Identification of this mechanism may have broader implications as it could be a way of treating brain diseases where brain energetics are downregulated, such as neurodegeneration and dementia.”

Professor Gourine added: “We know that brain energy homeostasis is progressively impaired in ageing and this process is accelerated during the development of neurodegenerative diseases such as Alzheimer’s disease.

“Our study identifies an attractive readily druggable target and therapeutic opportunity for brain energy rescue for the purpose of protecting brain function, maintaining cognitive health, and promoting brain longevity.”

Funding: The researchers were supported by Wellcome, and the study involved scientists at UCL, Lancaster University, Imperial College London, King’s College London, Queen Mary University of London, University of Bristol, University of Warwick, and University of Colorado.

About this neuroscience research news

Author: Chris Lane
Source: UCL
Contact: Chris Lane – UCL
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Adenosine signalling to astrocytes coordinates brain metabolism and function” by Alexander Gourine et al. Nature

Abstract

Adenosine signalling to astrocytes coordinates brain metabolism and function

Brain computation performed by billions of nerve cells relies on a sufficient and uninterrupted nutrient and oxygen supply. Astrocytes, the ubiquitous glial neighbours of neurons, govern brain glucose uptake and metabolism, but the exact mechanisms of metabolic coupling between neurons and astrocytes that ensure on-demand support of neuronal energy needs are not fully understood. Here we show, using experimental in vitro and in vivo animal models, that neuronal activity-dependent metabolic activation of astrocytes is mediated by neuromodulator adenosine acting on astrocytic A2B receptors. Stimulation of A2B receptors recruits the canonical cyclic adenosine 3′,5′-monophosphate–protein kinase A signalling pathway, leading to rapid activation of astrocyte glucose metabolism and the release of lactate, which supplements the extracellular pool of readily available energy substrates. Experimental mouse models involving conditional deletion of the gene encoding A2B receptors in astrocytes showed that adenosine-mediated metabolic signalling is essential for maintaining synaptic function, especially under conditions of high energy demand or reduced energy supply. Knockdown of A2B receptor expression in astrocytes led to a major reprogramming of brain energy metabolism, prevented synaptic plasticity in the hippocampus, severely impaired recognition memory and disrupted sleep. These data identify the adenosine A2B receptor as an astrocytic sensor of neuronal activity and show that cAMP signalling in astrocytes tunes brain energy […]

Read more at neurosciencenews.com

Foods Overlooked in the US May Boost Kids’ Brain Power

A naturally-occurring nutrient found in soybeans and other legumes may support children’s cognitive abilities and attention spans, new research suggests.

Soybeans are rich in nutrients called isoflavones, which are also found in chickpeas, peanuts and other legumes. Previous research has linked high consumption of isoflavones with a reduced risk of heart disease, stroke and even some types of cancer. Increasingly, studies have also linked isoflavones with improved cognitive function.

Now, researchers from the University of Illinois Urbana-Champaign have investigated the relationship between soy isoflavones and children’s attentional abilities using an EEG scanner to record electrical activity in the brain. Compounds found in soy products, like soy milk, tofu and edamame, may support brain processing in children.

“No other studies have examined the association between soy isoflavones and attentional abilities using EEG or similar measures to record electrical activity generated by the brain,” Ajla Bristina, a neuroscience doctoral student at the University of Illinois Urbana-Champaign, said in a statement.

The study analyzed data from 128 children aged 7 to 13 and investigated their general intellectual abilities as well as their information processing speed and attention.

Across all age groups, the children tended to consume low levels of isoflavone-containing soy foods. “Soy consumption for individual participants ranged from 0 to 35 mg/day,” Bristina said. “To put this into perspective, an 8 fl. oz serving of soy milk provides about 28 mg of isoflavones, a serving of tofu provides about 35 mg and half a cup of steamed edamame provides about 18 mg of isoflavones.”
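To make those serving figures concrete, here is a rough back-of-envelope sketch (not from the study) that tallies a hypothetical day's isoflavone intake using the per-serving values Bristina quotes; the food labels and the example menu are illustrative assumptions, not study data.

```python
# Approximate isoflavone content per serving, using the figures quoted above (mg).
ISOFLAVONES_MG_PER_SERVING = {
    "soy milk (8 fl. oz)": 28,
    "tofu (1 serving)": 35,
    "edamame (1/2 cup, steamed)": 18,
}

def daily_isoflavones_mg(servings):
    """Sum isoflavone intake in mg for a dict of {food: number of servings}."""
    return sum(ISOFLAVONES_MG_PER_SERVING[food] * n for food, n in servings.items())

# Hypothetical day: one glass of soy milk plus half a cup of steamed edamame.
example_day = {"soy milk (8 fl. oz)": 1, "edamame (1/2 cup, steamed)": 1}
print(daily_isoflavones_mg(example_day))  # 46 mg, above the study's highest observed intake of 35 mg/day
```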

However, those who did consume more soy foods showed faster response times during the attentional tasks and faster processing speeds, although there was no clear association between soy isoflavone intake and general intellectual ability.

“Our study adds evidence of the importance of nutrients found in soy foods for childhood cognition,” Bristina said. “[However,] soy foods are often not a regular part of children’s diets in the United States.”

Of course, this study is purely observational and more work needs to be done to understand the mechanisms behind these associations. “Correlational studies like this are only the first step,” Bristina said. “To better understand the effects of eating soy foods on children’s cognitive abilities and the precise amount of isoflavone intake necessary to elicit faster response times will require intervention approaches.”

Some people are allergic to soy, and for many others eating too much can cause digestive problems, so it is important to consume it in moderation. That being said, if you are looking to add more soy into your diet, Bristina recommends starting with snacks like roasted edamame, soy nuts and soy milk, as well as tofu, tempeh or soy-based nuggets.

Bristina will present the research at the annual meeting of the American Society for Nutrition, NUTRITION 2024, in Chicago on July 2.

Read more at www.newsweek.com

Serotonin 2C Receptor Key to Memory

Summary: Researchers discovered the serotonin 2C receptor in the brain plays a crucial role in regulating memory in both humans and animal models. This breakthrough offers insights into conditions associated with memory loss, such as Alzheimer’s disease, and suggests new treatment pathways.

Mutations in the serotonin 2C receptor gene lead to memory deficits, but serotonin analogs like lorcaserin could improve memory by activating these receptors. This finding opens the door to potential new therapies for Alzheimer’s and other memory-related disorders.

Key Facts:

Serotonin 2C receptors are vital for memory consolidation.

Mutations in the serotonin 2C receptor gene lead to memory deficits.

Serotonin analogs could improve memory in Alzheimer’s models.

Source: Baylor College of Medicine

Researchers at Baylor College of Medicine, the University of Cambridge in the U.K. and collaborating institutions have shown that the serotonin 2C receptor in the brain regulates memory in people and animal models.

The findings, published in the journal Science Advances, not only provide new insights into the factors involved in healthy memory but also into conditions associated with memory loss, like Alzheimer’s disease, and suggest novel avenues for treatment.

“Serotonin, a compound produced by neurons in the midbrain, acts as a neurotransmitter, passing messages between brain cells,” said co-corresponding author Dr. Yong Xu, professor of pediatrics – nutrition and associate director for basic sciences at the USDA/ARS Children’s Nutrition Research Center at Baylor.

“Serotonin-producing neurons reach out to multiple brain regions including the hippocampus, a region essential for short- and long-term memory.”

Serotonin communicates messages to brain cells by binding to receptors on the cell surface, which signal the receiving cell to carry on a certain activity. In this study, the Xu lab, with expertise in basic and genetic animal studies, and the human genetics lab of co-corresponding author Dr. I. Sadaf Farooqi, professor of metabolism and medicine at the University of Cambridge, focused on serotonin 2C receptors, which are abundantly present in the brain’s ventral hippocampal CA1 region (vCA1), investigating the role of the receptor in memory in humans and animal models.

“We had previously identified five individuals carrying variants of the serotonin 2C receptor gene (HTR2C) that produce defective forms of the receptor,” Farooqi said.

“People with these rare variants showed significant deficits on memory questionnaires. These findings led us to investigate the association between HTR2C variants and memory deficits in animal models.”

The team genetically engineered mice to mimic the human mutation. When the researchers ran behavioral tests on these mice to evaluate their memory, they found that both males and females with the non-functional gene showed reduced memory recall when compared with the unmodified animals.

“When we combined the human data and the mouse data, we found compelling evidence connecting non-functional mutations of the serotonin receptor 2C with memory deficits in humans,” Xu said.

The animal models also enabled the team to dig deeper into how the receptor mediates memory. They discovered a brain circuit that begins in the midbrain where serotonin-producing neurons are located. These neurons project to the vCA1 region, which has abundant serotonin 2C receptors.

“When neurons in the midbrain reaching out to neurons in the vCA1 region release serotonin, the neurotransmitter binds to its receptor signaling these cells to make changes that help the brain consolidate memories,” Xu said.

Importantly, the researchers also found that this serotonin-associated neural circuit is damaged in a mouse model of Alzheimer’s disease.

“The neural circuit in the Alzheimer’s disease animal model cannot release sufficient serotonin into the vCA1 region that would need to bind to its receptor in the downstream neurons to signal the changes required to consolidate a memory,” Xu said.

However, it is possible to bypass this lack of serotonin and directly activate the downstream serotonin receptor by administering a serotonin analog, lorcaserin, a compound that selectively activates the serotonin 2C receptor in these cells.

“We tested this strategy in our animal model and were excited to find that the animals treated with the serotonin analog improved their memory,” Xu said.

“We hope our findings encourage further studies to evaluate the value of serotonin analogs in the treatment of Alzheimer’s disease.”

Other contributors to this work include Hesong Liu, Yang He, Hailan Liu, Bas Brouwers, Na Yin, Katherine Lawler, Julia M. Keogh, Elana Henning, Dong-Kee Lee, Meng Yu, Longlong Tu, Nan Zhang, Kristine M. Conde, Junying Han, Zili Yan, Nikolas A. Scarcelli, Lan Liao, Jianming Xu, Qingchun Tong, Hui Zheng, Zheng Sun, Yongjie Yang, Chunmei Wang and Yanlin He. The authors are affiliated with one of the following institutions: Baylor College of Medicine, Texas Children’s Hospital, University of Cambridge, University of Texas Health Science Center at Houston and Louisiana State University.

About this memory and neuroscience research news

Author: Taylor Barnes
Source: Baylor College of Medicine
Contact: Taylor Barnes – Baylor College of Medicine
Image: The image is credited to Neuroscience News
Original Research: Open access.
“Neural circuits expressing the serotonin 2C receptor regulate memory in mice and humans” by Yong Xu et al. Science Advances

Abstract

Neural circuits expressing the serotonin 2C receptor regulate memory in mice and humans

Declined memory is a hallmark of Alzheimer’s disease (AD). Experiments in rodents and human postmortem studies suggest that serotonin (5-hydroxytryptamine, 5-HT) plays a role in memory, but the underlying mechanisms are unknown. Here, we investigate the role of the 5-HT 2C receptor (5-HT2CR) in regulating memory. Transgenic mice expressing a humanized HTR2C mutation exhibit impaired plasticity of hippocampal ventral CA1 (vCA1) neurons and reduced memory. Further, 5-HT neurons project to and synapse onto vCA1 neurons. Disruption of 5-HT synthesis in vCA1-projecting neurons or deletion of 5-HT2CRs in the vCA1 impairs neural plasticity and memory. We show that a selective 5-HT2CR agonist, lorcaserin, improves synaptic plasticity and memory in an AD mouse model. Cumulatively, we demonstrate that hippocampal 5-HT2CR signaling regulates memory, which may inform the use of 5-HT2CR agonists in the treatment of dementia.

Read more at neurosciencenews.com

Supranormal Hearing Achieved by Boosting Ear Synapses

Summary: Researchers have enhanced auditory processing in young mice by increasing inner ear synapses using neurotrophin-3. This study supports the hypothesis that synapse density impacts hidden hearing loss in humans.

The findings could lead to new treatments for hearing disorders by preserving or regenerating synapses. The study reveals that boosting synapses not only improves hearing but also enhances auditory information processing.

Key Facts:

Increasing inner ear synapses in mice led to improved auditory processing.

Synapse density is linked to hidden hearing loss, affecting hearing clarity in noise.

This research suggests potential new treatments for hearing disorders by targeting synapses.

Source: University of Michigan

A study from Michigan Medicine’s Kresge Hearing Research Institute was able to produce supranormal hearing in mice, while also supporting a hypothesis on the cause of hidden hearing loss in humans.

The researchers had previously used similar methods—increasing the amount of the neurotrophic factor neurotrophin-3 in the inner ear—to promote the recovery of auditory responses in mice that had experienced acoustic trauma, and to improve hearing in middle-aged mice.

This study is the first to use the same approach in otherwise healthy young mice to create improved auditory processing, beyond what’s naturally occurring.

“We knew that providing Ntf3 to the inner ear in young mice increased the number of synapses between inner hair cells and auditory neurons, but we did not know what having more synapses would do to hearing,” said Gabriel Corfas, Ph.D., director of the Kresge Institute, who led the research team.

“We now show that animals with extra inner ear synapses have normal thresholds—what an audiologist would define as normal hearing—but they can process the auditory information in supranormal ways.”

The resulting paper, “From hidden hearing loss to supranormal auditory processing by neurotrophin 3-mediated modulation of inner hair cell synapse density,” was published in PLOS Biology.

About the paper

As in previous studies, the researchers altered the expression of the Ntf3 to increase the number of synapses between inner hair cells and neurons.

Inner hair cells exist inside the cochlea and convert sound waves into signals sent—via those synapses—to the brain.

This time, however, two groups of young mice were created and studied: one in which synapses were reduced, and a second—the supranormal hearing mice—in which synapses were increased.

“Previously, we have used that same molecule to regenerate synapses lost due to noise exposure in young mice, and to improve hearing in middle-aged mice, when they already start showing signs of age-related hearing loss,” said Corfas.

“This suggests that this molecule has the potential to improve hearing in humans in similar situations. The new results indicate that regenerating synapses or increasing their numbers will improve auditory processing.”

Both groups of mice underwent a Gap-Prepulse Inhibition test, which measures their ability to detect very brief auditory stimuli.

For this test, the subject is placed in a chamber with a background noise, then a loud tone that startles the mouse is presented alone or preceded by a very brief silent gap.

That gap, when detected by the mouse, reduces the startle response. Researchers then determine how long the silent gap needs to be for the mice to detect it.

Mice with fewer synapses required a much longer silent gap. That result supports a hypothesis about the relationship between synapse density and hidden hearing loss in humans.

Hidden hearing loss describes a difficulty in hearing that cannot be detected by standard testing.

People with hidden hearing loss may struggle to understand speech—or discern sounds in the presence of background noise. And results of the Gap-Prepulse Inhibition test had been previously shown to be correlated with auditory processing in humans.

A surprising find

Less expected, however, were the results of the subjects with increased synapses.

Not only did they show enhanced peaks in measured Acoustic Brain Stem response, but the mice also performed better on the Gap-Prepulse Inhibition test, suggesting an ability to process an increased amount of auditory information.

“We were surprised to find that when we increased the number of synapses, the brain was able to process the extra auditory information. And those subjects performed better than the control mice in the behavioral test,” Corfas said.

Hair cell loss had once been believed to be the primary cause of hearing loss in humans as we age. Now, however, it’s understood that the loss of inner hair cell synapses can be the first event in the hearing loss process, making therapies that preserve, regenerate and/or increase synapses exciting possible approaches for treating some hearing disorders.

“Some neurodegenerative disorders also start with loss of synapses in the brain,” Corfas said. “Therefore, the lessons from the studies in the inner ear could help in finding new therapies for some of these devastating diseases.”

About this auditory neuroscience research news

Author: Sam Page
Source: University of Michigan
Contact: Sam Page – University of Michigan
Image: The image is credited to Neuroscience News

Original Research: Open access.
“From hidden hearing loss to supranormal auditory processing by neurotrophin 3-mediated modulation of inner hair cell synapse density” by Gabriel Corfas et al. PLOS Biology

Abstract

From hidden hearing loss to supranormal auditory processing by neurotrophin 3-mediated modulation of inner hair cell synapse density

Loss of synapses between spiral ganglion neurons and inner hair cells (IHC synaptopathy) leads to an auditory neuropathy called hidden hearing loss (HHL) characterized by normal auditory thresholds but reduced amplitude of sound-evoked auditory potentials. It has been proposed that synaptopathy and HHL result in poor performance in challenging hearing tasks despite a normal audiogram. However, this has only been tested in animals after exposure to noise or ototoxic drugs, which can cause deficits beyond synaptopathy. Furthermore, the impact of supernumerary synapses on auditory processing has not been evaluated. Here, we studied mice in which IHC synapse counts were increased or decreased by altering neurotrophin 3 (Ntf3) expression in IHC supporting cells. As […]

Read more at neurosciencenews.com

Neuroscience Says 1 Simple Habit Will Help You Build Brainpower and Emotional Intelligence. Here’s How to Do It

The science is clear: Writing is good for your brain.

There are literally hundreds of studies that indicate the psychological benefits of writing. For example, Duke University researchers found that participants who engaged in a six-week program of expressive writing improved their resilience, symptoms of depression, and perceived stress.

In another study, researchers analyzed brain scans to see how writing about negative events such as failure affected the way subjects processed and dealt with such events. The researchers concluded that “expressive writing may be an effective tool to use to address negative emotions,” and that writing about a past failure could lead to improved learning.

And last year, brain researchers demonstrated that writing by hand rather than typing with a keyboard promoted more elaborate brain connectivity, which was crucial for memory formation, encoding new information, and learning.

All of this research indicates that the process of writing not only allows you to clarify thoughts, it helps you “internalize” them, making them a part of you and increasing the benefits associated with doing so. This helps you exercise your cognitive skills and develop emotional intelligence, the ability to understand and manage emotions.

But if you’re not in the habit of writing, what are some simple methods you can use to develop that habit? And how do you actually get better at it? Here are three tips to help you practice. (If you find value in this lesson, you might be interested in my free course, which teaches you how to build emotional intelligence in yourself and your team.)

1. Keep a journal.

Of all the things you can do to build emotional intelligence, journaling is probably the single most impactful.

Journaling can help you:

Clarify your thinking

Connect the dots

Get to know yourself

Gain control over your emotions

Express yourself in a healthy way

Manage anxiety

Reduce stress

Improve your mood and your mental health

Yes, a journal can be a place to share your innermost thoughts and feelings–and that’s a great way to build self-awareness.

But if you tell me “I’m not one of those people,” remember this:

A journal can also be a place to keep random thoughts–anything from stream of consciousness, to new learnings, to that great idea you got in the shower.

So, use your journal however you want. The key is to make it easy (use a dedicated notebook and a pen that feels good to use).

And although you can journal at any time, you may find it helpful to set aside a few minutes to practice at the same time every day (or at whatever cadence works for you).

2. Write to people you love.

For years, before I ever began writing professionally, my writing consisted primarily of thank-you notes, cards to friends, emails and letters to my dream girl and eventual wife. (Those letters and emails are actually the way I won her over–but that’s a story for another day.)

There are many advantages to writing friends and family:

You write with an audience in mind, which helps you build empathy.

You use an honest, “keep it real” writing style–which helps you get to know yourself and build rapport with others.

Most important, you share something of great value with loved ones. Think of how you felt the last time someone you cared about took time to write a heartfelt message to you.

You can even use this as an opportunity to save relationships that need mending, or just to get an extra smile out of someone.

3. Write appreciation notes at work.

When Doug Conant took over as CEO at Campbell’s, he transformed the company culture from toxic to award-winning. He credits an interesting habit as a big part of his strategy: writing thank-you notes.

“Most cultures don’t do a good job of celebrating contributions,” Conant once said in an interview with Fast Company. “So I developed the practice of writing notes to our employees.

“Over 10 years, it amounted to more than 30,000 notes, and we had only 20,000 employees. Wherever I’d go in the world, in employee cubicles you’d find my handwritten notes posted on their bulletin boards.”

Today, almost all written communication is electronic. When you write short, sincere, specific notes of appreciation–at work or at home–you practice relationship management, a key element of emotional intelligence. In doing so, you develop a habit that helps you, helps others, and makes your workplace or home better.

There you go. Three quick ways to get more practice writing, in a way that will help you exercise your brain and develop emotional intelligence:

1. Keep a journal.
2. Write loved ones.
3. Write appreciation notes.

The key is just to get started. Because the sooner you do, the sooner you’ll make writing a habit in your life. And the sooner you’ll use writing to make emotions work for you, instead of against you.

Read more at www.inc.com

Combining flavonoid with vitamin B6 may help preserve cognitive function

A new study of mice shows that a naturally occurring flavonoid can slow down the degradation of vitamin B6 in the brain.

A deficiency in vitamin B6 has long been associated with poorer cognitive function.

Vitamin B6 supplementation alone to improve cognition has yielded mixed results in trials.

The study’s authors hope that greater cognitive benefits may be achieved by combining the flavonoid with B6 supplementation.

Insufficient vitamin B6 is linked to cognitive impairment, and a new study presents a novel approach to maintaining adequate B6 levels.

The study in mice finds that a naturally occurring flavonoid, 7,8-dihydroxyflavone, can directly bind to and inhibit a B6-degrading enzyme, thus helping to preserve levels of B6 in the brain.

The enzyme is pyridoxal phosphatase (PDXP).

The study follows previous work from the same team, led by Antje Gohla, PhD, at the Institute of Pharmacology and Toxicology of the University of Würzburg in Germany. That work demonstrated improved spatial learning and memory capacity in mice when their pyridoxal phosphatase was deactivated.

The study is published in eLife.

Jacqueline Becker, PhD, neuropsychologist and health services researcher at the Icahn School of Medicine at Mount Sinai’s Division of General Internal Medicine, was not involved in the study.

“Several studies have examined the impact of B6 on cognition,” said Becker. “Specifically, that maintaining adequate levels of B vitamins, and vitamin B6 in particular, is essential for optimal neurotransmitter synthesis and homocysteine metabolism, and thus may have a direct impact on cognitive function.”

“Vitamin B6 deficiency has long been linked to cognitive impairment, particularly in areas that correlate with hippocampal functioning,” Becker said.

The hippocampus is believed to be important for age-dependent memory consolidation and learning and, therefore, cognition.

In the brain, said Becker, B6 “aids in the synthesis of neurotransmitters — e.g., serotonin, dopamine, gamma-aminobutyric acid — and helps reduce homocysteine levels in the blood.”

She noted as well that B6 is linked to mood, a known factor in cognitive health.

“Cognitive dysfunction is a cardinal symptom of depression, particularly deficits in attention and psychomotor speed,” she pointed out.

So far, the benefits of enhancing levels of B6 via supplementation as a therapeutic method are unclear. Clinical trials have yielded mixed results, “particularly in areas that correlate with hippocampal functioning,” according to Becker.

The new study may help explain that. Gohla said her team found that “PDXP is substantially upregulated — [or strengthened] — in the hippocampus of middle-aged compared to young mice.”

This aligns with the memory loss that occurs with aging.

Said Gohla, “This suggests that a therapeutic vitamin B6 supplementation alone may not be sufficient to elevate the levels of B6 in the brain — simply because the supplemented B6 would be immediately degraded by hyperactive PDXP.”

“In contrast,” the study finds, “combining B6 supplements with PDXP inhibitors that block B6 degradation may be much more effective in boosting cellular B6 levels.”

In the team’s previous work, they found the spatial learning and memory capacity of mice was improved when PDXP was genetically switched off. Their performance was compared to mice with PDXP intact.

The researchers assessed the cognitive functioning of mice using a Barnes maze that provides a means to measure “hippocampus-dependent spatial reference memory by evaluating the ability to learn and remember the location of a hidden escape zone using a set of visual cues,” said Gohla.

In the “maze,” mice are placed on a platform with unpleasantly bright lighting. While there were a number of possible “escape” holes for the mice on the platform, only some were available for use.

“The [PDXP-less] mice learn to locate the correct escape hole with the help of visual cues, such as colored shapes or patterns, that are placed around the platform,” said Gohla.

In the new study, the subject mice were sacrificed, and the researchers used small-molecule screening, protein crystallography, and biolayer interferometry to observe 7,8-dihydroxyflavone directly affecting the action of pyridoxal phosphatase.

Given the differences between mice and humans, there may be concerns that the study’s findings will not be applicable to people.

Becker said, however, that “the two functions of B6 in cognition mentioned — i.e., neurotransmitter synthesis and homocysteine metabolism — are thought to be mechanistically interchangeable between mice and humans.”

“So it is conceivable that [the research] would translate barring obvious environmental confounders (e.g., alcohol consumption, poor diet, etc.),” Becker said.

“We expect,” said Gohla, “that 7,8-dihydroxyflavone will inhibit PDXP in the brain and, together with supplemented B6, increase cellular B6 levels. If and how this may then increase cognition is an unresolved question that we will address in future work.”

She pointed out the complexities involved, saying, “there are many B6-dependent enzymes in the brain, including those that regulate neurotransmitter levels and neuronal signaling.” Among the things that are not known are whether a single enzyme or transmitter is a vital element, or whether multiple such factors are involved.

“More studies,” Becker said, “are needed to determine the actual role of B6 supplementation in neurodegenerative diseases, as well as its bioavailability in synthetic (vs. dietary) forms and the appropriate doses.”

Becker hypothesized that the therapeutic potential of B6 management will need to be evaluated on an individual basis.

She suggested that the most benefit would likely occur when it is “combined with appropriate diet/nutrition and lifestyle that is optimized to support cognition and mental health — for example, considering other B vitamins (e.g., B12) and other micronutrients that are critical for brain health (e.g., folic acid, etc.).”

Read more at www.medicalnewstoday.com

Ozempic Boosts Fullness Pre-Meal via Hypothalamus

Summary: A new study shows GLP-1 receptor agonists increase pre-meal fullness by activating neurons in the dorsomedial hypothalamus. This mechanism helps prevent overeating, offering insights into obesity treatment.

The research highlights how GLP-1 impacts food perception and hypothalamic responses to food cues, enhancing satiation before food intake.

Key Facts:

GLP-1 receptor agonists activate neurons in the dorsomedial hypothalamus to promote fullness.

The study involved clinical trials with obese individuals, showing increased satiation indices with GLP-1RA treatment.

Optogenetic manipulation confirmed the role of hypothalamic neurons in encoding preingestive satiation.

Source: AAAS

GLP-1 receptor agonists promote the feeling of fullness before eating via neurons in the dorsomedial hypothalamus, according to a new study.

The findings offer new insights into the neural pathways by which GLP-1 receptor agonists increase the feeling of fullness to prevent overconsumption of food, which is key in mitigating obesity.

Glucagon-like-peptide-1 (GLP-1) plays an important role in signaling the feeling of fullness after eating. Preingestive satiation is a phenomenon that occurs before actual food intake, allowing animals to regulate internal status and prepare for changes.

Recently, GLP-1 receptor agonists (GLP-1RAs) have proven effective in treating obesity by affecting food cognition, diminishing hypothalamic responses to food cues, and altering food palatability perception.

These findings suggest that GLP-1RAs may play a role in preingestive satiation to control food intake. However, the central mechanisms underlying these effects are poorly understood, and the targets of GLP-1RAs remain controversial.

Here, Kyu Sik Kim and colleagues present the results of a phase-specific clinical trial involving obese individuals.

Kim et al. conducted satiation surveys at baseline, pre-ingestive, and ingestive phases, with or without GLP-1RA treatment.

The results showed that GLP-1RA treatment consistently increased the satiation index (overall feeling of fullness) across all phases, while the control group showed a decline from baseline to pre-ingestive phase.

In the pre-ingestive phase, GLP-1RA significantly increased the satiation index compared to baseline, enhancing prospective food ingestion, food reward, and motivation satiation indices.

Through analysis of human and mouse brain samples, Kim et al. identified neural circuits in the dorsomedial hypothalamus that interact with these agonists to induce dampening of the desire for food.

Optogenetic manipulation of these neurons caused satiation, and calcium imaging demonstrated their active involvement in encoding preingestive satiation.

About this neuropharmacology and hunger research news

Author: Science Press Package Team
Source: AAAS
Contact: Science Press Package Team – AAAS
Image: The image is credited to Neuroscience News

Original Research: Closed access.
“GLP-1 increases pre-ingestive satiation via hypothalamic circuits in mice and humans” by Kevin W. Williams et al. Science

Abstract

GLP-1 increases pre-ingestive satiation via hypothalamic circuits in mice and humans

GLP-1 receptor agonists (GLP-1RAs) are effective anti-obesity drugs. However, the precise central mechanisms of GLP-1RAs remain elusive. We administered GLP-1RAs to obese patients and observed heightened sense of preingestive satiation.

Analysis of human and mouse brain samples pinpointed GLP-1R neurons in the dorsomedial hypothalamus (DMH) as candidates for encoding preingestive satiation. Optogenetic manipulation of DMH GLP-1R neurons caused satiation.

Calcium imaging demonstrated that these neurons are actively involved in encoding preingestive satiation. GLP-1RA administration increased the activity of DMH GLP-1R neurons selectively during eating behavior. We further identified an intricate interplay between DMH GLP-1R neurons and arcuate NPY/AgRP neurons (ARC NPY/AgRP), to regulate food intake. Our findings reveal a hypothalamic mechanism through which GLP-1RAs control preingestive satiation, offering novel neural targets for obesity and metabolic diseases.

Read more at neurosciencenews.com

CBD use during pregnancy produces strange behavior in offspring

While CBD or cannabidiol is now widely available, widely used and generally considered safe, new research has found that its use during pregnancy may produce some strange behavior in offspring and changes to the way their brains process sensory information.

Because it doesn’t produce perception-altering effects like THC does, cannabidiol (CBD), one of the active ingredients in cannabis, is deemed to be safe. Indeed, it’s been shown to be an effective treatment for a wide variety of health issues, from reducing epileptic seizures to treating inflammation.

Indicative of public belief about the safety of CBD, recently published research using data from the International Cannabis Policy Study found that one in five pregnant women (20.4%) in the US and Canada were using CBD-only products compared to 11.3% of non-pregnant women. Reasons for use included anxiety and depression, pain, headache, and morning sickness.

However, new research presented at the Federation of European Neuroscience Societies (FENS) Forum 2024, currently being held in Vienna, suggests that using CBD during pregnancy may not be as safe as it’s considered to be.

“Scientific evidence has proven that CBD crosses the placenta, can reach the brain of rodents and human embryos and is also present in breast milk; therefore, it’s a public health priority to understand the impact of CBD on the developing nervous system as we don’t yet know the consequences of CBD exposure to the brain during development,” said Alba Caceres Rodriguez, a PhD student at INSERM Aix-Marseille University, a French public research organization that focuses on human health. “An important part of the research we are conducting in mice is longitudinal follow-up of the behavioral consequences of gestational exposure to CBD, and we are also investigating what is happening to the neurons in the brain that may be the basis of such changes in behavioral traits.”

The researchers administered a low dose of CBD (3 mg/kg) to pregnant mice by an injection under the skin from days five to 18 of the gestational period, more than two-thirds of pregnancy. Injecting the CBD, rather than administering it orally, ensured that each mouse received the same concentration. Another group of pregnant mice did not receive CBD and acted as controls.

Pups born to both groups of mice were tested once they reached adulthood. The researchers placed the mice in a new environment, and their social interactions were monitored with Live Mouse Tracker, software that uses a depth-sensing camera and machine learning to analyze the behavior of groups of mice in real time. The results suggested that CBD altered specific behaviors in the mice and that the effects were sex-dependent.

“We found a number of behavioral changes among the mice exposed to CBD,” Caceres Rodriguez said. “CBD-exposed females tended to move around their new environment more compared to females that didn’t receive CBD during gestation. Furthermore, compared to control mice, both male and female mice treated with CBD established more physical contacts with each other.”

Examining the mice’s brains, the researchers found changes in two parts of the insular cortex (IC), the brain’s ‘integration hub’: the anterior IC (aIC), responsible for processing emotional and social signals, and the posterior IC (pIC), which processes pain perceptions and the body’s physical and emotional state.

“Our results reveal that prenatal exposure to CBD profoundly changes the functionality of neurons in the insular cortex,” Iezzi said. “We saw differences according to sex and also according to IC sub-regions. In particular, pyramidal neurons in the pIC lose their cellular identity following prenatal exposure to CBD and no longer behave like typical pIC neurons. This could have negative consequences on specific functions of the pIC. These neurons specialize in integrating sensory information from the environment and the internal state of the body in order to generate an appropriate behavioral response. Therefore, a loss of pIC differentiation following prenatal exposure to CBD can have a considerable impact on the ability to understand and react properly to the environment.”

The study’s findings advance our understanding of the effect of CBD use on the developing fetus and challenge the belief that it’s safe for pregnant women, the researchers say.

“These findings have significant implications for understanding the effects of CBD on fetal life, changing the general idea that CBD is a universally safe compound, and revealing the need for additional studies on the effect of prenatal CBD exposure,” said Iezzi. “Furthermore, several studies have shown that IC dysfunction increases the risk of developing psychiatric disorders, including anxiety, addiction, depression and schizophrenia.”

One of the study’s limitations is that the mice were given a controlled dose of CBD, whereas human females may be more likely to use CBD intermittently throughout their pregnancy as needed to help with symptoms such as morning sickness. They might also take considerably larger doses than those given to the mice.

Nonetheless, the researchers say a strength was the study’s naturalistic approach.

“A strength of our study is that we are able to reproduce a more naturalistic environment, which permits us to study group dynamics that would be impossible to unmask with other conventional task-based tests,” said Caceres Rodriguez. “This study serves as a good starting point to dive deeper to understand the actual consequences of these changes in overall social interactions in the long term.”

This research has only been presented at the FENS Forum 2024 and hasn’t been published or peer-reviewed yet.

Source: INSERM Aix-Marseille Université via Scimex

Read more at newatlas.com

Next platform for brain-inspired computing

Computers have come so far in terms of their power and potential, rivaling and even eclipsing human brains in their ability to store and crunch data, make predictions and communicate. But there is one domain where human brains continue to dominate: energy efficiency.

“The most efficient computers are still approximately four orders of magnitude — that’s 10,000 times — higher in energy requirements compared to the human brain for specific tasks such as image processing and recognition, although they outperform the brain in tasks like mathematical calculations,” said UC Santa Barbara electrical and computer engineering Professor Kaustav Banerjee, a world expert in the realm of nanoelectronics. “Making computers more energy efficient is crucial because the worldwide energy consumption by on-chip electronics stands at #4 in the global rankings of nation-wise energy consumption, and it is increasing exponentially each year, fueled by applications such as artificial intelligence.” Additionally, he said, the problem of energy inefficient computing is particularly pressing in the context of global warming, “highlighting the urgent need to develop more energy-efficient computing technologies.”

Neuromorphic (NM) computing has emerged as a promising way to bridge the energy efficiency gap. By mimicking the structure and operations of the human brain, where processing occurs in parallel across an array of low power-consuming neurons, it may be possible to approach brain-like energy efficiency. In a paper published in the journal Nature Communications, Banerjee and co-workers Arnab Pal, Zichun Chai, Junkai Jiang and Wei Cao, in collaboration with researchers Vivek De and Mike Davies from Intel Labs, propose such an ultra-energy-efficient platform, using 2D transition metal dichalcogenide (TMD)-based tunnel-field-effect transistors (TFETs). Their platform, the researchers say, can bring the energy requirements to within two orders of magnitude (about 100 times) with respect to the human brain.

Leakage currents and subthreshold swing

The concept of neuromorphic computing has been around for decades, though the research around it has intensified only relatively recently. Advances in circuitry that enable smaller, denser arrays of transistors, and therefore more processing and functionality for less power consumption are just scratching the surface of what can be done to enable brain-inspired computing. Add to that an appetite generated by its many potential applications, such as AI and the Internet-of-Things, and it’s clear that expanding the options for a hardware platform for neuromorphic computing must be addressed in order to move forward.

Enter the team’s 2D tunnel-transistors. Emerging out of Banerjee’s longstanding research efforts to develop high-performance, low-power consumption transistors to meet the growing hunger for processing without a matching increase in power requirement, these atomically thin, nanoscale transistors are responsive at low voltages, and as the foundation of the researchers’ NM platform, can mimic the highly energy efficient operations of the human brain. In addition to lower off-state currents, the 2D TFETs also have a low subthreshold swing (SS), a parameter that describes how effectively a transistor can switch from off to on. According to Banerjee, a lower SS means a lower operating voltage, and faster and more efficient switching.
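For readers unfamiliar with the term, here is a brief sketch of the standard textbook definition of subthreshold swing; this is general device-physics background rather than a result from the paper.

```latex
% Subthreshold swing: the gate-voltage increase needed for a tenfold increase in drain current.
% Conventional MOSFETs, limited by thermionic emission, cannot do better than roughly
% 60 mV/decade at room temperature; tunnel FETs switch via band-to-band tunneling and
% are not bound by this limit, which is why they can operate at lower voltages.
\[
  SS = \frac{\partial V_{GS}}{\partial \left(\log_{10} I_D\right)},
  \qquad
  SS_{\mathrm{MOSFET}} \ge \frac{kT}{q}\,\ln 10 \approx 60\ \mathrm{mV/decade}\ \text{at } T = 300\ \mathrm{K}.
\]
```

A lower SS therefore translates directly into the lower operating voltages and steeper, more efficient switching described above.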

“Neuromorphic computing architectures are designed to operate with very sparse firing circuits,” said lead author Arnab Pal, “meaning they mimic how neurons in the brain fire only when necessary.” In contrast to the more conventional von Neumann architecture of today’s computers, in which data is processed sequentially, memory and processing components are separated, and power is drawn continuously throughout operation, an event-driven system such as an NM computer fires up only when there is input to process, and memory and processing are distributed across an array of transistors. Companies like Intel and IBM have developed brain-inspired platforms, deploying billions of interconnected transistors and generating significant energy savings.
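To make the idea of event-driven, sparse firing concrete, here is a minimal leaky integrate-and-fire sketch in Python. It is a toy model for intuition only, not the TFET circuit or the Intel/IBM platforms described above, and the function name and parameter values are invented for the example.

```python
# Toy event-driven neuron: work (a spike) happens only when accumulated input
# crosses a threshold, unlike an always-on pipeline that computes every cycle.

def lif_neuron(input_currents, threshold=1.0, leak=0.9):
    """Return the time steps at which a leaky integrate-and-fire neuron spikes.

    input_currents: per-time-step input (mostly zeros, i.e. sparse activity).
    threshold:      membrane potential at which a spike is emitted.
    leak:           per-step decay factor of the membrane potential.
    """
    potential = 0.0
    spike_times = []
    for t, current in enumerate(input_currents):
        potential = potential * leak + current   # integrate input with leak
        if potential >= threshold:               # fire only when necessary
            spike_times.append(t)
            potential = 0.0                      # reset after the spike
    return spike_times

if __name__ == "__main__":
    # Sparse input: the neuron stays quiet except around the few steps carrying input.
    inputs = [0, 0, 0.6, 0.7, 0, 0, 0, 0, 0.5, 0.9, 0, 0]
    print(lif_neuron(inputs))  # [3, 9]
```

The point of the sketch is the control flow: downstream activity is triggered only when input actually drives the potential over threshold, which is the sparse, on-demand behavior that neuromorphic hardware exploits to save energy.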

However, there’s still room for energy efficiency improvement, according to the researchers.

“In these systems, most of the energy is lost through leakage currents when the transistors are off, rather than during their active state,” Banerjee explained. A ubiquitous phenomenon in the world of electronics, leakage currents are small amounts of electricity that flow through a circuit even when it is in the off state (but still connected to power). According to the paper, current NM chips use traditional metal-oxide-semiconductor field-effect transistors (MOSFETs) which have a high on-state current, but also high off-state leakage. “Since the power efficiency of these chips is constrained by the off-state leakage, our approach — using tunneling transistors with much lower off-state current — can greatly improve power efficiency,” Banerjee said.

When integrated into a neuromorphic circuit, which emulates the firing and reset of neurons, the TFETs proved themselves more energy efficient than state-of-the-art MOSFETs, particularly the FinFETs (a MOSFET design that incorporates vertical “fins” as a way to provide better control of switching and leakage). TFETs are still in the experimental stage; however, the performance and energy efficiency of neuromorphic circuits based on them make them a promising candidate for the next generation of brain-inspired computing.

According to co-authors Vivek De (Intel Fellow) and Mike Davies (Director of Intel’s Neuromorphic Computing Lab), “Once realized, this platform can bring the energy consumption in chips to within two orders of magnitude with respect to the human brain — not accounting for the interface circuitry and memory storage elements. This represents a significant improvement from what is achievable today.”

Eventually, one can realize three-dimensional versions of these 2D-TFET based neuromorphic circuits to provide even closer emulation of the human brain, added Banerjee, widely recognized as one of the key visionaries behind 3D integrated circuits that are now witnessing wide scale commercial proliferation.

Read more at www.sciencedaily.com

Scientists Identify Key Pathway for Brain Health Boost via Ketogenic Diet

The ketogenic diet improves memory in aging mice by enhancing synapse function through a newly identified molecular signaling pathway involving ketone bodies, particularly β-hydroxybutyrate (BHB). Future research seeks to replicate these benefits without the diet by targeting specific pathways. Understanding these mechanisms offers new targets for enhancing memory, potentially without the need for a ketogenic diet.

The ketogenic diet has both enthusiasts and critics among dieters, but it undeniably has a scientifically documented impact on memory in mice. In their research, Buck Institute scientists and a team from the University of Chile discovered how this high-fat, low-carbohydrate diet enhances memory in older mice. They identified a novel molecular signaling pathway that improves synapse function, shedding light on the diet’s benefits for brain health and aging.

These findings, published in the June 5, 2024, issue of Cell Reports Medicine , offer new avenues for targeting memory effects at a molecular level without the need for a ketogenic diet or its byproducts.

“Our work indicates that the effects of the ketogenic diet benefit brain function broadly, and we provide a mechanism of action that offers a strategy for the maintenance and improvement of this function during aging,” said the study’s senior author, Christian González-Billault, PhD, who is a professor at the Universidad de Chile and director of their Geroscience Center for Brain Health and Metabolism, and adjunct professor at the Buck Institute.

Collaboration and Previous Findings

“Building off our previous work showing that a ketogenic diet improves healthspan and memory in aging mice, this new work indicates that we can start with older animals and still improve the health of the aging brain, and that the changes begin to happen relatively quickly,” said John Newman, MD, PhD, whose laboratory at Buck collaborated with Dr. González-Billault on the study. Newman is both an assistant professor at the Buck Institute, and a geriatrician at University of California, San Francisco. “It is the most detailed study to date of the ketogenic diet and aging brain in mice.”

More than a century ago, researchers observed that rats that consumed less food lived longer. “We now know that being able to manipulate lifespan is not about specifically eating less,” said Newman, but actually is related to signals inside cells that turn on and off specific pathways in response to available nutrients. Many of those pathways are related to aging, such as controlling protein turnover and metabolism.

Some of those signals are the ketone bodies, which consist of acetoacetate (AcAc), β-hydroxybutyrate (BHB), and to a much lesser extent, acetone. These molecules are routinely produced in the liver. They ramp up when glucose is in short supply, whether due to caloric restriction, intense exercise, or low carbohydrate intake, such as with a ketogenic diet.

Previous Studies on Ketogenic Diet and Longevity

Seven years ago, Newman led a team that published the first proof of the concept that if a ketogenic diet exposes mice to increased levels of ketone bodies over much of their adult life, it helps them to live longer and age in a more healthy way. “The most striking effect on their health as they aged was that their memory was preserved; it was possibly even better than when they were younger,” he said.

The current study, designed to determine which part of the ketogenic diet was having the effect and how it was changing the brain at a molecular level to improve memory, was led by González-Billault in collaboration with scientists at the Buck. Mice on a ketogenic diet received 90 percent of their calories from fat and 10 percent from protein, while mice on a control diet received the same amount of protein but only 13 percent fat. The test mice, of “advanced age” (more than two years old), received one week of the ketogenic diet cycled with one week of the control diet, to keep them from overeating and becoming obese.

Neurophysiological and Behavioral Experiments

The benefits of the ketogenic diet, said González-Billault, were demonstrated through neurophysiological and behavioral experiments with the mice that test how well the mechanisms involved in memory generation, storage, and retrieval function in aged animals. When these showed that the ketogenic diet appeared to benefit how well the synapses responsible for memory worked, the team took a deep dive into the protein composition at these synapses in the hippocampus, in collaboration with Buck professor Birgit Schilling, PhD, who directs the Proteomics and Mass Spectrometry Center.

“Surprisingly, we saw that the ketogenic diet caused dramatic changes in the proteins of the synapse,” said Schilling. Even more surprising, she said, was that the changes started after a relatively brief exposure to the diet (tested after only one week on the diet) and only became more pronounced over time (tested again after six weeks and a year).

Further testing indicated that in synapses, a particular signaling pathway (protein kinase A, which is critical to synapse activity) was activated by the ketogenic diet. In isolated cells, the team then showed that BHB, the main ketone body produced on a ketogenic diet, appears to activate this pathway. This leads to the idea, said González-Billault, that ketone bodies (specifically BHB) play a crucial role not only as an energy source, but also as a signaling molecule.

“BHB is almost certainly not the only molecule in play, but we think this is an important part of understanding how the ketogenic diet and ketone bodies work,” said Newman. “This is the first study to really connect deep molecular mechanisms of ketone bodies all the way through to improving the aging brain.”

Looking forward, he said, the next step would be to see if the same memory protection could be achieved by using BHB alone, or possibly going even more targeted than that by manipulating the protein kinase A signaling pathway directly. “If we could recreate some of the big-picture effects on synapse function and memory just by manipulating that signaling pathway in the right cells,” he said, “we wouldn’t even need to eat a ketogenic diet in the end.”

Reference: “Ketogenic […]

Read more at scitechdaily.com

Science is just starting to understand the benefits of athletes putting their brains in ‘auto pilot’


Brazil star Neymar dribbles a ball towards goal. Ricardo Rimoli/EPA


In a much-publicised 2014 study, two Japanese neuroscience researchers found Brazilian soccer star Neymar used 90% less of his brain capacity – measured by neuron signals – compared to a group of Spanish second division players.

In simple terms, it was as if his brain was on auto pilot compared to the lower-level athletes.

The study highlighted the old adage that “the majority of elite sport is played above the neck”, but its impact on sports coaching was constrained by a lack of understanding of how athletes’ brains worked.

More recent research under the umbrella label of “predictive processing” is able to shed light on this phenomenon with practical implications for sports coaching.

What is predictive processing?

Predictive processing breaks with the longstanding view of the brain as a computer that somehow sifts through all the sensory information it constantly receives, prioritises what is important and then decides on what action to take.

Instead, predictive processing views the brain as constantly predicting its input, based on best-bet estimations generated by its predictive model. It’s a case of “we see what we believe” rather than “we believe what we see”.

Of course, being a prediction, there will be some errors or surprises. Critically, it is only these errors or surprises which are processed by the brain, leading to action.

A practical example of predictive processing

Consider the following scenario.

After driving home from work listening to talkback radio, I realised I had no memory of the 20-minute trip.

Why? The predictive processing view of the brain has a simple and powerful explanation.

During the trip, which was uneventful, my predictive model of traffic on that specific route meshed with the sensory information during the drive. There were no surprises, such as an accident or detour, that required further processing.

In the event of an accident or a detour, further processing would have been generated, triggering increased attention.

In contrast, my grandson Hugo, who has had his “P” licence for a week, is still developing his traffic predictive model and has to process significantly more information in the model building process.

He is like a second-division soccer player compared to my Neymar when it comes to driving.

My predictive model worked because it recognised the likely predictive models of other road users. But if I saw a young man on a motorised scooter without a helmet, it would trigger attention as I’d be less sure of his traffic predictive model and likely couldn’t anticipate his behaviour.

When things go as expected, our brains don’t have to work as hard. But when surprises happen, our brains have to work harder.

Predictive processing in sports

The traffic analogy can be extended to team sports like Australian rules football.

A defender will have a different predictive model compared to the forwards. Teams do not have to have identical shared predictive models, but rather compatible models so players can anticipate the actions of teammates.

According to predictive processing, learning is the process of building and refining one’s predictive model over time. Prediction errors or surprises are the learning trigger.
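To make the idea concrete, here is a minimal toy sketch of a predictive agent in Python. It is not a model of any real brain circuit, and the numbers, threshold and update rule are invented for illustration, but it captures the loop described above: predict the next input, compare it with what actually arrives, and update strongly only when the prediction error is surprisingly large.

```python
# Toy sketch of the predictive-processing loop described above: keep a running
# prediction of the input and update it strongly only when the prediction error
# ("surprise") is large. All numbers and thresholds are invented for illustration.

def run_predictive_agent(observations, learning_rate=0.2, surprise_threshold=2.0):
    """Track a 1-D input stream with a simple predictive model.

    Returns a list of (observation, prediction, error, surprised) tuples.
    """
    prediction = observations[0]   # start by taking the first observation as the model
    history = []
    for obs in observations:
        error = obs - prediction                     # prediction error
        surprised = abs(error) > surprise_threshold  # only big errors count as "surprise"
        # Small errors are absorbed quietly; big errors drive a strong model update,
        # a crude stand-in for "attention" being recruited.
        rate = 0.8 if surprised else learning_rate
        prediction += rate * error
        history.append((obs, round(prediction, 2), round(error, 2), surprised))
    return history

# A routine commute (predictable values) interrupted by an unexpected detour.
commute = [10, 10, 11, 10, 10, 25, 24, 10, 10]
for step in run_predictive_agent(commute):
    print(step)
```

Run on the “routine commute” with one unexpected detour, the agent coasts through the predictable stretch with tiny updates and only reacts strongly when the surprising values appear – a crude analogue of the experienced driver versus the learner.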

Predictive processing also distinguishes two types of learning.

The first is finetuning one’s predictive model, such as an athlete learning their coach’s strategy and skill execution.

The second is when a current predictive model is no longer appropriate because the environment has changed and a new predictive model is required.

Take the case of an Irish footballer who moves to Australia to play AFL, picking up an Australian rules football for the first time. Their Gaelic football predictive model is no longer appropriate given the introduction of tackling and kicking with an oval-shaped, not a round, ball.

Irish players have to not only learn new skills, such as tackling, but also how to play their role in the team game plan. In the driving analogy, they have to learn to drive a different vehicle in traffic. And they need to understand and anticipate the behaviour of teammates and the opposition.

The element of surprise at training

From a coaching perspective, athletes must be challenged by surprises to improve their predictive processing. A coach should therefore design training sessions that provide unexpected variations from established patterns. Rather than striving for perfectly executed training drills, a coach should create disruptions and surprises that force athletes to learn new and hopefully creative patterns of behaviour. As the athlete’s predictive model improves, they require fewer cognitive resources, which can result in improved anticipation and ability to withstand pressure.

As a community youth soccer coach, I’ve experimented with a number of innovations with the aim of creating surprises. For instance, we played a seven-on-seven small-sided game (on a smaller pitch) where players were limited to one touch only. A number of players complained they were not skilled enough, but with encouragement, were prepared to give it a go.

I had no idea how the game would work. Interestingly, the players were more conscious of the game situation and how they would pass the ball after receiving it – they didn’t have the time to receive and control the ball before making a decision.

Other surprise innovations can include playing a small-sided game with a different type of ball, giving players a card that specifies the number of touches they are allowed and whether they have to pass forwards or backwards, or using headbands instead of bibs to distinguish between teams. Coaches can then discuss how the athletes adjusted, what they learned, and how to apply the learnings in a competitive game.

Not many athletes will ever reach Neymar’s heights, but perhaps understanding and improving predictive processing can help them become the best versions of themselves.

Read more at theconversation.com

Signaling pathway in brain helps maintain balance in microglia, prevent cognitive deficit


A 3D-rendered confocal image shows a single TGF-β1 knockout microglia being activated while it is surrounded by normal microglia that do not have the gene deletion. Credit: Agnes (Yu) Luo

A new study led by University of Cincinnati researchers sheds new light on the role of a signaling pathway in the brain in maintaining health and preventing inflammation and cognitive deficits.

UC’s Agnes (Yu) Luo, Ph.D., is the corresponding author of the research, published in the journal Nature Communications, which focused on a signaling pathway called TGF-β that plays a number of roles depending on where it is located in the body.

Luo explained that signaling pathways in the body control different cell functions and require two components: a type of molecule called a ligand and a receptor that the ligand binds to and activates to start the signaling.

Prior to this study, it was known that the TGF-β signaling pathway was important in brain immune cells called microglia in maintaining their balance, but its role in maintaining cognitive function was largely unknown. Additionally, the precise source of the TGF-β ligand in the brain was also unknown.

Luo said the researchers used state-of-the-art tools and found for the first time that microglia make the TGF-β ligand in the brain to prevent neuroinflammation.

“Microglia cells are the innate immune cells of the brain, and what surprised us most is that they each make their own TGF-β ligand,” said Luo, professor and vice chair in the Department of Molecular and Cellular Biosciences in UC’s College of Medicine. “This TGF-β ligand binds to the receptor on the microglia cell itself, and they use this signaling to stay in homeostasis. This self-produced ligand binds to receptors on the cell’s surface to keep each cell in a constantly balanced–and not in an inflamed–state.”

While it was previously known that TGF-β signaling helps keep microglia in balance, Luo said it was not known that microglia make the ligands themselves in a “spatially and precisely controlled” manner carried out by each individual cell, a mechanism called autocrine signaling.

“You can think of these microglia cells as being, in a way, ‘selfish,’ as they only make the ligand to keep themselves in balance and not inflamed,” said graduate student and study co-author Elliot Wegman. “This, thereby, provides a very precise mechanism to regulate local states of inflammation in the microenvironment of the brain.”

Using animal models, the team additionally found that when the TGF-β ligand is genetically deleted from microglia, it leads to global neuroinflammation in the brain.

“This suggests that the neuroinflammation in microglia is sufficient by itself without other causes to drive cognitive deficit,” Luo said. “We show the direct cause and link between these events.”

Moving forward, the team will investigate whether cognitive deficits can be slowed, stopped or potentially reversed by boosting the TGF-β ligand and signaling pathway in the brain under conditions where TGF-β signaling becomes compromised.

“We’re investigating whether restoring the TGF-β signaling pathway and revitalizing its signaling can then ameliorate disease-related or age-associated cognitive deficits,” she said. “The long-term goal of our research is to modify the brain environment to better support the survival of the neurons or promote repair of the brain after injury or damage.”

More information: Alicia Bedolla et al, Adult microglial TGFβ1 is required for microglia homeostasis via an autocrine mechanism to maintain cognitive function in mice, Nature Communications (2024). DOI: 10.1038/s41467-024-49596-0

Provided by University of Cincinnati

Read more at medicalxpress.com

Neuroscience Says This 2-Minute Brain Exercise Will Keep Your Mind Sharp and Focused All Day (and Reduce Your Stress)


The workplace is a magnet for stress and anxiety. You’ve probably come home after a bad day filled with frustration and annoyance from the day’s events, as I have. Whether it’s a missed deadline, a conflict of interest, an argument that divides the office, or an unhappy customer, stress is a natural part of work.

Stress can also have a negative impact on our bodies if we don’t manage it. The next morning, when you wake up, you might find yourself worrying about everything that happened the day before and everything you need to do to rectify it.

This is the wrong way to start your day.

Start your day right

Did you know that we release the most stress hormones just minutes after waking up? According to research published in Life Sciences, this happens because thinking about the day ahead triggers our fight-or-flight instinct, causing cortisol to be released into our blood. Instead of caving to morning stress, here’s a better and simpler approach, backed by science: When you wake up, spend two or three minutes sitting in bed simply noticing your breath. As anxious thoughts about the day come up, let them go and refocus on your breathing.

That’s the technique many of us are already familiar with, but it’s worth refreshing. It’s mindfulness.

Rather than letting our minds race, experts advise us to take deep, slow breaths and feel the sensation of the air entering and leaving our lungs. Notice the sights and sounds around you–the birds chirping outside your window, the sky’s changing hues as the sun rises. Acknowledge any feelings of frustration and anger from the day before without judgment, and then gently bring your focus back to your breathing and surroundings.

By practicing mindfulness in this situation, scientists say you can reduce your morning stress and approach an otherwise irritating day with calm presence.

Raise your focus and awareness throughout the day

That’s a good starting point, but practicing mindfulness can also increase your effectiveness as you head to the office and prepare for the day ahead. According to researchers, a mindful mind is defined by two essential skills: focus and awareness. Focus helps you to concentrate on what you’re doing in the moment, while awareness is recognizing and letting go of unnecessary distractions as they arise. Together, these skills help you bring a sharp, clear mind to every task from the moment you step into the office.

By focusing on the task at hand instead of multitasking and recognizing internal and external distractions (like your notifications or chatty co-workers), mindfulness helps increase effectiveness, reduce mistakes, and enhance creativity. It really is a thing of beauty and will keep you sharp and productive all day long.

Tips to stay mindful

To stay focused and present as the workday progresses and keep your brain from being hijacked, there are practical things you can do to maintain a mindful mind. For example:

1. Practice mindfulness when checking your inbox. Too many of us are addicted to email, which compromises our focus. Instead of falling for that trap, use mindfulness to prioritize what’s important and avoid checking email first thing in the morning to stay focused and creative.

2. Practice mindfulness to lead more effective meetings. Take two minutes to practice mindfulness before the meeting, even while walking to it. Consider starting the meeting with two minutes of silent time for everyone to arrive physically and mentally. Also, try ending the meeting five minutes earlier to allow for a mindful transition to the next meeting for all participants.

3. Practice mindfulness as the day wears on. By midafternoon, most of us are getting tired and losing focus. To avoid making poor decisions, experts suggest setting a timer to ring every hour after lunch and taking a one-minute mindfulness break to breathe, refocus, and avoid going on autopilot and making impulsive decisions.

4. Practice mindfulness at the end of your day. Take five to 10 minutes to turn off distractions (your car radio if you commute home) and focus on your breath. This will help you release stress and be fully present with your family when you get home.

Read more at www.inc.com

Resiliency shaped by activity in the gut microbiome and brain


A new UCLA Health study has found that resilient people exhibit neural activity in brain regions associated with improved cognition and emotional regulation, and were more mindful and better at describing their feelings. The same group also exhibited gut microbiome activity linked to a healthy gut, with reduced inflammation and a stronger gut barrier.

For the study, rather than examine microbiome activity and composition linked to disease conditions — like anxiety and depression — the researchers wanted to flip the script and study the gut microbiome and brain in healthy, resilient people who effectively cope with different types of stress, including discrimination and social isolation.

“If we can identify what a healthy resilient brain and microbiome look like, then we can develop targeted interventions to those areas to reduce stress,” said Arpana Gupta, PhD, senior author and co-director of the UCLA Goodman-Luskin Microbiome Center. This is believed to be the first study to explore the intersection of resiliency, the brain, and the gut microbiome.

Gupta and her team focused on methods to cope with stress because research has shown that untreated stress can increase the risk of heart disease, stroke, obesity, and diabetes. While stress is an inevitable part of life, studying how to handle stress can help prevent developing diseases.

To conduct the study, published in Nature Mental Health, the researchers surveyed 116 people about their resiliency — like trust in one’s instincts and positive acceptance of change — and separated them into two groups. One group ranked high on the resiliency scale and the other group ranked low. The participants also underwent MRI imaging and gave stool samples two or three days before their scans.

The researchers found that people in the high resiliency group were less anxious and depressed, less prone to judge, and had activity in regions of the brain associated with emotional regulation and better cognition compared to the group with low resiliency. “When a stressor happens, often we go to this aroused fight or flight response, and this impairs the brakes in your brain,” Gupta said. “But the highly resilient individuals in the study were found to be better at regulating their emotions, less likely to catastrophize, and keep a level head,” added Desiree Delgadillo, postdoctoral researcher and one of the first authors.

The high resiliency group also had different microbiome activity than the low resiliency group. Namely, the high resiliency group’s microbiomes excreted metabolites and exhibited gene activity associated with low inflammation and a strong and healthy gut barrier. A weak gut barrier, otherwise known as a leaky gut, is caused by inflammation and impairs the gut’s ability to absorb essential nutrients needed by the body while blocking toxins from entering.

The researchers were surprised to find these microbiome signatures associated with the high resiliency group.

“Resilience truly is a whole-body phenomenon that not only affects your brain but also your microbiome and what metabolites that it is producing,” Gupta said. “We have this whole community of microbes in our gut that exudes these therapeutic properties and biochemicals, so I’m looking forward to building upon this research,” Delgadillo said.

The team’s future research will study whether an intervention to increase resilience will change brain and gut microbiome activity. “We could have treatments that target both the brain and the gut that can maybe one day prevent disease,” Gupta said.


Read more at www.sciencedaily.com

Unlocking the entrepreneurial brain: New perspectives on cognitive flexibility

In a recent study led by the University of Liège, researchers delved into the intersection of the fields of entrepreneurship and neuroscience, looking specifically at the cognitive flexibility of habitual entrepreneurs — those who repeatedly launch new businesses — compared to less experienced entrepreneurs and managers.

Cognitive flexibility — the ability to adapt and shift from one concept or strategy to another — is crucial to entrepreneurial success. Understanding the neural basis of this characteristic can provide valuable information for improving entrepreneurial training and education. Recently published research suggests links between entrepreneurial behavior and brain structure, opening up new perspectives in the emerging field of neuro-entrepreneurship.

“Our study used a two-stage methodology,” explains Frédéric Ooms, Assistant Professor and first author of the study. “First, we collected self-reported measures of cognitive flexibility from 727 participants, including entrepreneurs and managers. Next, we performed structural magnetic resonance imaging (MRI) on a subset of these participants to explore differences in gray matter volume in the brain. This multidisciplinary approach enabled us to correlate self-reported cognitive flexibility with actual brain structure.”

The first result to emerge from the analyses is that habitual entrepreneurs show greater cognitive flexibility than managers, along with differences in brain structure. Habitual entrepreneurs show an increase in gray matter volume in the left insula compared to managers. This brain region is associated with enhanced cognitive agility and divergent thinking, essential traits in entrepreneurship. The study also links gray matter density in the left insula to cognitive flexibility, particularly divergent thinking. “This finding suggests that the brains of habitual entrepreneurs are specially adapted to foster the cognitive flexibility needed to identify and exploit new opportunities,” explains Steven Laureys, neurologist at ULiège and Laval University.

This research has practical implications for educators and organizations. By recognizing the importance of cognitive flexibility, educational programs can be designed to cultivate this characteristic in aspiring entrepreneurs. Organizations can also benefit by fostering cognitive flexibility among managers, which could lead to more innovative and adaptive business strategies.

“This study is essential for entrepreneurship and neuroscience researchers, educators designing entrepreneurial training programs and business leaders wishing to foster innovation within their organizations,” summarises Bernard Surlemont, Professor of Entrepreneurship. “By understanding the neural basis of cognitive flexibility, stakeholders can better support entrepreneurial success and adaptability.”

The discovery of distinct neural characteristics in habitual entrepreneurs not only advances our understanding of entrepreneurial cognition, but also opens up new avenues of research into how these brain structures develop and change in response to entrepreneurial activities. Longitudinal studies are underway to explore whether these differences result from innate predispositions or the brain’s plastic response to entrepreneurial experiences.

This pioneering research highlights the importance of combining neuroscience with traditional entrepreneurship studies to gain a comprehensive understanding of what makes successful entrepreneurs distinct at the neurological level. “As we continue to explore the role of the brain in entrepreneurship, this study represents an important advance in the field of neuro-entrepreneurship,” concludes Frédéric Ooms.

Read more at www.sciencedaily.com

EPA proposing increased use of pesticide that’s BANNED in the EU and is 10x MORE TOXIC than other pesticides


The Environmental Protection Agency (EPA) is proposing increased use of acephate, a pesticide that is believed to be 10 times more toxic than most other pesticides. This proposal is worrisome because the EU banned acephate decades ago, mainly due to its potential to harm human health and the environment.

Acephate has been linked to health issues such as autism and lower cognitive performance.

If the EPA’s proposal is approved, it will ease restrictions on the pesticide. Acephate is commonly used on crops such as Brussels sprouts, celery, cranberries and tomatoes.

Several studies have also found that acephate posed significant risks of acute toxicity, developmental neurotoxicity and environmental persistence. As a result, the pesticide was prohibited under strict regulatory standards designed to protect public health and ecosystems.

A thin coating of acephate is usually applied on the exterior of fruits and vegetables. The pesticide poses significant risks to consumers because it belongs to a class of compounds that have been linked to negative side effects such as autism and hyperactivity.

Scientists have also warned that those who consume foods treated with acephate tend to perform worse on intelligence tests compared to their peers.

Some studies have also linked acephate to developmental issues in both children and lab rats. Researchers believe that these negative outcomes are due to acephate’s ability to disrupt the transmission of signals between nerve cells.

With these findings in mind, the EPA’s proposal to reduce restrictions on acephate contradicts its mission to protect public health and the environment. The move also raises valid concerns about the safety of the food that Americans consume.

Is acephate safe for human consumption?

The EPA’s proposal permits 10 times more acephate use on food than is currently allowed by federal limits. The proposal is driven by recent tests conducted on isolated cells rather than whole animals.

EPA representatives claimed that exposing cells to acephate revealed minimal or, in certain cases, no evidence that the pesticide is harmful. An agency spokesperson also said that acephate, after breaking down within the body, generates a chemical that some believe compromises brain development.

The EPA designed new acephate tests with help from the Organization for Economic Cooperation and Development (OECD) to measure the impact of chemicals on the brain. However, the OECD has publicly acknowledged that the new tests are not reliable for finding out if a chemical alters the development of the human brain.

The EPA also allegedly talked to a panel of science advisors to determine the validity of the new testing methods. The panel concluded that the tests’ “inherent limitations do not accurately represent the mechanisms and processes that could compromise the development of the central nervous system.”

Scientists unanimously agree that toxicants, including the pesticide acephate, can potentially harm the development of children who are naturally sensitive to their environment and consumer products.

If the federal government were to consider these scientific findings, acephate would be banned. Instead, at least 12 million pounds of acephate are used on crops every year.


Read more at www.naturalnews.com

Sleep is essential for memory formation


Imagine you’re a student preparing for a big exam: do you pull an all-nighter or get some rest? As many students know, lack of sleep makes retaining information difficult.

Two new studies led by the University of Michigan (U-M) have recently revealed why this happens, what occurs in the brain during sleep and sleep deprivation, and how these processes impact memory formation.

Neurons involved in memory formation

Specific neurons can be tuned to particular stimuli. For example, in a maze, rats have neurons that activate when they reach specific spots. These neurons help with navigation and are also active in humans. But what occurs during sleep?

“If that neuron is responding during sleep, what can you infer from that?” said Kamran Diba, an associate professor of anesthesiology at U-M Medical School, and senior author of both studies.

Diba and his team examined neurons in the hippocampus, a brain structure involved in memory formation, and visualized neuronal patterns associated with a location while an animal sleeps.

Forming and updating memories

Sharp-wave ripples, a type of electrical activity, emanate from the hippocampus every few seconds during restful states and sleep. These ripples are thought to help neurons form and update memories, including spatial ones.

For their study, the researchers measured a rat’s brain activity during sleep after it completed a new maze. Using Bayesian learning, they tracked which neurons responded to which maze locations for the first time.

Reactivation of neurons during sleep

“Let’s say a neuron prefers a certain corner of the maze. We might see that neuron activate with others that show a similar preference during sleep. But sometimes neurons associated with other areas might co-activate with that cell,” noted Diba.

“We then saw that when we put it back on the maze, the location preferences of neurons changed depending on which cells they fired with during sleep.”
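As a rough illustration of how Bayesian decoding of place-cell activity can work in principle, here is a toy Python sketch. The tuning curves, spike counts and maze locations are all invented for illustration, and this is not the authors’ actual analysis pipeline; it simply shows how spike counts plus a Poisson firing assumption can be turned into a probability over locations.

```python
# Toy sketch of Bayesian decoding of position from place-cell spike counts.
# Tuning curves, spike counts and maze locations are invented for illustration;
# this is not the authors' actual analysis pipeline.
import math

# Expected spikes per time bin for three neurons at four maze locations.
tuning = {
    "corner_A": [5.0, 0.5, 0.2],
    "corner_B": [0.5, 4.0, 0.3],
    "arm_C":    [0.2, 0.4, 6.0],
    "centre":   [1.0, 1.0, 1.0],
}

def decode_position(spike_counts, tuning, prior=None):
    """Return P(location | spike counts), assuming Poisson firing and a uniform prior."""
    if prior is None:
        prior = {loc: 1.0 / len(tuning) for loc in tuning}
    unnormalised = {}
    for loc, rates in tuning.items():
        log_likelihood = 0.0
        for count, rate in zip(spike_counts, rates):
            # Poisson log-likelihood: count*log(rate) - rate - log(count!)
            log_likelihood += count * math.log(rate) - rate - math.lgamma(count + 1)
        unnormalised[loc] = prior[loc] * math.exp(log_likelihood)
    total = sum(unnormalised.values())
    return {loc: p / total for loc, p in unnormalised.items()}

# One time bin in which the first neuron fires strongly: decoding favours corner_A.
print(decode_position([6, 0, 1], tuning))
```

Real analyses use many more neurons and finer spatial bins, but the underlying logic of combining spike counts with tuning curves to infer location is similar.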

The researchers’ approach allows visualization of neuronal plasticity, or representational drift, in real time and supports the theory that reactivation of neurons during sleep is crucial for memory.

Lack of sleep and memory formation

Given sleep’s importance, Diba’s team investigated what happens in the brain during sleep deprivation.

In the second study, published in the journal Nature and led by Diba and former graduate student Bapun Giri, they compared neuron reactivation – where place neurons that fired during maze exploration fire again at rest – and their sequence (replay) during sleep versus sleep loss.

They found that neuron reactivation and replay of the maze experience were higher during sleep compared to sleep deprivation. A lack of sleep resulted in similar or higher rates of sharp-wave ripples but with lower amplitude waves and power.

“In almost half the cases, however, reactivation of the maze experience during sharp-wave ripples was completely suppressed during sleep deprivation,” noted Diba. Negative effects of sleep deprivation

When sleep-deprived rats caught up on sleep, reactivation rebounded slightly but never matched the levels of rats with normal sleep. Replay was also impaired and did not recover with regained sleep.

Since reactivation and replay are vital for memory, these findings highlight the negative effects of sleep deprivation on memory.

Diba’s team aims to further explore memory processing during sleep, the necessity of reactivation, and the impact of sleep pressure on memory.

More about memory formation

Memory formation involves a complex interplay of neural processes that encode, store, and retrieve information.

Encoding

This process begins with encoding, where sensory input is transformed into a neural code that the brain can use.

During encoding, attention and perception play critical roles in determining which information is processed further.

Storage

Once encoded, the information moves to storage, where it is maintained over time. This can occur in short-term memory, which holds information temporarily, or long-term memory, which can store vast amounts of information for extended periods.

Consolidation is an essential part of this storage process, involving the stabilization of memories, often during sleep.

Retrieval

The final stage is retrieval, where stored information is accessed and brought back into conscious awareness.

Retrieval can be influenced by various factors, including the context in which the memory was formed and cues that trigger recall.

Brain structures

The brain structures most involved in memory formation include the hippocampus, which is crucial for consolidating new memories, and the cerebral cortex, where long-term memories are stored.

Additionally, the amygdala plays a role in emotional memories, highlighting the interconnectedness of cognitive and emotional processes in memory formation.

Read more at www.earth.com

Sound stimulation with precise timings can help understand brain wave functions

Using sound to stimulate certain brain waves has the potential to help those with dementia or cognitive decline sleep better, reveals a new study. Sleep disturbances are a common feature in dementia and may affect up to half of people living with the condition.

During the study, the research team from the University of Surrey and the UK Dementia Research Institute Centre for Care Research & Technology at Imperial College London, used sound stimulation to target alpha rhythms, a type of brainwave, at precise timings of the wave to investigate how the brain responds.

Alpha rhythms have been associated with memory and perception, and changes to the rhythms have been observed in those experiencing cognitive decline and dementia.

Senior author Dr Ines Violante, Senior Lecturer in Psychological Neuroscience at the University of Surrey, said:

“Alpha oscillations are a defining characteristic of our brain’s electrical activity, but we still do not fully understand their role in shaping fundamental brain functions.

“Using sound is a powerful, non-invasive approach to stimulate certain oscillations within the brain. It is important that we find ways of manipulating these oscillations to create tools for treatment applications, as we know that brain oscillations are slower in diseases, such as Alzheimer’s disease.”

In a series of experiments, researchers used an innovative brain modulation technique known as Alpha Closed-Loop Auditory Stimulation (aCLAS), in which sounds are timed to the precise phase of alpha rhythms. To monitor the effect of stimulation, measurements of electrical activity from the brain were continuously read in real time, and when a brainwave reached a particular phase, a sound (a burst of pink noise) was played to the participant.
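As a rough sketch of the phase-targeting idea behind this kind of closed-loop stimulation, the Python example below estimates the instantaneous phase of a simulated alpha-band signal and marks where a sound trigger would be issued at a chosen target phase. All parameters (sampling rate, band limits, tolerance, refractory period) are assumptions chosen for illustration, and this is not the authors’ implementation: a real closed-loop system must predict phase causally in real time rather than using the offline Hilbert transform shown here.

```python
# Rough offline illustration of phase-targeted auditory stimulation (not the
# authors' implementation). A real closed-loop system must estimate phase
# causally in real time; here the (non-causal) Hilbert transform is applied to
# simulated data purely to show the logic. All parameters are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                                    # assumed sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # fake "EEG"

# Band-pass the alpha band (8-12 Hz) and extract the instantaneous phase.
b, a = butter(4, [8, 12], btype="bandpass", fs=fs)
alpha = filtfilt(b, a, eeg)
phase = np.angle(hilbert(alpha))            # radians, -pi..pi; 0 ~ the wave's peak

target_phase = 0.0                          # stimulate near the peak of each alpha cycle
tolerance = 0.15                            # radians around the target phase
refractory = int(0.08 * fs)                 # ignore further triggers for ~80 ms

triggers = []
last_trigger = -refractory
for i, ph in enumerate(phase):
    if abs(ph - target_phase) < tolerance and i - last_trigger >= refractory:
        triggers.append(i)                  # here the pink-noise burst would be played
        last_trigger = i

print(f"{len(triggers)} phase-locked triggers in {t[-1]:.1f} s of simulated data")
```

The same skeleton – band-pass filter, phase estimate, phase-conditioned trigger with a refractory period – is the conceptual core of closed-loop designs, even though production systems replace the offline steps with causal, low-latency equivalents.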

Researchers observed that depending on the phase at which the sound was played, the alpha rhythm became faster or slower. The effect was also dependent on where the alpha oscillations were coming from in the brain.

Dr Henry Hebron, a former doctoral student at the University of Surrey and first author of the publication, said:

“What we have found is that alpha oscillations can be manipulated via sound when we address this rhythm on its own terms, using a closed-loop approach. Surprisingly, when we performed our aCLAS experiment as participants were falling asleep, we observed that sounds at a particular phase prevented them from reaching deeper stages of sleep (without waking them), while the same sounds at a different phase were not disruptive.

“There is a lot more to be explored regarding neural oscillations-dependent behaviours, and we believe closed-loop approaches, such as the one we implemented here, could be key.”

According to the researchers, now that they have shown they can influence alpha waves with sound, the next step will be to explore whether they can modify the waves in a way that enhances cognition and sleep, which could ultimately benefit dementia patients.

Professor Derk-Jan Dijk, Director of the Surrey Sleep Research Centre and Group Leader at the UK Dementia Research Institute Centre for Care Research & Technology, said:

“There is much to be uncovered about the role of the alpha rhythm in sleep and cognition. This technique could be influential in pushing our understanding and improving sleep functions in those with dementia. We are now investigating the effects of this closed-loop auditory stimulation approach in REM sleep, where alpha rhythms are present but their role still unknown.”

The research contributes to the United Nations’ Sustainable Development Goal 3 — Good Health and Well-being.

Read more at www.sciencedaily.com

Human neuroscience is entering a new era — it mustn’t forget its human dimension


The field is taking a leap forward thanks to innovative technologies, such as artificial intelligence. Researchers must improve consent procedures and public involvement.

Studies involving people who are awake during brain surgery are helping to explain how the brain produces and perceives speech. In neuroscience, ‘Broca’s area’ is a well-known part of the brain that is crucial for speech production. It is named after the nineteenth-century physician-researcher who discovered it — Paul Broca. Less well known, however, is the person whose brain enabled Broca to do so. His name was Louis Victor Leborgne and he had lost his ability to speak at age 30.

Leborgne’s story reminds us why we must never ignore the people involved, assume they’ve consented or fail to acknowledge them appropriately — especially in an age when a lot of neuroscientific research involves humans.

This week’s issue of Nature includes several studies devoted to human neuroscience. They highlight the opportunities researchers have to study the human brain in never-before-seen detail. For example, single-neuron recordings of people who are awake while undergoing brain surgery are helping to explain how the brain produces and perceives speech. Similarly, atlases of brain-cell types, neural circuits and gene-expression maps have the potential to revolutionize our understanding of the cellular and molecular processes that underlie behaviour and cognition.

These technologies are helping researchers to explore what sets the human brain apart from those of other species, and how its cognitive abilities have evolved. For example, the role of non-invasive imaging in learning about cognitive abilities is discussed in a Perspective article by Feline Lindhout at the Medical Research Council’s Laboratory of Molecular Biology in Cambridge, UK, and her colleagues (ref. 1). In another article, Evelina Fedorenko at the Massachusetts Institute of Technology in Cambridge and her colleagues also draw on this literature to argue that, in humans, language probably serves mainly as a communication tool rather than as a means for thinking or reasoning (ref. 2) — and that language is not a prerequisite for complex thought.

One desirable outcome for human neuroscience would be to develop personalized treatments for neurological and psychiatric disorders, because translating the results of studies in animals has not proved successful or sufficient for generating effective therapies at scale. But in grasping these opportunities, researchers must keep in mind that the brain is different from other organs — it’s the seat of people’s memory, experiences and personality. When using the human brain — whether in small cubes removed during neurosurgery, or through 3D organoids made from stem cells and grown in cultures to resemble parts of the developing human brain — for research, scientists must consider the dignity and respect owed to the individuals concerned.

The 1964 Declaration of Helsinki is the basis of research ethics for studies involving humans. Participants are asked to complete a consent form before the start of a study. Researchers have to ensure participants are fully informed about the study’s goals and whether and how they will benefit from the research. Sources of funding should also be declared and a participant must be able to withdraw at any time. According to neuroethicist Judy Illes at the University of British Columbia in Vancouver, Canada, ideally, consent should not be something that is done only once. It should be revisited during a study, so that participants can make informed decisions at different stages (ref. 3). This is especially important for studies involving vulnerable people, because their circumstances might change during a study.

In another Perspective article, Tomasz Nowakowski at the University of California, San Francisco, and a team of neurosurgeons, neurologists and neuroscientists (ref. 4) call on the neuroscience community to revisit these standards of ethical practice. A key challenge they identify is how to handle the ramifications of advances in machine learning and artificial intelligence (AI).

Researchers who use cell atlases, single-cell technologies and spatial-genomic analyses benefit hugely from AI and machine-learning algorithms when analysing large data sets. Yet, AI technologies have the potential to re-identify anonymized information by analysing vast data sets and finding patterns that trace back to individuals. AI models that analyse large data sets can also make predictions related to features of people’s behaviour and their cognitive abilities. This has the potential to cause harm, for example, through biased or erroneous profiling of people on the basis of their neurological data, says neuroethicist Karen Rommelfanger, founder of the Institute of Neuroethics, who is based in Atlanta, Georgia.

Nowakowski and his colleagues propose that researchers use controlled archives, access to which requires approval, and that they restrict data use to the conditions specified in consent forms. To implement such changes will require conversations between study participants, academic researchers and the companies that have a considerable role in the current AI advances. Informed-consent information will also need to change, to account for the risks of researchers’ increased reliance on AI tools.

The team is right to stress the need for improved standards in data ethics and sharing that are jointly created by scientists, private partners and the research participants. Without a doubt, human neuroscience is entering a new and important era. However, it can fulfil its goals of improving human experiences only when study participants are involved in discussions about the future of such research.

References

1. Lindhout, F. W., Krienen, F. M., Pollard, K. S. & Lancaster, M. A. Nature 630, 596–608 (2024).

2. Fedorenko, E., Piantadosi, S. T. & Gibson, E. A. F. Nature 630, 575–586 (2024).

3. Van der Loos, K. I., Longstaff, H., Virani, A. & Illes, J. J. Law Biosc. 2, 69–78 (2014).

4. Lee, A. T., Chang, E. F., Paredes, M. […]

Read more at www.nature.com

Thalamocortical connectivity crucial for functional brain networks, study finds


by Institute for Basic Science

Thalamic connectopic maps (CMAP) and neocortical projection maps (NEOMAP) demonstrate the developmental changes in brain connectivity. Panel (a) shows CMAP 1 & 2 and NEOMAP 1 & 2 for infants (29–44 weeks), illustrating early differentiation of sensorimotor networks. Panel (b) displays these maps for children and young adults (8–22 years), highlighting the establishment of connections with the salience network and the differentiation between externally and internally oriented systems. Network profiles, sorted based on the Yeo-Krienan 7 Network Atlas, are depicted in box plots indicating the median and interquartile range (IQR). Panel (c) presents a schematic of the external-to-internal axis division derived from the NEOMAPs of childhood and young adulthood, showing the crucial role of the salience network. Credit: Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01679-3

Our brains seamlessly process streams of visual information from the world around us while simultaneously understanding the causal structure of events. These essential cognitive functions, known as external sensory processing and internal world modeling, are critical for navigating complex environments.

Our brain achieves this through large-scale functional systems responsible for these processes. Recently, an international collaboration of scientists led by the Institute for Basic Science (IBS) has explored the role of thalamocortical connectivity during the development of brain networks. The study is published in Nature Neuroscience.

One longstanding question in neuroscience is how the brain’s large-scale functional networks form during development. This study investigated the changes in connectivity between the thalamus and cerebral cortex from infancy to adulthood and how these changes influence the formation of the brain’s functional networks. For the first time, researchers have revealed that thalamocortical connectivity is crucial for the emergence and specialization of the brain’s functional networks, particularly those processing external and internal information.

Traditionally seen as a relay station for sensory information, the thalamus also influences higher cognitive functions. Sensory connections between the thalamus and cortex become established quickly at an early age, while higher-order cognitive connections develop later at maturity. However, the exact mechanisms and timeline of these developments have remained unclear.

This study began to address these challenges by employing advanced neuroimaging techniques, transcriptomic analyses, and computational models on cross-sectional and longitudinal datasets, to map the development of thalamocortical connectivity across different age groups.

This study revealed that during infancy, thalamocortical connectivity reflects early sensorimotor network differentiation and gene expression patterns related to brain development. However, as children grow, this connectivity shifts its role to establish connections with the salience network, which serves as an anchor for differentiating external (sensorimotor, visual, dorsal attention networks) and internal (default mode network) functional cortical systems.

Computational simulations confirmed thalamic connectivity’s role in developing key features of the mature brain, such as functional segregation and the sensory-association axis.

Perturbation of developmentally informed growth models: (a) A schematic illustrates the growth model based on thalamo-salience connectivity rules that change over the developmental age span. Four perturbation models were tested: perturbation applied to the 8–12 years age group, 12–16 years age group, 16–22 years age group, and all age groups, compared to a non-perturbation model. (b) The segregation indices (salience-external and salience-internal) were calculated for each growth model, with percentages indicating the difference compared to the no-perturbation model. (c) Cortical gradients extracted from the simulated affinity matrix of each growth model demonstrate the impact of these perturbations on brain connectivity development. Credit: Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01679-3

“Our study for the first time provides a detailed map of how thalamocortical connectivity contributes to the large-scale functional organization in the human brain from infancy through young adulthood,” said lead author Park Shinwon.

“By integrating advanced neuroimaging techniques, gene expression analysis, and computational modeling, we were able to systematically track and analyze the changes in brain connectivity across different developmental stages. This comprehensive approach has allowed us to uncover the pivotal role of the thalamus in the emergence and specialization of functional brain networks.”

Unlike earlier studies that focused on regional properties of individual thalamic nuclei, this research provides a comprehensive view of the global integration of the thalamus into cortical networks. These findings offer potential implications for understanding and studying clinical conditions that show compromised internal and external processing, such as autism, schizophrenia, and other neurodevelopmental conditions.

The corresponding author, Hong Seok Jun, a principal investigator at the IBS Center for Neuroscience Imaging Research, said, “Understanding how thalamocortical connectivity evolves and influences brain function provides a crucial foundation for identifying the mechanisms underlying neurodevelopmental conditions.

“This research opens up new possibilities for early diagnosis and targeted interventions, which could significantly improve outcomes for individuals with neurodevelopmental conditions.”

In the future, the researchers plan to investigate how thalamocortical connectivity changes in children with autism and how these changes correlate with clinical symptoms and cognitive functions. They also plan to expand their research focus to include other subcortical structures such as the striatum and cerebellum.

This broader approach in systems neuroscience will help us gain a more comprehensive understanding of how various brain regions interact and develop.

More information: Shinwon Park et al, A shifting role of thalamocortical connectivity in the emergence of cortical functional organization, Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01679-3

Provided by Institute for Basic Science

Read more at medicalxpress.com

Nature Knows Nootropics