A team of researchers at the University of California San Diego and Stanford University has received a $7.5 million, five-year grant to try to answer two fundamental questions: What is the memory capacity of a brain? And how does the brain process information with maximum energy efficiency? The grant was awarded by the Air Force Office of Scientific Research through a Multidisciplinary University Research Initiative (MURI).
The team is led by Padmini Rangamani, an assistant professor in the Department of Mechanical and Aerospace Engineering at UC San Diego who is an expert in theoretical and computational biophysics. In addition to Rangamani, the team includes UC San Diego researchers Terry Sejnowski, a well-known neurobiologist and expert in computational neuroscience at the Salk Institute and UC San Diego; Ralph Greenspan, the co-director of the Cal-BRAIN initiative and an experimental neuroscientist; and Shelley Halpain, a neurobiology professor who specializes in cytoskeletal remodeling in neurons. The team also includes Daniel Tartakovsky, an expert in computational mathematics at Stanford University.
“How do living organisms make complex decisions based on limited information and a low energy budget? This question is central to not only evolution, complexity of animal life and our day-to-day lifestyle, but also to our understanding of how we function,” said Rangamani, the grant’s principal investigator. “The answer to these complex but core questions lies at the heart of what separates animals from machines. Our work could lead to devising new technologies that are energy efficient and make decisions efficiently.”
A good way to explain the researchers’ work is to consider the difference between a car and an animal’s brain. A car needs fuel to run and is either on or off. The brain, by contrast, is constantly on and can keep functioning even as it runs low on fuel, for example when an animal is deprived of sleep. It only starts to make mistakes under extreme duress, such as severe sleep deprivation or starvation. The goal of the project is to understand why and how this happens.
“The human brain is more powerful than any manmade computer but it runs on only 20 watts of power,” said Sejnowski. “We have much to learn from nature about building fast, energy-efficient intelligent computers.”
Over the next five years, the researchers will take a multidisciplinary, multi-scale approach to solving this puzzle. They will conduct experiments in increasingly complex cell cultures and in fruit flies. These experiments will inform the development of a theoretical and computational framework for identifying the energy and information-processing mechanisms at work in these systems.
“This is an exciting project bringing together some of the most original and talented scientists from a range of fields to address one of the most important questions today in neuroscience: how does the brain process information so effectively and at the same time so efficiently?” said Greenspan.
The researchers believe their work will advance the scientific community’s understanding of the molecular biology of how neurons use energy; the sources and sinks of energy consumption in neurons; and how memory storage is affected by energy availability in single dendritic spines and neural circuits.
“We expect the results of this work to push the boundaries of the field,” Rangamani said.
Over the next five years, the team will pursue five research goals: characterizing molecular pathways, energy transduction and signaling in neurons; characterizing energy sources and their coupling with cytoskeletal dynamics in the dendritic spine; understanding the impact of molecular pathways on multicellular responses and neural circuits; understanding the relationship between circadian rhythm and brain energy homeostasis in living animals; and understanding information quantification and processing from organelles to organism.