Lighting Up the Brain

Even family pets can inspire his inventive spirit. Whenever Boas’ wife traveled for work, she’d always call to make sure the cats were safely back inside for the night. Rather than policing the kitty door, Boas started puttering. He sketched out a system, built around an open-source microcontroller, to automatically log each cat’s return and report back.

“He decided to build us, not a simple camera, but a sensor that detects and recognizes which cat is entering by the spectral signature of their fur,” says his wife, Maria Angela Franceschini. As Boas tinkered with his tabby tracker, he gently encouraged the pets in and out of the door—over and over again. “He terrorized my cats,” jokes Franceschini, also a leading neurophotonics researcher and an associate professor at Harvard Medical School. “And there were wires everywhere. You have no idea how many projects he starts.”

The Neurophotonics Center, the first facility of its kind in the United States and only the second in North America, pulls in 30 faculty from fields as diverse as biology, mechanical engineering, brain sciences, and nanomedicine. Its mission, says Boas, who formerly taught at Harvard Medical School and was the founder of the Optics Division of the Martinos Center at Massachusetts General Hospital (MGH), is to cultivate technologies that give researchers new insights into the brain. Most of Boas’ work is funded by the National Institutes of Health (NIH) and feeds into its ambitious BRAIN Initiative, a decade-long, multibillion-dollar project to speed the development and application of innovative neurotechnologies.

Since opening in fall 2017, the BU Neurophotonics Center has started studies analyzing the brain as it recovers from a stroke, confronts autism, and slides into dementia. It’s also helping to nurture a community of student neurophotonics researchers with a $2.9 million National Science Foundation Research Traineeship.

“David is the pioneer of techniques and methods to use light to interact with the brain,” says Thomas Bifano, director of the BU Photonics Center. “He’s going to make a difference in our understanding of the brain, specifically by making tools that allow us to see it in ways we haven’t seen it before.”

Watching Brains Eat

The human brain is a voracious eater, gobbling up about 20 percent of the body’s oxygen supply. Whenever you think, feel, or act, oxygen-rich blood rushes to the part of the brain doing the work, fueling your thoughts. It’s that literal rush of blood to the head that Boas, a professor of biomedical engineering, is able to track with light.

In recent projects, Boas has used infrared light—shone into the head by a functional near-infrared spectroscopy (fNIRS) machine—to see human brain activity during surgery, memory creation and retrieval, and even the humble headache. In fNIRS studies, sensors called optodes—“It’s like an electrode, but it’s optics rather than electronics,” says Boas—are placed on a subject’s head, often attached to something that looks like a swim cap. The optodes transmit infrared light, which can travel about 5 to 10 mm beneath the skull, into the cortex. Some of the light is absorbed by hemoglobin, the oxygen-delivering protein in red blood cells, and the rest bounces back to the surface, where additional optodes detect it. Boas compares it to holding a flashlight against your hand and watching it glow red.

Neurophotonics researcher David Boas connects research fellow Xinge Li to a functional near-infrared spectroscopy (fNIRS) machine.
David Boas and his team, including research fellow Xinge Li, can look 5 to 10 mm beneath the skull using functional near-infrared spectroscopy machines, which include flexible caps fitted with optodes that transmit and detect light.

By sending infrared light at two wavelengths, researchers can track and map neural activity through shifts in the levels of oxy- and deoxyhemoglobin, hemoglobin that is carrying oxygen or has given it up. The initial feedback looks similar to an electrocardiogram, with moving lines of blue (deoxyhemoglobin) and red (oxyhemoglobin) bouncing across the screen as concentrations change. Researchers can work with the raw data from each optode or use it to plot a heat map of brain activity.
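
In rough terms, that two-wavelength trick rests on the modified Beer-Lambert law: because oxy- and deoxyhemoglobin absorb the two wavelengths differently, two attenuation measurements are enough to solve for both concentration changes. The sketch below is only illustrative; the extinction coefficients, optode spacing, and pathlength factors are assumed placeholder values, not those used in Boas’ systems or software.

```python
# Minimal sketch: turning dual-wavelength fNIRS attenuation changes into
# oxy-/deoxyhemoglobin concentration changes via the modified Beer-Lambert law.
# All numbers are illustrative placeholders, not calibrated values.
import numpy as np

# Approximate molar extinction coefficients at ~760 nm and ~850 nm
# (rows: wavelengths; columns: [HbO2, HbR]), assumed for illustration.
E = np.array([[586.0, 1548.5],
              [1058.0, 691.3]])

optode_spacing_cm = 3.0            # assumed source-detector separation
dpf = np.array([6.0, 5.5])         # assumed differential pathlength factors

def hemoglobin_changes(delta_od):
    """delta_od: measured attenuation change (optical density) at the two wavelengths."""
    path = optode_spacing_cm * dpf                            # effective optical path per wavelength
    return np.linalg.solve(E, np.asarray(delta_od) / path)    # [dHbO2, dHbR]

d_hbo2, d_hbr = hemoglobin_changes([0.012, 0.009])
print(f"ΔHbO2 ≈ {d_hbo2:.2e} mol/L, ΔHbR ≈ {d_hbr:.2e} mol/L")
```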

“I study hemodynamics as a surrogate of brain activation,” says Boas. “I measure blood flow because I’m very interested in how oxygen is delivered to the brain, how it’s consumed by the brain.”

Boas says that’s because the first neuroscientist he worked with at MGH was a stroke researcher. Together, they pioneered the application of laser speckle contrast imaging—which measures shifts in a pattern of light—in neuroscience, using it to map blood flow during a stroke.
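
The idea behind laser speckle contrast imaging can be captured in a few lines: where blood is moving, the speckle pattern blurs over the camera’s exposure time, so the local contrast (standard deviation divided by mean) drops. The sketch below is a generic illustration, with an assumed 7×7 window and the common 1/K² flow index, not a reconstruction of the specific method Boas and his collaborator developed.

```python
# Generic sketch of laser speckle contrast imaging: compute local contrast
# K = sigma / mean over a sliding window of a raw speckle image; moving blood
# blurs the speckles, so lower K roughly means higher flow.
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw_image, window=7):
    """Local speckle contrast over a sliding window (window size is an assumption)."""
    img = raw_image.astype(float)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img ** 2, size=window)
    variance = np.clip(mean_sq - mean ** 2, 0.0, None)
    return np.sqrt(variance) / (mean + 1e-12)

def relative_flow_index(raw_image, window=7):
    """A common surrogate flow map: higher flow -> more blurring -> lower contrast."""
    k = speckle_contrast(raw_image, window)
    return 1.0 / (k ** 2 + 1e-12)

# Usage (hypothetical camera frame): flow_map = relative_flow_index(frame)
```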

“Stroke is all about insufficient oxygen delivered to the brain. Oxygen delivery and consumption is really a mass balance equation, which is good for a physicist who likes simple problems,” says Boas, whose doctorate is in physics, but who taught electrical engineering and computer science at Tufts, then radiology at Harvard Medical School. Track the oxygen and you can figure out how the brain is reacting not only during or after a stroke, but during just about every other neural activity. “Oxygen comes in and goes out and the difference is what was consumed by the tissue. I can understand that and so I developed the tools to measure that.”
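
That “mass balance” is essentially the Fick principle: the oxygen a region of brain consumes equals the blood flow through it times the difference between the oxygen carried in and the oxygen carried out. A toy calculation, with made-up but order-of-magnitude numbers, shows how the bookkeeping works.

```python
# Toy illustration of the oxygen mass balance Boas describes (the Fick principle).
# All values are assumed, order-of-magnitude placeholders.
cerebral_blood_flow = 50.0     # mL blood per 100 g tissue per minute (assumed)
arterial_o2_content = 0.20     # mL O2 per mL of blood flowing in (assumed)
venous_o2_content = 0.13       # mL O2 per mL of blood flowing out (assumed)

# What came in minus what went out is what the tissue consumed.
o2_consumption = cerebral_blood_flow * (arterial_o2_content - venous_o2_content)
print(f"Estimated O2 consumption: {o2_consumption:.1f} mL O2 per 100 g per minute")
```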

One of those tools is an fNIRS machine. When Boas got into neurophotonics, in the early 1990s, it was such a new field—only 40 academic papers had been published on the subject—he had to build his own. Today, you can buy an fNIRS machine off the shelf, including the one he developed. Boas says about 100 of his fNIRS systems—commercialized by a company called TechEn—have been distributed around the world.

A Boas-built fNIRS machine sits in a sparsely decorated test room on the first floor of the Rajen Kilachand Center for Integrated Life Sciences & Engineering. Plastic boxes on the shelves are stuffed with the fibers used to relay signals from fNIRS optodes to a box resembling a high-end audio system. The box is lined with laser diodes and photodetectors and hooked up to a PC, which helps translate the signals. Boas, who founded the Society for functional Near-Infrared Spectroscopy and the journal Neurophotonics, wrote the software most widely used for decoding fNIRS signals (the latest version is called HOMER2).

“We don’t give beautiful pictures of the brain,” he says, contrasting fNIRS to functional Magnetic Resonance Imaging (fMRI), which produces lush snapshots of brain cross sections, “but we give really good functional maps of what’s happening.”

It’s also considerably cheaper and quicker than fMRI—and doesn’t confine subjects inside a clunking appliance; they can travel as far as the fibers will let them.

Boas’ latest study, funded in part by the National Institute of General Medical Sciences, examines the potential of fNIRS to objectively monitor pain, particularly in the operating room. When you’re knocked out for surgery, the anesthetic means you’re not conscious of the excruciating slicing and stitching, but the pain is still there, so you’re dosed with analgesics, too. According to a study in the Annals of Surgery, anywhere from 10 to 40 percent of surgery patients wake with persistent—chronic, hardwired—pain; for some that may be because they didn’t get enough analgesia while under the knife: “You may not consciously remember it, but your body becomes sensitized to that feeling,” says Boas. Using fNIRS, Boas and David Borsook, director of the Pain and Imaging Neuroscience Group at Boston Children’s Hospital, found they could accurately measure pain levels by placing optodes on the motor sensory and prefrontal cortexes, then using infrared light to watch the brain process the discomfort. When the researchers applied heat or electrical stimulus to the hand or face, they saw an increase in oxyhemoglobin in the motor sensory cortex—unless the subject had been given morphine. Those given the painkiller didn’t register the stimulus as painful.

“The next step is to take that into the operating room,” says Boas. An impartial measure of discomfort could also have broader applications, such as figuring out pain levels in infants or people with dementia.

Not all of his work focuses on lighting up human brains. Boas has also made advances in microscopy, using light to look inside the brains of animals. That’s unusual, says Meryem Ayşe Yücel, a research assistant professor at BU, who followed Boas from MGH: most neurophotonics researchers specialize in one part of the field, just studying humans, say, or developing new technology. In 2004, for example, Boas was the first to use optical coherence tomography, a technology that uses reflected light to build 3-D maps of blood flow, to measure brain function. Today, he’s applying it during in vivo animal model studies to test therapies that could reverse faltering blood flow after a stroke and improve patient outcomes.

For researchers at BU, having an fNIRS expert on campus is already opening new possibilities. Boas has started a handful of fNIRS projects with faculty across BU and expects to begin another eight this year. Some examples: Robert Stern, a School of Medicine professor, is exploring the technology’s potential as a screening tool for early-stage Alzheimer’s disease; Swathi Kiran, a College of Health & Rehabilitation Sciences: Sargent College professor, is using it to measure how the brain responds in real time to language therapy after a stroke.

“We know very little about the brain and so neuroscientists are always hungry for new tools to help them better understand it,” says Boas. “What’s wonderful about coming to BU is I have all of these basic cognitive scientists who are really excited about adopting the technology. They knew about the technology, but they didn’t have access to it and it was too much of a barrier for them to figure it out, but now that I’ve come here and we’re starting the center, it’s really easy to support.”

Two mannequin heads sit on tables in a BU Neurophotonics Center work space where new tools and technologies are built and tested.
The BU Neurophotonics Center isn’t just a lab for testing existing equipment; it’s a space for building and incubating the latest technologies. Researchers are currently using 3-D printers to customize the caps worn by fNIRS test subjects; even the model heads, used to check fit and positioning, are printed.

One project, using fNIRS to look into the brains of people with autism, finished its pilot phase in January 2018. The Neurophotonics Center collaborated with Helen Tager-Flusberg, director of the Center for Autism Research Excellence, on a study of the mirror neuron system, a circuit that clicks into action when executing a task or when watching someone else do the same. It’s “considered a foundational mechanism that underlies social understanding and interaction,” says Tager-Flusberg.

With Yücel, she used fNIRS to monitor healthy adults completing—and observing others completing—two tasks, one straightforward (mailing a card through a wide mail slot), one a little tougher (a narrow mail slot). fNIRS showed “the core regions of the brain associated with the mirror neuron system were activated in both execution and observation of the actions,” especially during the tougher task, says Tager-Flusberg. Having piloted the study with healthy adults, she plans to adapt it for use with children, and then children with autism.

“We’d like to know how early we see similar kinds of brain activity underlying this capacity to link action and perception of action in young children using fNIRS, because no one has done that using this modality before,” says Tager-Flusberg, who has used electrophysiology—which measures electrical activity to show the timing, but not the location, of brain activity—for similar research in the past. “If David had not come here and opened the Neurophotonics Center—and been so inviting to collaborators—I would never be doing these studies.”

True to his inventing spirit, Boas is already building a better fNIRS system. “This box is disappearing,” he says of the stereo-size rig with its dangling stream of fiber optic cables. “We’re now building wearable systems. The electronics have gotten so much better that we don’t need this big box: we can put the light sources, photodetectors, digitizers, and minicomputers on the head.”

Postdoctoral fellow Bernhard Zimmermann works on a new wireless fNIRS system in the BU Neurophotonics Center lab.
Postdoctoral fellow Bernhard Zimmermann is working with Boas on a new fNIRS system that ditches the tangle of cables in many current setups. It will allow researchers to study human interaction outside of the lab.

In the Neurophotonics Center’s second lab space in the Life Science and Engineering Building, two rooms are dedicated to making prototypes of the new portable system. The first is a tinkerer’s dream. One wall is covered in small workshop drawers; opposite, a 3-D printer stands ready with spools of plastic thread. A dozen small pieces of numbered bendy plastic—each one flat, red, about the size of a smartphone, and crisscrossed with a diagonal grid—are laid out next to the printer. Students in the lab are testing the flexible plastics for use in the system’s cap.

In an adjacent room, Bernhard Zimmermann, a postdoctoral fellow and another MGH recruit, is working on designs for the souped-up optodes that will form the system itself. Each optode—complete with circuit board and all the light-throwing and light-grabbing technology currently housed in the big box—will be smaller than a penny.

“To reach that small size, we need to use the latest technologies from smartphones,” says Boas, who already holds nine patents. “The investment per generation is quite big, so we want to be sure we get it right the first time. It’s challenging, but I’m pretty confident.”

He expects the first prototype to be ready this year and to have built 1,000 optodes—enough for about 20 complete systems—within five years. The project is funded by the NIH: “It’s an interesting grant,” says Boas. “It’s to build and disseminate.”

While other wearable systems are already available, they’re shipped with optodes fixed in place and offer limited coverage of the brain. The BU-developed version would work in conjunction with a custom-made cap generated using the 3-D printer and flexible plastics, allowing researchers to decide where the optodes should go and which parts of the brain to illuminate. Existing fNIRS machines like the one Boas first developed can cost up to $250,000, but he says the new system should be closer to $5,000.

For researchers like Tager-Flusberg, it will mean the chance to move brain studies out of the lab and into the real world.

“With the mobile technology he’s developing, you’ll be able to use fNIRS to study the interaction between two people, and we know so little about that,” she says. “It will really allow us to study social phenomena, social engagement in a way we never have before.”

Controlling the Brain with Optogenetics

Light isn’t just useful for gazing into the brain; it can control it, too. In optogenetics, scientists genetically reprogram cells to respond to light. Researchers at BU are already using it routinely in animals to trigger brain activity and change memories; Boas predicts the technology will move to humans in his lifetime.

Boas is also positioning the Neurophotonics Center to act as an incubator for other new technologies, connecting photonics whizzes and their tech breakthroughs to neuroscientists with problems to solve. The center will help with studies to evaluate the technology in the field and then, when it becomes more advanced, move it out to individual labs across the University.

His next idea? Using sound waves. Boas has just received an ultrasound machine, which he says will be able to measure blood flow in an entire mouse brain. It might not use light to illuminate the brain, he says, but it’s still all about waves. “I’m super excited about it. Even though we’re not ultrasound people, we can take ideas from optics and apply it and do really innovative stuff.”
