Researchers’ observations of neuron patterns could help amputees use prosthetics

Associate professor Nicholas Hatsopoulos pulls threads from biology, mathematics, statistics, and computer science in research that could be the key to restoring mobility to amputees.

By Louise Lerner

As associate professor Nicholas Hatsopoulos watches neurons fire, he pulls threads from biology, mathematics, statistics, and computer science in research that could be the key to restoring mobility to amputees. He works in the field of brain-machine interfacing, hacking the codes that the brain uses to orchestrate the body’s movements so that artificial machinery can recreate them.

Brain scientists have been studying the individual neuron for decades. Hatsopoulos takes a different approach: “We try to understand how collections of neurons in the motor cortex work together,” he said. “It’s like looking at a TV screen pixel by pixel—to see the whole scene, you have to step back.”

Hatsopoulos is specifically interested in voluntary movement, which takes him to the motor cortex. To understand how the brain directs movement, he and his team needed to tap directly into the signals being sent across the brain.

To record brain activity, Hatsopoulos uses a silicon chip with a hundred tiny spines. When the chip is in place on the brain, each spine picks up the electrical activity of one neuron. Hatsopoulos displays the results on a screen, a hundred voltage spikes flickering in and out. It looks random, but it’s not.
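The article doesn’t say how a raw voltage trace becomes a train of discrete spikes; a common approach in the field is simple threshold crossing, sketched here with synthetic data (the signal, noise level, and threshold rule are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic voltage trace from one electrode spine: background noise
# plus a few large negative deflections standing in for real spikes.
trace = rng.normal(0.0, 10.0, 1000)   # microvolts, 1000 samples
trace[[120, 450, 800]] -= 80.0        # injected "spikes"

# Threshold crossing: flag samples that dip well below the noise floor.
threshold = -4 * np.std(trace)        # a common rule of thumb
spike_samples = np.where(trace < threshold)[0]

print(spike_samples)
```

Run on the synthetic trace above, the detector flags the three injected deflections; a real system would do this in parallel for all hundred electrodes.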

They record the audio output, too. “It kinda sounds like Rice Krispies,” Hatsopoulos warned. And it does: snap, crackle, pop—your brain telling you to scratch your nose.

“They’re like binary ones and zeroes,” he said. “We try to understand how the patterns result in movements. When we’re looking at a hundred neurons, some are firing and some aren’t. That pattern ultimately tells your arm to move.”
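One snapshot of such a population pattern can be pictured as a vector of ones and zeros, one entry per recorded neuron. The values below are invented for illustration, but they show why the population view matters:

```python
# One time bin across a hypothetical ten-neuron recording:
# 1 = the neuron fired in this bin, 0 = it stayed silent.
pattern = [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]

# Different movements correspond to different patterns, so the
# population carries far more information than any single cell:
# even ten binary neurons allow 2**10 distinct patterns.
n_patterns = 2 ** len(pattern)
print(n_patterns)  # 1024
```

With the hundred neurons Hatsopoulos records, the space of possible patterns is astronomically larger, which is what makes decoding them a job for statistics and computer science.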

The researchers set up trials with monkeys trained to play a video game. As the monkey moves a cursor toward targets on a screen, the electrical signals his brain generates are recorded. Later, Hatsopoulos sifts through the data. To find the patterns, he and his team develop complex mathematical algorithms.

“Eventually, having collected all the data, can we foresee what someone is going to do based on the signals?” Hatsopoulos asked. Their algorithm picks up the brain activity just before the monkey moves the cursor, and based on past activity, predicts where this particular set of signals will send the cursor. It’s very accurate. “We can kind of read their minds,” he joked.
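The article doesn’t name the algorithm; one widely used approach in brain-machine interfacing is a linear decoder that maps binned firing rates to cursor velocity. The sketch below fits such a decoder to synthetic data (all numbers are invented, and real decoders are trained on recorded trials rather than a simulated linear model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training data: 500 time bins of firing rates from 100
# neurons, paired with the cursor velocity (vx, vy) observed in each
# bin. Velocity is a hidden linear function of the rates plus noise,
# standing in for real recordings.
n_bins, n_neurons = 500, 100
rates = rng.poisson(5.0, (n_bins, n_neurons)).astype(float)
true_weights = rng.normal(0.0, 1.0, (n_neurons, 2))
velocity = rates @ true_weights + rng.normal(0.0, 1.0, (n_bins, 2))

# Fit the decoder by least squares: weights mapping rates -> velocity.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a fresh bin of activity: predict where the cursor will move.
new_rates = rng.poisson(5.0, n_neurons).astype(float)
predicted_velocity = new_rates @ weights
```

Once fitted, the decoder runs in real time, turning each new bin of firing rates into a cursor command, which is exactly the substitution described in the next step of the experiment.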

Eventually, they replaced the actual cursor with the predictive one. “So the monkey’s moving the cursor with his brain,” Hatsopoulos said. “At some point he realizes, ‘Hey, I don’t have to move my arm!’ So he stops.”

Hatsopoulos has created a cursor that can be controlled with the brain alone. The idea has immediate applications for anyone whose brain remains intact but whose muscles can’t move—like patients with spinal cord injuries, Lou Gehrig’s disease (ALS), or stroke damage.

“Even if the spinal cord is cut, signals still go out from the cortex,” Hatsopoulos said. “So if we put in a device that picks up those signals and decodes them, we can predict what he wants to do.”

Several years ago, Hatsopoulos got FDA approval to test his device on four patients—two with spinal cord injuries, one with ALS, and one stroke patient.

“We found the patient could indeed voluntarily get neurons to fire just by thinking about it,” he said. The patients moved a cursor across a screen with their minds, controlling channels and volume on a television.

Though the test was a success, a permanent wire running into a patient’s brain poses an infection risk. Hatsopoulos thinks the next step is to build wireless transmitters, which would require only a one-time surgery.

In addition, Hatsopoulos wants to go beyond the 2-D video screen and into the far more complex task of 3-D reaching and grabbing. The task suddenly involves joints and angles, the sense of touch, and another sense called the kinesthetic sense, which uses its own sensors in muscles, tendons, and joints to tell you where your body is in space.

“Think about it: if I close my eyes and someone else moves my hand, I still know where it is,” Hatsopoulos said. “It turns out to be critically important to movement.” Patients who lose this sense can have difficulty walking, even though the muscles remain intact. “They have to actually watch their legs as they move back and forth.”

So creating a prosthetic limb for an amputee patient would require a set of sensors in the limb itself, which would relay kinesthetic and touch sense information about the limb’s movement back to the brain.

Hatsopoulos speculates that eventually one might even use electrical current to stimulate a patient’s own paralyzed limbs into movement, or build a full-body suit à la Iron Man that could move the arms and legs of paraplegic patients.

But for now, the ideas are only ideas. Hatsopoulos continues to refine his algorithms. “In the end,” he said, “I’m interested in figuring out how the motor cortex controls movements, like dancing, playing the piano—they’re so complex. And when we help someone along the way, so much the better.”