"This work really gives a boost to the efforts to produce a workable brain-controlled prosthetic for people with paralysis," said Susan P. Howley, director of research for the Christopher Reeve Foundation, which provided funding for the research.
The prosthesis that senior author Krishna Shenoy, PhD, assistant professor of electrical engineering and of neuroscience, and his team are working on is called a brain-computer interface. The idea is to attach electrodes to a person's head to record brain waves and send them to a computer, which uses an algorithm to translate the signals into commands to control the prosthesis.
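In software terms, that description boils down to a three-stage loop: record, decode, act. The sketch below is only a generic Python illustration of that architecture; the object names, methods, and decoding model are invented here and are not the Stanford team's actual system.

```python
# A brain-computer interface reduced to its three stages: record neural
# signals, translate them with a decoding algorithm, and drive the device.
# All names here are illustrative, not taken from the Stanford system.

def read_signals(electrodes):
    """Stage 1: sample the electrical activity each electrode picks up."""
    return [e.sample() for e in electrodes]

def decode(signals, model):
    """Stage 2: the algorithm that turns neural signals into an intended
    command, such as 'move the cursor to that key'."""
    return model.predict(signals)

def act(command, prosthesis):
    """Stage 3: carry out the decoded command on the prosthetic device."""
    prosthesis.execute(command)

def run_interface(electrodes, model, prosthesis):
    while True:
        act(decode(read_signals(electrodes), model), prosthesis)
```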
Although it sounds like it's straight out of science fiction, experiments in the 1960s showed that, in principle, it can be done; there are, however, two significant hurdles to developing a workable brain-computer interface for a prosthesis.
The first is simply making sense of the signals generated by neurons, the brain cells that give rise to thoughts and that carry the brain's commands down the spinal cord and out to the peripheral nerves by means of electrical impulses. The second challenge, where Shenoy's team applied its new approach, is interpreting those signals with enough speed and accuracy to make the interface practical for a patient to use.
The standard approach to processing neural impulses has been to collect and translate them every step of the way as the subject thinks about moving the prosthesis from point A to point B. That's a valid approach, said Shenoy, if the user is doing things requiring continuous movement, such as drawing a line.
But all that collecting and processing slows down the prosthesis, and, for many tasks, such as typing on a keyboard or turning off a light switch, it's not about the journey, it's the destination that counts. In other words, if you want to get from New York City to Los Angeles, it makes sense to skip the scenic drive via Bugtussle, Ky.; you'll reach your destination a lot faster if you just fly direct.
Shenoy and his colleagues set out to shorten the process by focusing on the end point rather than processing every step along the way. They hoped to accurately forecast an intended target from the signals the neurons send out when the subject merely thinks about moving an arm to that target.
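The difference between the two strategies is easy to state in code. The sketch below is a hypothetical Python illustration of the contrast; the decoding functions and the crude similarity score are assumptions made for clarity, not the algorithms the team actually used.

```python
import numpy as np

# Candidate reach targets on the screen (illustrative coordinates).
TARGETS = np.array([[0.0, 1.0], [1.0, 0.0], [-1.0, 0.0], [0.0, -1.0]])

def decode_trajectory(neural_frames, velocity_from_frame):
    """Standard approach: translate every short window of neural activity
    into a small step of movement and accumulate the whole path."""
    position = np.zeros(2)
    path = [position.copy()]
    for frame in neural_frames:            # e.g. one frame every 50 ms
        position = position + velocity_from_frame(frame)
        path.append(position.copy())
    return np.array(path)                  # the full route from A to B

def decode_endpoint(planning_activity, target_templates):
    """End-point approach: use one stretch of activity recorded while the
    subject is only planning the reach, and pick the most likely target."""
    scores = target_templates @ planning_activity   # crude match score per target
    return TARGETS[np.argmax(scores)]               # skip the route, report the destination
```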
The researchers worked with rhesus macaque monkeys in their experimental work. The monkeys were connected to the interface by a tiny silicon chip, less than one-tenth the area of a penny, holding 100 electrodes. The electrodes were implanted in the pre-motor cortex, which is on the surface of the front part of the brain and is one of the areas responsible for guiding a person's or a monkey's arm. The monkeys were trained to face a computer screen, with one finger touching a central starting point and their eyes focused on another starting point nearby. When a target spot lit up elsewhere on the screen, the monkey knew that he was supposed to touch the target spot -- but only when another on-screen signal told him to. Until the "go" signal was given, the monkey waited.
This waiting period was the critical phase in collecting the data for analysis. The brain waves the monkeys generated during this hiatus, when they were only thinking about moving their arms to the target spot, simulated the neural signals a paralyzed person would generate while thinking about moving a prosthetic arm or cursor to a particular spot, yet not physically doing so.
The challenge was achieving the right balance. On the one hand, the scientists wanted the computer system to use as brief a neural signal as possible, recorded while the monkey was anticipating touching the target. On the other hand, they wanted to ensure that the system had enough information to predict the correct location of that target. In other words, the researchers wanted to find the "sweet spot," the point where the system would process the brain waves in the best balance of time and accuracy for a prosthesis.
As Shenoy and his colleagues saw hints of a sweet spot in their data, they deliberately ran tests at that speed. And ultimately they arrived at a result that was far superior to what others had previously achieved. "You can quantify that sweet spot in terms of the rate at which the system is extracting data from the brain, just the way you measure the data-transmission rates of computer modems," explained Gopal Santhanam, PhD, who did his graduate work in electrical engineering in Shenoy's laboratory and is one of two first authors of the Nature paper.
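That rate is usually expressed in bits per second, and it makes the trade-off concrete: a longer recording window improves accuracy but costs time, so the rate peaks somewhere in between. The operating points below are invented purely to illustrate the shape of that trade-off (they are not the paper's results); the formula is a standard way of scoring one selection among several targets.

```python
import math

def bits_per_selection(n_targets, accuracy):
    """Information conveyed by one selection among n_targets at the given
    accuracy (a standard figure of merit for brain-computer interfaces)."""
    if accuracy <= 1.0 / n_targets:
        return 0.0
    return (math.log2(n_targets)
            + accuracy * math.log2(accuracy)
            + (1 - accuracy) * math.log2((1 - accuracy) / (n_targets - 1)))

# Made-up operating points: longer windows of neural data are decoded more
# accurately but take longer, so the bit rate peaks at an intermediate
# window length -- the "sweet spot".
for window_s, accuracy in [(0.10, 0.55), (0.25, 0.85), (0.50, 0.95), (1.00, 0.98)]:
    trial_s = window_s + 0.15            # assume ~150 ms of fixed overhead per selection
    rate = bits_per_selection(8, accuracy) / trial_s
    print(f"window {window_s:.2f} s, accuracy {accuracy:.2f} -> {rate:.1f} bits/s")
```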
An accompanying paper in the same issue of Nature reports on the work by another research group, at Brown University, which has been working with human patients with spinal cord injuries to show that even years after an injury, they still have the needed neurons to control a prosthesis. Shenoy said that, in combination, the work of the two groups makes the prospects "quite bright" for developing functional prostheses that patients could control with their thoughts.
Shenoy said the study proves it's possible to process neural signals fast enough to be useful to a paralyzed patient, adding that data-transmission rates can be used to approximate the number of words per minute the prosthesis would allow a user to type. Previous methods topped out at a few words per minute, but the end-point approach of his group peaked at 15 words per minute. That might not be speedy enough to land their brain-computer interface a job in the steno pool, but it's fast enough that a person could probably use it to communicate with the rest of the world without undue frustration. And though one might say that is the real end point of all their efforts, Shenoy thinks they can do better. "We really are viewing this as a starting point," he said.
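As a rough, back-of-the-envelope translation of bit rates into typing speed (the keyboard size and word length below are assumptions, not figures from the paper), a few bits per second corresponds to a few words per minute, and roughly 7 to 8 bits per second works out to about 15 words per minute:

```python
import math

BITS_PER_CHAR = math.log2(32)    # assume picking one of 32 keys per character
CHARS_PER_WORD = 6               # assume 5 letters plus a space per word

def words_per_minute(bits_per_second):
    chars_per_second = bits_per_second / BITS_PER_CHAR
    return chars_per_second / CHARS_PER_WORD * 60

for rate in (1.0, 4.0, 7.5):
    print(f"{rate:.1f} bits/s -> about {words_per_minute(rate):.0f} words per minute")
```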
Other authors of the paper are electrical engineering graduate students Byron Yu and Afsheen Afshar, who is also a medical student.