Research speeds development of brain-controlled devices that may allow amputees and paraplegics to regain limb function

New research is speeding the development of brain-controlled devices that may soon allow amputees and paraplegics to regain limb function. Within a few years, these so-called brain-computer interfaces (BCIs) may also allow people completely paralyzed by neurodegenerative diseases to regain some movement or the ability to communicate with those around them.

In one recent study, scientists created a BCI that collects information from hundreds of thousands of brain cells at once. In another, researchers used electrodes that do not penetrate the brain to record brain activity that can control a BCI. Swedish scientists have greatly refined a prosthetic hand, and a fourth group has trained a monkey to use a robotic arm controlled by its own thoughts.

The first brain-controlled movement came several years ago, with patients moving objects in virtual reality. Now four groups of scientists have built upon the earlier studies to bring the field closer to prosthetic devices controlled by thought. “We are rapidly approaching a milestone,” says Andrew Schwartz, PhD, of the University of Pittsburgh Department of Neurobiology, referring to his work with a new anthropomorphic robotic arm.

BCIs are controlled by measurements of neuronal activity, recorded either from individual brain cells (called single-unit recording) or from the scalp using electroencephalography (EEG). The recorded brain signals are then used to control a physical or virtual device that carries out a task according to the user's intent.
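
In outline, that pipeline can be sketched as a simple loop: acquire a signal, reduce it to features, decode the user's intent, and move the device. The sketch below uses simulated data; the function names, feature choice, and linear decoder are illustrative assumptions, not any group's actual system.

```python
# A minimal sketch of the generic BCI pipeline described above, using
# simulated data. All function names and the linear decoder are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def acquire_signal(n_channels=8, n_samples=256):
    """Stand-in for single-unit or EEG acquisition: one buffer of data."""
    return rng.standard_normal((n_channels, n_samples))

def extract_features(signal):
    """Reduce each channel to a scalar feature (here, mean power)."""
    return (signal ** 2).mean(axis=1)

# A linear decoder mapping features to a 2D cursor velocity; in practice
# its weights would be fit during a calibration session.
weights = rng.standard_normal((2, 8)) * 0.1

cursor = np.zeros(2)
for step in range(100):
    features = extract_features(acquire_signal())
    velocity = weights @ features   # decoded intent
    cursor += velocity              # move the physical or virtual device
```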

Both methods have their advantages and drawbacks. Recording from inside the brain requires that electrodes be surgically implanted, carrying the risk of infection. Neural scarring around the electrodes can also build up over time and cloud the data used to control the device. EEG measurements, on the other hand, are safe, but yield a much 'fuzzier' signal. In addition, users typically have to be trained for prolonged periods to master this new skill.

Scientists are now working to develop techniques that combine the advantages of both methods while reducing their drawbacks. Researchers in the laboratory of Richard Andersen, PhD, of the California Institute of Technology have developed a method that collects local field potentials (LFPs).

Instead of recording from individual neurons, the researchers implanted electrodes that collect information from hundreds of thousands of brain cells at once. Neural scarring affects all chronically implanted electrodes over time, but because LFP signals do not depend on any one cell, they degrade far less than single-unit recordings do. “Physically and informationally, LFPs fall between single-unit recordings and EEGs,” says Daniella Meeker, BA, one of the researchers.
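
For a concrete sense of what an LFP-derived feature looks like, the sketch below computes band power from a simulated LFP trace. The sampling rate, frequency band, and signal model are assumptions chosen for illustration.

```python
# Illustrative sketch of turning a raw LFP trace into a band-power
# feature; sampling rate and band edges are assumed for demonstration.
import numpy as np

fs = 1000.0                       # assumed sampling rate in Hz
t = np.arange(0, 1.0, 1 / fs)
# Simulated LFP: the summed activity of many nearby cells appears as
# oscillations plus broadband noise, rather than discrete spikes.
lfp = (np.sin(2 * np.pi * 25 * t)
       + 0.5 * np.random.default_rng(1).standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(lfp)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Power in an assumed 20-40 Hz band: the kind of aggregate feature a
# decoder might use in place of single-unit firing rates.
band = (freqs >= 20) & (freqs <= 40)
band_power = spectrum[band].sum()
print(f"20-40 Hz LFP power: {band_power:.1f}")
```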

In each of 24 experiments, the researchers implanted a single electrode in the parietal cortex of a macaque monkey to record LFPs. The parietal cortex is a brain area that sits just upstream of the motor cortex in the planning of movements.

The monkey was trained to plan its movements without actually executing them. In each experiment, the monkey prepared to make its movements while the researchers recorded the LFP from a single location. This activity pattern was then used to program the BCI. Next, the monkey controlled the BCI with its thought pattern to move a cursor to a desired location on a computer screen.
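
That calibrate-then-control protocol can be pictured as template matching: store the average planning-period activity pattern for each target, then classify new trials against the stored templates. Everything below is simulated and illustrative; the article does not specify the group's actual decoding algorithm.

```python
# Hypothetical sketch of the calibrate-then-control protocol: per-target
# feature templates, then nearest-template classification. All values
# are simulated; the classifier is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(2)
targets = {"left": np.array([-1.0, 0.0]), "right": np.array([1.0, 0.0])}

# Calibration: average feature vector recorded while the monkey *plans*
# a movement to each target (no actual movement required).
templates = {name: rng.standard_normal(16) for name in targets}

def decode(trial_features):
    """Pick the target whose calibration template is closest."""
    return min(templates,
               key=lambda n: np.linalg.norm(trial_features - templates[n]))

# Control: a new planning-period feature vector moves the cursor.
new_trial = templates["right"] + 0.1 * rng.standard_normal(16)
cursor = targets[decode(new_trial)]
print("cursor moved to", cursor)
```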

Although the monkey's performance was not as good as it would have been with single-unit recordings, performance improved with training over time. The group plans to extend the study to include multiple recording sites, and to combine LFP measures with single-unit recordings. “By using signals from several different brain areas, we hope to optimize BCI control over time,” says Meeker.

In other work that seeks to combine the benefits of single-unit and EEG recordings, Gerwin Schalk, MS, a computer scientist at the Wadsworth Center of the New York State Department of Health, has literally gone halfway between the techniques to control a BCI.

Instead of recording activity from within the cortex or from the scalp, Schalk and his colleagues at Wadsworth and at Washington University in St. Louis recorded from the surface of the brain—inside the skull. This technique, called electrocorticographic (ECoG) recording, uses electrodes that do not penetrate the brain.

In earlier work, the group showed that humans could use ECoG recordings to control a cursor on a computer screen in one dimension, in an up-down motion. The current work extends this study to control movement in two dimensions. The researchers collected their data from an epileptic patient who was about to undergo therapeutic brain surgery. In routine preparation for the surgery, electrodes were implanted on the brain surface to monitor its activity.

The patient made movements, both real and imagined, while the scientists recorded his brain activity and watched which signals changed during the tasks. The patient then fine-tuned these signals and used them to control a cursor on a computer screen using only his mind. Acquiring this new skill took only a few minutes of training, a vast improvement over the days or even weeks required to master an EEG-controlled BCI. ECoG electrodes also hold an advantage over single-neuron recording electrodes: because they do not penetrate the brain, they are less invasive and may carry fewer risks.
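
One simple way to picture the extension from one dimension to two is to let two independent ECoG features, one per axis, each drive one component of cursor velocity. The sketch below is a hypothetical illustration of that scheme; the features, gains, and resting-level offsets are assumptions, not the Wadsworth group's published parameters.

```python
# Hypothetical sketch of 2D cursor control from two independent ECoG
# features, one per axis. Gains and offsets would be tuned during
# training; here they are assumed values on simulated data.
import numpy as np

rng = np.random.default_rng(3)

def band_power(ecog_channel):
    """Stand-in for a band-power feature from one ECoG electrode."""
    return (ecog_channel ** 2).mean()

gain = np.array([0.5, 0.5])     # per-axis gain (assumed)
offset = np.array([1.0, 1.0])   # resting-level feature value to subtract

cursor = np.zeros(2)
for step in range(50):
    ecog = rng.standard_normal((2, 128))      # two channels, one buffer
    features = np.array([band_power(ch) for ch in ecog])
    cursor += gain * (features - offset)      # horizontal and vertical
```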

“With a BCI controlled by ECoG signals, patients who have lost the ability to communicate could use a word processor, surf the Internet, or even control a prosthesis,” Schalk says. The group plans to extend their findings in more subjects, and hopes to develop a clinical device based on ECoG recordings.

New work from a group at Lund University in Sweden uses electromyography (EMG) signals to improve hand and arm prostheses.

Nils Danielsen, MD, PhD, and his colleagues recorded signals from volunteers who controlled the individual finger movements of a virtual hand while wearing a “data glove.” The glove had 18 joint-position sensors and allowed natural movements of the hand. A computer recorded the joint-angle positions as the subjects moved. At the same time, electrodes on the arm measured the electrical activity of the arm muscles. Together, this information was used to “train” the control system.

As training input arrived simultaneously from the glove and the muscles, the “self-learning” computer integrated the information and filled in the blanks. The computer was able to “learn” in real time, while the subjects were moving. “The control system could predict new unknown EMG recordings during the training period,” Danielsen says. “After a long enough training period, the data glove can be removed, and the person can control the virtual hand using real-time recorded EMG signals, relying on the previously stored data.”
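
The glove-plus-EMG scheme amounts to learning a mapping from muscle signals to joint angles while both are recorded, then running the mapping on EMG alone. The least-squares fit below is a stand-in sketch on simulated data; the article does not say what learning algorithm the Lund group actually uses.

```python
# Sketch of the training scheme described above: fit a linear map from
# EMG features to the glove's 18 joint angles, then drive the virtual
# hand from EMG alone. Least-squares is an assumed stand-in for the
# group's unspecified "self-learning" algorithm; all data are simulated.
import numpy as np

rng = np.random.default_rng(4)
n_samples, n_emg, n_joints = 500, 8, 18

# Paired training data recorded simultaneously: EMG features and the
# glove's joint-angle readings.
emg = rng.standard_normal((n_samples, n_emg))
true_map = rng.standard_normal((n_emg, n_joints))
angles = emg @ true_map + 0.05 * rng.standard_normal((n_samples, n_joints))

# "Training period": learn the EMG -> joint-angle mapping.
W, *_ = np.linalg.lstsq(emg, angles, rcond=None)

# "Glove removed": predict joint angles from new EMG alone.
new_emg = rng.standard_normal(n_emg)
predicted_angles = new_emg @ W    # drives the virtual hand
```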

For an amputee with one remaining hand, the data glove would be placed on the intact hand, and the EMG electrodes would record muscle activity from the stump of the amputated arm. The researchers expect these improvements to lead to the next generation of hand prostheses.

Schwartz and members of his research team (postdoctoral fellows Meel Velliste, PhD, and Beata Jarosiewicz, PhD, along with graduate students Chance Spalding and Gordon Kirkwood) have trained a monkey to use a motorized prosthetic arm to retrieve pieces of orange or cucumber. The lab had previously trained monkeys to control the movement of a 3D cursor in a virtual-reality display, and other groups have taught monkeys to control 2D computer cursors and other robotic devices. Those earlier studies required an initial measurement of real arm movement, which is obviously impossible for paralyzed subjects; that step was unnecessary in the current study.

In the current study, the prosthetic arm was positioned near the monkey's shoulder and made to simulate natural movement. The monkey moved the arm to bring a piece of fruit to its mouth. The investigators recorded the activity of neurons in the motor cortex, the area of the brain where intentional movements originate, to provide neural control of the robotic device. Each of these neurons has a preference for movement in a specific direction. When a motion is made, or even planned, in a cell's preferred direction, the cell increases its firing rate, or activity. “This robust relationship between motor cortical cell activity and intended movement direction allows us to use motor cortical signals to control the prosthetic device,” says Velliste.
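
The directional-tuning relationship Velliste describes is often illustrated with cosine tuning and a population vector: each cell fires most for its preferred direction, and summing preferred directions weighted by firing rate recovers the intended movement. The sketch below simulates that idea; the cell count, tuning parameters, and decoding details are illustrative assumptions, not the lab's actual decoder.

```python
# Sketch of directional tuning and population-vector decoding with
# simulated cells. Tuning parameters and noise are assumed values.
import numpy as np

rng = np.random.default_rng(5)
n_cells = 100
preferred = rng.uniform(0, 2 * np.pi, n_cells)   # preferred directions

def firing_rates(theta, baseline=10.0, depth=8.0):
    """Cosine tuning: rate peaks when movement matches preference."""
    return baseline + depth * np.cos(theta - preferred)

intended = np.pi / 3                             # intended direction
rates = firing_rates(intended) + rng.standard_normal(n_cells)

# Population vector: sum each cell's preferred direction weighted by
# its baseline-subtracted rate, then read off the angle.
weights = rates - rates.mean()
pv = np.array([np.sum(weights * np.cos(preferred)),
               np.sum(weights * np.sin(preferred))])
decoded = np.arctan2(pv[1], pv[0])
print(f"intended {intended:.2f} rad, decoded {decoded:.2f} rad")
```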

Although the monkey's neural signals controlled the speed and direction of the “hand,” computers calculated the details of the arm's movements, such as the joint angles and the forces needed to move the arm. Future work from the group will examine more complex tasks requiring precise movements of the arm, hand, and fingers. “Eventually, we hope to use this technology to provide disabled patients with the ability to interact naturally with their environment,” Spalding says.
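
A rough sense of that division of labor, where decoded intent specifies the hand's motion and a computer works out the joint geometry, comes from a toy inverse-kinematics calculation. The two-link planar arm, link lengths, and closed-form solution below are illustrative assumptions; the actual prosthetic arm has more joints and full dynamics.

```python
# Toy illustration of "the computer calculates the joint angles":
# closed-form inverse kinematics for an assumed two-link planar arm.
import numpy as np

L1, L2 = 0.3, 0.25   # assumed link lengths in metres

def two_link_ik(x, y):
    """Return shoulder and elbow angles placing the 'hand' at (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1**2 - L2**2) / (2 * L1 * L2)
    elbow = np.arccos(np.clip(cos_elbow, -1.0, 1.0))
    shoulder = np.arctan2(y, x) - np.arctan2(L2 * np.sin(elbow),
                                             L1 + L2 * np.cos(elbow))
    return shoulder, elbow

# Decoded hand target (e.g., from the velocity signal) -> joint angles.
print(two_link_ik(0.35, 0.2))
```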

http://apu.sfn.org/
