New breakthrough in the development of more natural and continuous BCI control systems

For the first time, researchers at TU Graz have read out the intention of a continuous movement from non-invasive brain signals. This success enables more natural, non-invasive control of neuroprostheses in real time.

Intended to give paraplegic people back some freedom of movement and thus a better quality of life, so-called brain-computer interfaces (BCIs) measure a person's brain activity and convert the electrical signals into control commands for neuroprostheses. "Controlling by thoughts," as Gernot Müller-Putz puts it in simplified terms. The head of the Institute of Neural Engineering at Graz University of Technology (TU Graz) is an "old hand" at BCI research and is intensively involved with non-invasive BCI systems.

He and his team have achieved promising initial results with EEG-based control of neuroprostheses and robotic arms in people with spinal cord injuries over the last ten years. Until now, however, control was unnatural and cumbersome, because the same thought patterns had to be imagined over and over. As part of his recently completed ERC Consolidator Grant project "Feel your Reach", Müller-Putz and his team have now achieved a breakthrough in the development of more natural and continuous BCI control systems.

It all comes down to seeing

The TU Graz researchers have succeeded for the first time in controlling a robotic arm purely by thought in real time, using only a non-invasive EEG cap. This was made possible by decoding continuous movement intention from brain signals - something previously impossible. To do so, the researchers first examined a variety of movement parameters, such as position, speed and distance, and extracted their correlates from the neuronal activity.
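To illustrate the principle (this is a minimal sketch on synthetic data, not the project's actual decoding pipeline), extracting a continuous kinematic parameter such as velocity from EEG features is often framed as a regularized linear regression; all dimensions and signals below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: decode a continuous kinematic parameter (e.g. hand
# velocity) from band-limited EEG features. Channel count, sample count and
# the linear generative model are illustrative assumptions.
n_samples, n_channels = 2000, 32
true_weights = rng.normal(size=n_channels)
eeg = rng.normal(size=(n_samples, n_channels))            # EEG feature matrix
velocity = eeg @ true_weights + 0.1 * rng.normal(size=n_samples)

# Ridge regression: w = (X^T X + lambda I)^-1 X^T y
lam = 1.0
w = np.linalg.solve(eeg.T @ eeg + lam * np.eye(n_channels), eeg.T @ velocity)

# Decoded trajectory and its correlation with the true one
decoded = eeg @ w
corr = np.corrcoef(velocity, decoded)[0, 1]
```

A linear read-out like this can be applied sample by sample, which is what makes continuous, real-time control possible in the first place.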

"The contribution of the eyes is essential here. It is important that users are allowed to use their eyes to follow the trajectory of the robotic arm."

Gernot Müller-Putz, Head, Institute of Neural Engineering, Graz University of Technology

However, eye movements and eye blinks generate their own electrical signals, so-called ocular artifacts, in the EEG. "These artifacts distort the EEG signal and therefore have to be removed in real time. At the same time, it is essential that eye-hand coordination can take place and thus contribute to the decoding of movement intentions," Müller-Putz explains. In other words, the visual information helps to capture the intention to move; the unwanted signals of the eyes themselves, however, have to be filtered out of the electrical activity computationally.
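One standard way to do this filtering is regression-based artifact correction: the propagation of the ocular signal into each EEG channel is estimated from reference EOG electrodes near the eyes and then subtracted. The sketch below uses synthetic signals and illustrative channel counts; it shows the technique in general, not the study's specific implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic signals: two EOG reference channels and three EEG channels.
n = 5000
eog = rng.normal(size=(n, 2))                       # ocular activity
prop = np.array([[0.8, 0.2, 0.1],                   # how EOG channel 1
                 [0.3, 0.6, 0.1]])                  # and 2 leak into EEG
brain = rng.normal(size=(n, 3))                     # true neural signal
eeg = brain + eog @ prop                            # contaminated EEG

# Least-squares estimate of the propagation coefficients:
# b = (EOG^T EOG)^-1 EOG^T EEG, then subtract the fitted ocular part.
b = np.linalg.solve(eog.T @ eog, eog.T @ eeg)
cleaned = eeg - eog @ b

# After correction, the residual ocular contamination is near zero
# while the neural signal is preserved.
resid = abs(np.corrcoef(cleaned[:, 0], eog[:, 0])[0, 1])
recovery = np.corrcoef(cleaned[:, 0], brain[:, 0])[0, 1]
```

Because the correction is a fixed linear operation once the coefficients are known, it can run sample by sample in real time, as the article requires.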

BCI detects unwanted movements

One of the BCIs developed by the researchers can also recognize whether a person wants to start a movement, i.e., it detects the onset of a goal-directed movement. In addition, another of the research team's BCIs detects and corrects errors, i.e., unwanted movements of the robotic arm; one more piece of the puzzle for more natural prosthetic control. "The brain's error response can be read from the EEG. The BCI recognizes that the movement performed does not correspond to the person's intention and stops the movement of the robotic arm or resets it to the starting position," says Müller-Putz. In the project, error detection was successfully tested several times with people with spinal cord injuries.
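The stop-and-reset behaviour described above can be sketched as a toy control loop. A hypothetical template-matching detector stands in for the project's actual error-potential classifier, and the error-related deflection is simulated; everything here is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

fs = 64
t = np.arange(fs) / fs
# Idealised error-related EEG deflection (hypothetical shape, ~350 ms peak).
template = np.exp(-((t - 0.35) ** 2) / 0.005)

def detect_errp(epoch, threshold=1.0):
    """Match an EEG epoch against the error template; fire above threshold."""
    score = float(epoch @ template / np.linalg.norm(template))
    return score > threshold

# Simulated control loop: the arm advances step by step until an error
# response is detected, then the movement stops and the arm is reset
# to the starting position, as described in the article.
position = 0.0
for step in range(5):
    noise = 0.3 * rng.normal(size=fs)
    epoch = 2.0 * template + noise if step == 3 else noise  # error at step 3
    if detect_errp(epoch):
        position = 0.0  # reset to the beginning
        break
    position += 0.1
```

In a real system the detector would be a trained classifier rather than a fixed template, but the control logic - detect, stop, reset - is the same.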

People can feel movements of the robotic arm

The TU Graz researchers were also successful with so-called kinaesthetic feedback. "The participants not only see the movements of the prosthesis, they also feel them," says a visibly pleased Müller-Putz. Technically, this was made possible with the help of vibration sensors attached to the skin over the shoulder blade, which track the movements of the robotic arm with finely graded vibrations. In principle, completely paralyzed people could also feel movements this way. "However, we would have to consider applying them in the area of the neck," says Müller-Putz, alluding to future goals. First and foremost, the researchers want to improve the decoding of movement from visual, intentional and kinaesthetic information, use it to detect errors, and unite all four BCI systems in a "quadruple BCI system".
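How arm motion might be translated into vibration strength can be sketched as a simple mapping from arm speed to the drive level of a vibration element. The function name, speed range and 8-bit drive scale are illustrative assumptions, not the project's implementation:

```python
import numpy as np

def vibration_intensity(speed, max_speed=1.0, max_duty=255):
    """Map robot-arm speed to an 8-bit drive level for a vibration element.

    Hypothetical mapping: speed is clipped to [0, max_speed] and scaled
    linearly, so faster arm movement produces stronger vibration.
    """
    return int(np.clip(speed / max_speed, 0.0, 1.0) * max_duty)

print(vibration_intensity(0.5))   # → 127
```

A linear mapping like this keeps the feedback continuous, matching the "finely graded vibrations" the article describes.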
