A non-invasive brain-computer interface that helps control things just by thinking

Researchers have shown that a non-invasive, AI-powered brain-computer interface (BCI) can allow individuals to track a moving cursor on a screen simply by thinking about it.

Brain chips, such as those made by Elon Musk's Neuralink and the Bill Gates-backed Synchron, fall into two categories: invasive or minimally invasive. These devices are implanted either directly inside brain tissue or elsewhere inside the skull.

In the new study, researchers at Carnegie Mellon University point out that a non-invasive brain-computer interface, which works by analyzing brain waves recorded through electroencephalography (EEG), offers several advantages: it is safer, more cost-effective, and usable by a far wider range of patients, as well as by the general population.

Deep learning has made non-invasive brain-computer interfaces far more capable, but they are still less accurate than invasive devices. Because they collect data through external sensors that never touch brain tissue, disturbances in the user's surroundings can also degrade their performance.

According to the Carnegie Mellon University researchers, AI-based deep neural networks can solve this problem. They are more sophisticated than the simpler artificial neural networks used for facial recognition, speech recognition, and other comparatively narrow tasks.

A deep neural network can perform more complex tasks, allowing the brain-computer interface to extract accurate results even from large data sets riddled with distortion and noise.
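
The paper itself does not include code, but the idea can be sketched. Below is a minimal example, not the authors' model, of the kind of compact convolutional network that can regress cursor velocity from windows of multichannel EEG; the channel count, window length, and layer sizes are illustrative assumptions.

```python
# A minimal sketch, NOT the authors' model: a compact convolutional
# network that regresses 2-D cursor velocity from a one-second window
# of multichannel EEG. Channel count, window length, and layer sizes
# are illustrative assumptions.
import torch
import torch.nn as nn

class EEGCursorDecoder(nn.Module):
    def __init__(self, n_channels: int = 64, n_samples: int = 250):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learns band-pass-like filters.
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            # Spatial convolution: mixes information across electrodes.
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(0.5),
        )
        self.head = nn.Linear(32 * (n_samples // 4), 2)  # (vx, vy)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, channels, samples), band-passed EEG.
        return self.head(self.features(x).flatten(start_dim=1))

decoder = EEGCursorDecoder()
window = torch.randn(8, 1, 64, 250)   # eight 1 s windows at 250 Hz
velocity = decoder(window)            # (8, 2) predicted cursor velocity
```

Splitting the first stage into a temporal filter bank followed by a spatial mixing layer mirrors common EEG decoder designs, and it is one reason such networks can cope with the noise that scalp recordings pick up.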

During the study, 28 participants were able to continuously track an on-screen cursor using only their thoughts.

Each participant was fitted with a non-invasive brain-computer interface (BCI) in the form of an EEG cap, which recorded their brain activity through the scalp.

The recorded EEG data was then used to train a deep neural network.
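
As a rough illustration of that training step, here is a hedged sketch: the decoder from the earlier example is fit by regressing its predicted cursor velocity against the velocity of the target the participant was tracking. The random tensors and hyperparameters are stand-ins, not values from the study.

```python
# A hedged training sketch: fit the decoder above by regressing its
# predicted cursor velocity against the velocity of the tracked target.
# The synthetic tensors and hyperparameters are stand-ins, not values
# from the study.
import torch
from torch.utils.data import DataLoader, TensorDataset

decoder = EEGCursorDecoder()          # class from the earlier sketch
eeg = torch.randn(1024, 1, 64, 250)   # EEG windows (synthetic stand-in)
vel = torch.randn(1024, 2)            # matching target velocities
loader = DataLoader(TensorDataset(eeg, vel), batch_size=32, shuffle=True)

optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

decoder.train()
for epoch in range(10):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(decoder(x), y)  # decoded vs. true velocity
        loss.backward()
        optimizer.step()
```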

“This network was able to directly understand what participants intended to do with the cursor that was constantly moving on the screen, just by analyzing data from brain-computer interface (BCI) sensors,” the study authors noted.

The results of the current study suggest that in the future, non-invasive AI-powered brain-computer interfaces (BCIs) could help individuals control external devices without using their hands or muscles.
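
A closed-loop control layer of that kind might look like the following hypothetical sketch, in which each new EEG window is decoded into a velocity that nudges the cursor. The acquisition and cursor-movement helpers are made-up stubs, not a real driver or operating-system API.

```python
# A hypothetical closed-loop sketch: each new EEG window is decoded
# into a velocity that nudges the cursor. acquire_eeg_window() and
# move_cursor() are made-up stubs, not a real driver or OS API.
import torch

def acquire_eeg_window() -> torch.Tensor:
    # Stand-in for a real acquisition driver; returns one decoder input.
    return torch.randn(1, 1, 64, 250)

def move_cursor(pos: torch.Tensor) -> None:
    # Stand-in for an OS/UI call that places the cursor on screen.
    print(f"cursor -> ({pos[0].item():.2f}, {pos[1].item():.2f})")

decoder = EEGCursorDecoder()          # architecture from the first sketch
decoder.eval()
position = torch.tensor([0.5, 0.5])   # normalized screen coordinates
gain, dt = 0.05, 0.04                 # control gain, 25 Hz update tick

with torch.no_grad():
    for _ in range(250):              # ten seconds of simulated control
        velocity = decoder(acquire_eeg_window())[0]
        position = (position + gain * dt * velocity).clamp(0.0, 1.0)
        move_cursor(position)
```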

This could make it easier for humans to interact with technology, allow scientists to study human brain function in greater detail, and improve quality of life for amputees and people with disabilities.

The study was published in the journal PNAS Nexus.
