Brain-computer interface predicts patient’s thoughts


New scientific research presented this week at the Society for Neuroscience 2022 conference by California Institute of Technology (Caltech) scientists shows that a brain-machine interface (BMI), also known as a brain-computer interface (BCI), can predict a person's internal monologue with a high degree of precision.

Proof of concept for a successful internal speech BMI

Brain-machine interfaces allow people unable to speak due to neurological conditions such as amyotrophic lateral sclerosis (ALS) to control external devices to communicate, use smartphones, type emails, shop online, and perform other functions in order to live more independently.

“This work represents the first proof-of-concept for a high-performance internal speech BMI,” the Caltech researchers wrote in their latest study.

Scientists hypothesized that different regions of the brain would modulate during vocalized speech compared to internal speech. Specifically, the researchers were testing their theory that during vocalized speech, activity in both the supramarginal gyrus (SMG) in the posterior parietal cortex (PPC) and the primary somatosensory cortex (S1) would modulate, whereas during internal speech, only SMG activity would modulate.

The study participant was a tetraplegic (quadriplegic) with a previous spinal cord injury. The participant was implanted with a 96-channel multi-electrode array, the Neuroport Array from Blackrock Microsystems, in the supramarginal gyrus (SMG) and left ventral premotor cortex (PMv) areas, along with two 48-channel microelectrode arrays in the primary somatosensory cortex (S1).

Caltech researchers chose to use an invasive brain-machine interface in an effort to achieve a favorable signal-to-noise ratio and resolution, rather than non-invasive brain recording technologies such as magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), or electroencephalography (EEG).

The participant’s brain activity was recorded by the implanted arrays while the participant thought or internally spoke six words and two pseudo-words. The researchers characterized four linguistic processes at the neural level: vocalized speech production, word reading, auditory comprehension, and internal speech. They observed that internal speech is highly decodable in the supramarginal gyrus.

“In this work, we demonstrated a robust decoder for internal and vocalized speech, capturing single neuron activity from the supramarginal gyrus,” the Caltech researchers wrote. “A chronically implanted quadriplegic participant with speech ability was able to use an online, closed-loop internal speech BMI to achieve classification accuracy of up to 91% with an eight-word vocabulary.”
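To make the decoding step concrete: the article describes classifying multi-channel neural activity into one of eight words. Below is a minimal, hypothetical sketch of that kind of word classification, using simulated firing-rate data and a simple nearest-centroid decoder. The 96-channel count matches the implanted array described above, but the data, the decoder, and every number here are illustrative assumptions, not the study's actual method or results.

```python
# Hypothetical sketch: decoding an 8-word vocabulary from multi-channel
# firing rates with a nearest-centroid classifier. All data here are
# simulated; this is not the Caltech team's decoder.
import numpy as np

rng = np.random.default_rng(0)
n_words, n_channels, n_trials = 8, 96, 40  # 96 channels as in the implant

# Simulate a distinct mean firing-rate pattern per word, plus noisy trials.
word_means = rng.uniform(5.0, 30.0, size=(n_words, n_channels))
X = np.concatenate([m + rng.normal(0.0, 2.0, size=(n_trials, n_channels))
                    for m in word_means])
y = np.repeat(np.arange(n_words), n_trials)

# Split trials into train/test halves.
idx = rng.permutation(len(y))
half = len(y) // 2
train, test = idx[:half], idx[half:]

# "Training": estimate one centroid (mean rate vector) per word.
centroids = np.stack([X[train][y[train] == w].mean(axis=0)
                      for w in range(n_words)])

# "Decoding": assign each test trial to the nearest centroid.
dists = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"decoding accuracy: {accuracy:.2%}")
```

On this cleanly separable synthetic data the toy decoder scores near 100%; real neural recordings are far noisier, which is why the study's 91% online accuracy on an eight-word vocabulary is notable.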

With this demonstrated proof of concept, the researchers believe that the supramarginal gyrus brain region has the potential to represent an even larger internal vocabulary.

“By directly building models on internal speech, our findings may translate to people who cannot vocalize or who are completely locked in,” the researchers concluded.

Copyright © 2022 Cami Rosso All rights reserved.
