Wednesday, August 20, 2025

A brain-computer interface that reads inner thoughts.

Inampudi describes work by Kunz et al., who isolated signals from a brain implant so that people with movement disorders could voice thoughts without trying to speak. Here are the highlights and summary of the work:

Highlights

• Attempted, inner, and perceived speech have a shared representation in motor cortex
• An inner-speech BCI decodes general sentences with improved user experience
• Aspects of private inner speech can be decoded during cognitive tasks like counting
• High-fidelity solutions can prevent a speech BCI from decoding private inner speech

Summary

Speech brain-computer interfaces (BCIs) show promise in restoring communication to people with paralysis but have also prompted discussions regarding their potential to decode private inner speech. Separately, inner speech may be a way to bypass the current approach of requiring speech BCI users to physically attempt speech, which is fatiguing and can slow communication. Using multi-unit recordings from four participants, we found that inner speech is robustly represented in the motor cortex and that imagined sentences can be decoded in real time. The representation of inner speech was highly correlated with attempted speech, though we also identified a neural “motor-intent” dimension that differentiates the two. We investigated the possibility of decoding private inner speech and found that some aspects of free-form inner speech could be decoded during sequence recall and counting tasks. Finally, we demonstrate high-fidelity strategies that prevent speech BCIs from unintentionally decoding private inner speech.
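To make the "motor-intent" idea concrete: the summary says inner and attempted speech share a representation in motor cortex but differ along a neural dimension tied to the intent to physically speak, and that such differences can be used to keep a decoder from acting on private inner speech. The sketch below is not the authors' pipeline; it is a minimal, hypothetical Python illustration using simulated firing rates, with a linear discriminant standing in for the motor-intent dimension and an assumed threshold standing in for a gating strategy.

```python
# Illustrative sketch only: synthetic data, assumed channel counts and thresholds.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_channels, n_trials = 96, 200  # hypothetical multi-unit array size and trial count

# Simulated trial-averaged firing rates: attempted speech carries an extra
# "motor-intent" component on top of a representation shared with inner speech.
shared = rng.normal(0.0, 1.0, size=(n_trials, n_channels))
motor_intent_axis = rng.normal(0.0, 1.0, size=n_channels)
motor_intent_axis /= np.linalg.norm(motor_intent_axis)

attempted = shared[: n_trials // 2] + 1.5 * motor_intent_axis  # label 1
inner = shared[n_trials // 2 :]                                # label 0
X = np.vstack([attempted, inner])
y = np.concatenate([np.ones(n_trials // 2), np.zeros(n_trials // 2)])

# Fit a one-dimensional discriminant: its projection plays the role of a
# motor-intent dimension that separates the two conditions.
lda = LinearDiscriminantAnalysis(n_components=1)
proj = lda.fit_transform(X, y).ravel()
print("mean projection, attempted:", proj[y == 1].mean())
print("mean projection, inner:    ", proj[y == 0].mean())

# A gate like this could, in principle, let a decoder ignore activity lacking a
# clear motor-intent signature, one conceivable way to avoid decoding private
# inner speech. The threshold here is assumed, purely for illustration.
threshold = proj[y == 1].mean() / 2
decode_enabled = proj > threshold
print("fraction of inner-speech trials passing the gate:",
      decode_enabled[y == 0].mean())
```

In this toy setup the inner-speech trials fall well below the gate, so the (hypothetical) decoder would simply stay silent for them, which is the spirit of the safeguards the summary describes.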
