
Brain Interface Speaks Your Thoughts In Near Real-time
Longtime Slashdot reader backslashdot writes: Commentary, a video, and a publication in this week's Nature Neuroscience herald a significant advance in brain-computer interface (BCI) technology: enabling speech by decoding electrical activity in the brain's sensorimotor cortex in real time. Researchers from UC Berkeley and UCSF used deep-learning recurrent neural network transducer models to decode neural signals in 80-millisecond windows, generating fluent, intelligible speech tailored to each participant's pre-injury voice. Unlike earlier methods that synthesized speech only after a full sentence was completed, this system can detect and vocalize words within just three seconds. Decoding relies on a 253-electrode array implanted on the surface of the brain over the sensorimotor cortex. The code and dataset needed to replicate the main findings of the study are available in the Chang Lab's public GitHub repository.
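
The summary describes a streaming setup: a recurrent model consumes fixed 80-millisecond windows of multichannel neural activity and emits speech tokens incrementally instead of waiting for a finished sentence. The sketch below illustrates that decode loop in outline only. It is not the Chang Lab's code: the GRU-plus-linear-head model, the 200 Hz feature rate, the 16-frames-per-window figure, and the 41-token output inventory are illustrative assumptions, and the full RNN transducer is simplified here to a causal RNN with a per-frame output head.

```python
"""Minimal sketch of streaming decoding over 80 ms windows of 253-channel
neural features. Hypothetical stand-in for the published model, not the
Chang Lab implementation."""
import torch
import torch.nn as nn

NUM_CHANNELS = 253      # electrode channels, per the article
FRAMES_PER_CHUNK = 16   # assumed: 80 ms of features at an assumed 200 Hz rate
NUM_TOKENS = 41         # assumed: phoneme-like output inventory


class StreamingDecoder(nn.Module):
    """Causal GRU encoder plus linear output head; recurrent state is carried
    across chunks so each 80 ms window is decoded without waiting for the
    full sentence."""

    def __init__(self, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(NUM_CHANNELS, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, NUM_TOKENS)

    def forward(self, chunk, state=None):
        # chunk: (batch, FRAMES_PER_CHUNK, NUM_CHANNELS)
        out, state = self.rnn(chunk, state)
        logits = self.head(out)  # per-frame token logits
        return logits, state


if __name__ == "__main__":
    torch.manual_seed(0)
    model = StreamingDecoder().eval()
    state = None
    # Simulate ten consecutive 80 ms windows of neural features.
    for step in range(10):
        chunk = torch.randn(1, FRAMES_PER_CHUNK, NUM_CHANNELS)
        with torch.no_grad():
            logits, state = model(chunk, state)
        tokens = logits.argmax(dim=-1).squeeze(0).tolist()
        print(f"window {step}: decoded token ids {tokens[:4]}...")
```

Carrying the recurrent state across windows is what keeps latency tied to the window length rather than the sentence length; in the published system, the incremental token stream would additionally drive a speech synthesizer conditioned on the participant's pre-injury voice.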