AI co-pilot boosts noninvasive brain-computer interface by interpreting user intent
Peer-Reviewed Publication
Last Updated: 1-Sep-2025 16:11 ET (1-Sep-2025 20:11 GMT/UTC)
UCLA engineers have developed a wearable, noninvasive brain-computer interface system that uses artificial intelligence as a co-pilot to help infer user intent and complete tasks by moving a robotic arm or a computer cursor. Published in Nature Machine Intelligence, the study shows that the system achieves a new level of performance among noninvasive brain-computer interface, or BCI, systems. The advance could lead to a range of technologies that help people with limited physical capabilities, such as those with paralysis or neurological conditions, handle and move objects more easily and precisely.