Groundbreaking Portable AI System Can Turn Thoughts into Text
The portable AI system, produced by the GrapheneX-UTS Human-centric Artificial Intelligence Centre at the University of Technology Sydney (UTS), could aid communication for people who are unable to speak due to illness or injury, including stroke or paralysis, the Innovation News Network reported.
It could also enable seamless communication between humans and machines, such as operating a bionic arm or robot.
The study has been selected as the spotlight paper at this year’s NeurIPS conference, a top-tier meeting that showcases world-leading research on Artificial Intelligence and Machine Learning.
In the study, participants silently read text passages while wearing the portable AI system – a cap recording electrical brain activity through their scalp using an electroencephalogram (EEG).
The EEG wave is segmented into distinct units that capture specific characteristics and patterns of brain activity. This is done by an AI model called DeWave, developed by the researchers, which translates EEG signals into words and sentences by learning from large quantities of EEG data.
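To illustrate the idea of turning a continuous signal into "distinct units", here is a minimal, hypothetical sketch: it windows a raw signal and assigns each window to its nearest entry in a codebook, producing a sequence of discrete tokens. This is not DeWave's actual architecture (which is not described in the article); the window length, codebook size, and random data are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

eeg = rng.standard_normal(1000)  # stand-in for one channel of raw EEG
window = 50                      # assumed samples per segment
segments = eeg[: len(eeg) // window * window].reshape(-1, window)

# A learned codebook would come from training; random vectors stand in here.
codebook = rng.standard_normal((16, window))

# Nearest-neighbour assignment: each segment becomes one discrete token ID.
dists = np.linalg.norm(segments[:, None, :] - codebook[None, :, :], axis=2)
tokens = dists.argmin(axis=1)

print(tokens.shape)  # one token per 50-sample window: (20,)
```

A downstream language model can then treat this token sequence much like text, which is what makes discrete encoding a natural bridge between brain signals and large language models.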
“This research represents a pioneering effort in translating raw EEG waves directly into language, marking a significant breakthrough in the field,” explained Professor CT Lin, Director of the GrapheneX-UTS HAI Centre and research leader.
“Our portable AI system is the first to incorporate discrete encoding techniques in the brain-to-text translation process, introducing an innovative approach to neural decoding.
“Integrating Large Language Models is also opening new frontiers in neuroscience and Artificial Intelligence.”
Unlike the new portable AI system, previous technologies for translating brain signals into language have required either surgery to implant electrodes in the brain, as with Elon Musk's Neuralink, or scanning in an MRI machine, which is large, expensive, and difficult to use daily.
These methods also struggle to transform brain signals into word-level segments without additional aids such as eye-tracking, which restricts the practical application of these systems.
The new portable AI system can be used with or without eye-tracking.
The UTS research was carried out with 29 participants. Because EEG waves differ between individuals, the system is likely to be more robust and adaptable than previous decoding technology, which has typically been tested on only one or two individuals.
Using EEG signals received through a cap rather than from electrodes implanted in the brain means that the signal is noisier. However, the study reported state-of-the-art performance in terms of EEG translation, surpassing previous benchmarks.
Yiqun Duan, a first author of the study, said: “The portable AI system is more adept at matching verbs than nouns. However, when it comes to nouns, we saw a tendency towards synonymous pairs rather than precise translations, such as ‘the man’ instead of ‘the author’.”
He added: “We think this is because semantically similar words might produce similar brain wave patterns when the brain processes these words.
“Despite the challenges, our model yields meaningful results, aligning keywords and forming similar sentence structures.”
The translation accuracy score is currently around 40% on BLEU-1. The BLEU score, a number between zero and one (often expressed as a percentage), measures how closely machine-translated text matches a set of high-quality reference translations.
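As a concrete illustration of the metric, here is a minimal BLEU-1 implementation: clipped unigram precision multiplied by a brevity penalty. The example sentences echo the "the man" versus "the author" mismatch mentioned above; the exact scoring pipeline used in the study may differ.

```python
import math
from collections import Counter

def bleu1(candidate: str, reference: str) -> float:
    """BLEU-1: clipped unigram precision times a brevity penalty."""
    cand = candidate.split()
    ref = reference.split()
    cand_counts = Counter(cand)
    ref_counts = Counter(ref)
    # Clip each candidate word's count by its count in the reference.
    clipped = sum(min(n, ref_counts[w]) for w, n in cand_counts.items())
    precision = clipped / len(cand)
    # Penalise candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * precision

score = bleu1("the man wrote the book", "the author wrote the book")
print(score)  # 4 of 5 unigrams match, equal lengths -> 0.8
```

Note that a near-synonym like "man" for "author" still counts as a miss, which is why semantically reasonable decodings can score well below 1.0.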
The researchers hope to see this improve to a level that is comparable to traditional language translation or speech recognition programmes, which is closer to 90%.