Scientists Establish Brain-Inspired Network Model to Bridge AI, Neuroscience
One of the key objectives in the current development of artificial intelligence (AI) is to build more general-purpose models with broader cognitive abilities, the journal Nature Computational Science reported.
The prevailing approach behind today's large models is to build larger, deeper, and wider neural networks following scaling laws, which can be described as pursuing general intelligence through "external complexity," said Li Guoqi, a researcher at the Institute of Automation.
However, this approach faces challenges such as unsustainable consumption of computing resources and energy, as well as a lack of interpretability.
By contrast, the human brain has about 100 billion neurons and nearly 1,000 trillion synaptic connections, with each neuron having a rich and diverse internal structure, yet it consumes only around 20 watts of power.
Inspired by the dynamics of biological neurons, the scientists, from the Institute of Automation and other institutions including Tsinghua University and Peking University, adopted an "internal complexity" approach to pursuing general intelligence.
Their experiments verified the effectiveness and reliability of the internal-complexity model on complex tasks. The work provides new methods and theoretical support for incorporating the dynamical characteristics of neuroscience into AI, and offers feasible ways to optimize and enhance the practical performance of AI models.