Chinese scientists have introduced a novel brain-inspired network model that addresses the heavy compute requirements and poor interpretability of conventional AI models. The approach, developed by researchers at the Institute of Automation, Tsinghua University, and Peking University, focuses on increasing "internal complexity" within neurons rather than simply enlarging neural networks.
Current AI models typically pursue general intelligence by scaling up the size and depth of neural networks, a strategy that is both resource-intensive and difficult to interpret. The new model instead draws inspiration from the human brain, which operates with roughly 100 billion neurons and on the order of 1,000 trillion synaptic connections while consuming only about 20 watts of power.
The internal complexity model enriches the dynamics within individual neurons, mimicking the brain's dynamic neuronal behavior and offering a more sustainable and interpretable alternative to existing methods. It has demonstrated effectiveness on complex tasks, pointing to a promising direction for future AI development.
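To make the idea concrete, here is a minimal toy sketch of the general contrast between scaling and internal complexity: a plain leaky integrate-and-fire (LIF) neuron versus the same neuron given one extra internal state variable (a spike-triggered adaptation current). The neuron model, parameters, and the `simulate` function are illustrative assumptions, not the authors' actual model; the point is only that richer per-neuron dynamics change a unit's behavior without adding more units.

```python
# Toy illustration (not the paper's model): a leaky integrate-and-fire
# neuron, optionally with a spike-triggered adaptation current "w" --
# one extra internal state variable standing in for "internal complexity".

def simulate(steps=1000, dt=1.0, i_ext=1.5, adaptation=0.0):
    """Euler-integrate a LIF neuron and return its spike count.

    adaptation > 0 adds an internal adaptation current that grows at
    each spike and decays between spikes, slowing sustained firing.
    """
    v, w, spikes = 0.0, 0.0, 0          # membrane potential, adaptation, count
    v_th, v_reset = 1.0, 0.0            # spike threshold and reset value
    tau_v, tau_w = 20.0, 100.0          # membrane and adaptation time constants
    for _ in range(steps):
        v += dt * (-v + i_ext - w) / tau_v   # leaky integration of input
        w += dt * (-w) / tau_w               # adaptation current decays
        if v >= v_th:                        # threshold crossing: emit a spike
            v = v_reset
            w += adaptation                  # spike-triggered adaptation jump
            spikes += 1
    return spikes

plain = simulate(adaptation=0.0)  # simple unit: fires regularly
rich = simulate(adaptation=0.5)   # richer unit: same input, adapted firing
print(plain, rich)
```

Running the sketch shows the internally richer neuron firing fewer times under identical input, a simple instance of how added internal dynamics, rather than added neurons, can reshape a network's responses.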
The research, published in *Nature Computational Science*, represents a notable step toward AI systems that are more efficient and that better reflect the brain's complex functionality.