Current AI models often rely on scaling up the size and depth of neural networks in pursuit of general intelligence. That approach, however, is both resource-intensive and difficult to interpret. The new model instead draws inspiration from the human brain, which operates with roughly 100 billion neurons and 1,000 trillion synaptic connections while consuming only about 20 watts of power.
Rather than adding more neurons, the internal-complexity model enriches the dynamics within each neuron, mimicking the brain's intricate neuronal interactions and offering a more sustainable and interpretable alternative to sheer scale. It has demonstrated effectiveness on complex tasks, pointing to a promising direction for future AI development.
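The article does not give the model's equations, but the core idea of trading network scale for per-neuron richness can be illustrated with a minimal sketch. Below, a basic leaky integrate-and-fire (LIF) neuron with a single internal state variable is compared to a hypothetical variant with one extra internal variable (a slow adaptation current). All parameter names and values here are illustrative assumptions, not the published model: the point is only that adding internal state changes a neuron's response without adding any neurons.

```python
def simulate_lif(i_ext, t_steps=1000, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """Basic LIF neuron: one internal state variable (membrane potential v).

    Illustrative parameters only; not taken from the published model.
    """
    v = 0.0
    spikes = 0
    for _ in range(t_steps):
        v += dt * (-v / tau + i_ext)      # leaky integration of input current
        if v >= v_th:                     # threshold crossing -> spike
            spikes += 1
            v = v_reset
    return spikes


def simulate_adaptive_lif(i_ext, t_steps=1000, dt=0.1, tau=10.0, v_th=1.0,
                          v_reset=0.0, tau_w=100.0, b=0.2):
    """LIF with a second internal variable w (spike-triggered adaptation).

    The extra internal dynamics make the same single neuron behave more
    richly -- firing slows as w accumulates -- without adding neurons.
    """
    v, w = 0.0, 0.0
    spikes = 0
    for _ in range(t_steps):
        v += dt * (-v / tau - w + i_ext)  # adaptation current w opposes input
        w += dt * (-w / tau_w)            # w decays slowly between spikes
        if v >= v_th:
            spikes += 1
            v = v_reset
            w += b                        # each spike strengthens adaptation
    return spikes


print(simulate_lif(0.5), simulate_adaptive_lif(0.5))
```

With the same constant input, the adaptive neuron fires fewer spikes than the plain LIF neuron, because its extra internal variable reshapes the output over time. The published work pursues this direction with far richer single-neuron dynamics (in the spirit of Hodgkin-Huxley-style models), but this sketch captures the contrast the article describes between scaling outward and enriching inward.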
This research, published in *Nature Computational Science*, represents a significant advancement in AI, potentially leading to more efficient and intelligent systems that better replicate the brain’s complex functionality.