Super-Turing AI: Artificial intelligence has already proven its skill at handling complex tasks that would take humans a lifetime to complete, from analysing vast amounts of data to performing complex calculations.
But do we know the actual cost of using AI? AI systems consume large amounts of energy to function.
Current AI models rely on expansive data centres, consuming enormous amounts of electricity to support their computing power. The human brain, an incredibly efficient organ, achieves comparable feats of computation on a minuscule energy budget of roughly 20 watts.
Energy-efficient AI
A team of researchers, including Dr. Suin Yi from Texas A&M University, has made a breakthrough that could help close this energy gap, paving the way for more sustainable AI systems.
The new approach, known as “Super-Turing AI,” mimics the brain’s processing methods to make AI more energy-efficient without sacrificing performance. This development could have massive implications for the future of AI, especially as companies race to build larger and more powerful models.
Current AI models, including those behind large language systems such as OpenAI's ChatGPT, rely on massive computing power and huge data centres. Training and memory, two key components of AI systems, are often handled separately in current hardware. This separation requires data to be constantly moved between different hardware components, leading to inefficiencies that demand significant energy resources.
Super-Turing AI revolutionises this by closing the gap between training and memory. Rather than separating these two processes, the new AI integrates them into one cohesive system, reducing the need for large amounts of data migration. The concept is inspired by the way the human brain functions, where memory and learning are inherently connected through neurons and synaptic connections.
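The contrast can be sketched in a few lines of code. In the toy class below, each synapse both stores its weight and updates it locally, so learning never requires shuttling data to a separate memory unit. This is purely illustrative; the actual research implements this principle in analogue hardware, not software.

```python
class Synapse:
    """Toy synapse where the weight is stored and updated in the
    same place, with no shuttling between separate memory and
    compute units (the inefficiency described above)."""

    def __init__(self, w=0.0):
        self.w = w  # the weight lives here, permanently

    def forward(self, x):
        # Inference: use the locally stored weight.
        return self.w * x

    def learn(self, pre, post, lr=0.01):
        # Learning: update in place, using only signals available
        # at this synapse (pre- and post-synaptic activity).
        self.w += lr * pre * post
```

In a conventional accelerator, by contrast, weights are fetched from memory, used for computation, and written back, and it is this constant traffic that dominates the energy bill.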
Reducing energy consumption in AI
This research could significantly reduce energy consumption in AI systems. While modern data centres consume power in the range of gigawatts, the human brain operates on 20 watts, a vast difference in efficiency. Super-Turing AI aims to bring this same level of energy efficiency to AI by integrating learning and memory processes, similar to how the brain operates.
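To put those figures side by side (the one-gigawatt figure is an order-of-magnitude assumption for a large data centre, not a number from the research):

```python
data_centre_power_w = 1e9  # ~1 gigawatt, order-of-magnitude assumption
brain_power_w = 20         # the brain's power budget cited above

# How many brains could run on one data centre's power draw?
ratio = data_centre_power_w / brain_power_w
print(f"{ratio:,.0f}x")  # roughly 50,000,000x
```

At that scale, even a modest step towards brain-like efficiency translates into enormous absolute savings.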
In addition to reducing power consumption, the new system minimises reliance on backpropagation, a computationally expensive method used to adjust neural networks during training. Backpropagation has been a cornerstone of AI development, but it is not biologically plausible and requires a lot of energy. Instead, Super-Turing AI uses biologically inspired mechanisms like Hebbian learning and spike-timing-dependent plasticity, which more closely resemble the way neurons in the brain strengthen their connections based on activity patterns.
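The article does not give the equations, but the textbook forms of both rules are simple and local: each weight changes based only on the activity of the two neurons it connects, with no global backward pass. A minimal sketch (the learning rates and time constants below are illustrative defaults, not values from the research):

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.01):
    """Hebbian rule: a weight grows when pre- and post-synaptic
    activity coincide ("neurons that fire together wire together").
    w has shape (n_post, n_pre); pre and post are activity vectors."""
    return w + lr * np.outer(post, pre)

def stdp_update(w, dt, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Spike-timing-dependent plasticity for a single weight.
    dt = t_post - t_pre in milliseconds: potentiate when the
    pre-synaptic spike precedes the post-synaptic one (dt > 0),
    depress when it follows (dt < 0), fading exponentially with |dt|."""
    if dt > 0:
        return w + a_plus * np.exp(-dt / tau)
    return w - a_minus * np.exp(dt / tau)
```

Unlike backpropagation, neither rule needs the error signal to travel backwards through the whole network, which is what makes them both biologically plausible and cheap to realise in hardware.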
The future of AI
The researchers demonstrated the power of this new approach with a drone that learned to navigate a complex environment without prior training. The drone adapted to its surroundings on the fly, learning in real time while consuming less energy than traditional AI systems.
This breakthrough matters because demand for AI continues to grow while the environmental and economic costs of sustaining large AI systems become ever more apparent. Data centres' enormous energy consumption not only adds to operational costs but also contributes significantly to carbon emissions, raising concerns about the sustainability of current AI technologies.
As AI continues to evolve, the need for sustainable computing methods is becoming more important. Super-Turing AI represents a promising solution to this challenge. The system’s ability to reduce power consumption while maintaining or even improving AI performance could make it a key player in AI’s future, where hardware innovation will be just as important as advancements in software.