AI Self-Talk: Boosting Learning & Multitasking

by Priyanka Patel

AI Learns to ‘Talk to Itself’ for Enhanced Multitasking and Problem-Solving

A new study demonstrates that equipping artificial intelligence with a form of “inner speech” and a robust working memory substantially boosts its ability to learn, adapt, and handle complex tasks – mirroring cognitive processes found in humans.

“We’re finding that the way we structure training data – essentially, teaching our system to ‘talk to itself’ – is crucial. We show that learning is shaped not only by the architecture of our AI systems, but by the interaction dynamics embedded within our training procedures,” said Dr. Jeffrey Queißer of the Okinawa Institute of Science and Technology (OIST).

The researchers combined self-directed “mumbling” – essentially prompting the AI to reiterate details – with a novel working memory architecture to achieve these gains. This approach improved the AI models’ ability to learn new skills, adapt to unforeseen situations, and effectively multitask.

Brain-Inspired AI: Moving Beyond Rote Learning

The OIST team’s work centers on content-agnostic information processing – the ability of AI to apply learned methods to situations beyond those explicitly encountered during training. This remains a significant hurdle for current AI systems.

“Rapid task switching and solving unfamiliar problems is something we humans do easily every day, but for AI, it’s much more challenging,” Dr. Queißer noted. “That’s why we take an interdisciplinary approach, blending developmental neuroscience and psychology with machine learning and robotics to find new ways to think about learning and inform the future of AI.”

The initial focus was on working memory, the cognitive system responsible for temporarily holding and manipulating information. From remembering instructions to performing mental calculations, working memory is crucial for a wide range of cognitive functions. The researchers simulated tasks of varying difficulty and discovered that AI systems with multiple “working memory slots” – temporary storage containers for information – demonstrated superior generalization capabilities, notably when tasked with reversing sequences or recreating patterns.
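The idea of multiple working-memory slots can be illustrated with a toy sketch. Note this is purely illustrative and makes assumptions: the study's actual model is an active-inference architecture not described here, and the class name `SlotMemory` and its methods are hypothetical, not from the paper. The sketch simply shows how fixed-capacity slots support a reverse-sequence task like the one the researchers tested.

```python
class SlotMemory:
    """Toy working memory with a fixed number of slots.

    Illustrative only: a stand-in for the paper's working-memory
    component, not a reimplementation of it.
    """

    def __init__(self, n_slots):
        self.n_slots = n_slots
        self.slots = []

    def store(self, item):
        # When capacity is reached, the oldest slot is overwritten,
        # mimicking working memory's limited span.
        if len(self.slots) >= self.n_slots:
            self.slots.pop(0)
        self.slots.append(item)

    def recall_reversed(self):
        # Reverse-sequence task: report stored items newest-first.
        return list(reversed(self.slots))


mem = SlotMemory(n_slots=4)
for token in ["A", "B", "C"]:
    mem.store(token)
print(mem.recall_reversed())  # prints ['C', 'B', 'A']
```

With more slots, longer sequences can be reversed faithfully; once a sequence exceeds the slot count, early items are lost, which is one intuition for why slot capacity affects generalization on such tasks.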

The Power of ‘Self-Mumbling’

Adding a “self-mumbling” component – instructing the system to internally repeat information a specified number of times – further enhanced performance, especially when the AI was required to multitask or navigate multi-step processes.
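The intuition behind internal repetition can be sketched as a toy rehearsal model: each repetition boosts an item's activation while all traces decay over time. This is an assumption-laden illustration, not the paper's mechanism; the function `rehearse` and the `boost` and `decay` parameters are hypothetical.

```python
def rehearse(trace, item, repeats, boost=1.0, decay=0.9):
    """Toy rehearsal loop (illustrative only, not from the paper).

    Each internal repetition of `item` adds `boost` to its
    activation; every step, all stored traces decay by `decay`.
    """
    for _ in range(repeats):
        trace = {k: v * decay for k, v in trace.items()}
        trace[item] = trace.get(item, 0.0) + boost
    return trace


trace = {}
trace = rehearse(trace, "goal", repeats=3)   # mumbled three times
trace = rehearse(trace, "noise", repeats=1)  # mentioned once
# "goal" ends with a stronger trace than "noise"
```

The point of the sketch: repeating task-relevant information keeps its trace strong relative to distractors, which is one plausible reading of why self-mumbling helps during multi-step, multitask settings.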

“Our combined system is particularly exciting because it can work with sparse data instead of the extensive data sets usually required to train such models for generalization,” Dr. Queißer emphasized. “It provides a complementary, lightweight alternative.” This is a crucial advantage, as acquiring large, labeled datasets for AI training can be expensive and time-consuming.

Towards More Robust and Adaptive AI

The team’s future research aims to introduce greater complexity and realism into the learning environments. “In the real world, we’re making decisions and solving problems in complex, noisy, dynamic environments,” Dr. Queißer stated. “To better mirror human developmental learning, we need to account for these external factors.”

This research is part of a broader effort to understand the neural basis of human learning.

“By exploring phenomena like inner speech, and understanding the mechanisms of such processes, we gain fundamental new insights into human biology and behavior,” Dr. Queißer concluded. “We can also apply this knowledge, for example in developing household or agricultural robots which can function in our complex, dynamic worlds.”

More information:
Jeffrey Frederic Queißer et al., "Working Memory and Self-Directed Inner Speech Enhance Multitask Generalization in Active Inference," Neural Computation (2025). DOI: 10.1162/neco.a.36

Provided by
Okinawa Institute of Science and Technology
