2026-01-28 Okinawa Institute of Science and Technology Graduate University (OIST)

© 瀬良垣香織/OIST
<Related information>
- https://www.oist.jp/ja/news-center/news/2026/1/28/ai-learns-better-when-it-talks-itself
- https://direct.mit.edu/neco/article-abstract/38/1/28/133750/Working-Memory-and-Self-Directed-Inner-Speech
Working Memory and Self-Directed Inner Speech Enhance Multitask Generalization in Active Inference
Jeffrey Frederic Queißer, Jun Tani
Neural Computation, published December 22, 2025
DOI: https://doi.org/10.1162/NECO.a.36
Abstract
This simulation study shows how a set of working memory tasks can be acquired simultaneously through interaction between a stacked recurrent neural network (RNN) and multiple working memories. In these tasks, temporal patterns are provided, followed by linguistically specified task goals. Training is performed in a supervised manner by minimizing the free energy, and goal-directed tasks are performed using the active inference (AIF) framework. Our simulation results show that the best task performance is obtained when two working memory modules are used instead of one or none and when self-directed inner speech is incorporated during task execution. Detailed analysis indicates that a temporal hierarchy develops in the stacked RNN module under these optimal conditions. We argue that the model’s capacity for generalization across novel task configurations is supported by the structured interplay between working memory and the generation of self-directed language outputs during task execution. This interplay promotes internal representations that reflect task structure, which in turn support generalization by enabling a functional separation between content encoding and control dynamics within the memory architecture.
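The architecture the abstract describes — a stacked RNN with multiple timescales coupled to external working-memory modules — can be illustrated with a minimal sketch. The code below is a hypothetical toy, not the authors' model: layer sizes, time constants, and the memory read/write gating are illustrative assumptions, and the weights are untrained (in the paper, training would minimize a free-energy objective under the active inference framework).

```python
import numpy as np

rng = np.random.default_rng(0)

IN, FAST, SLOW, MEM = 4, 16, 8, 8   # input dim, fast/slow layer sizes, memory size

# Random, untrained weights (illustrative only)
W_in   = rng.normal(0, 0.1, (FAST, IN))
W_fast = rng.normal(0, 0.1, (FAST, FAST))
W_up   = rng.normal(0, 0.1, (SLOW, FAST))
W_slow = rng.normal(0, 0.1, (SLOW, SLOW))
W_rd   = rng.normal(0, 0.1, (FAST, 2 * MEM))   # read from both memory modules
W_wr1  = rng.normal(0, 0.1, (MEM, SLOW))       # write projections to memory 1
W_wr2  = rng.normal(0, 0.1, (MEM, SLOW))       # write projections to memory 2

def step(x, h_fast, h_slow, m1, m2, tau_slow=4.0):
    """One time step. The fast layer integrates input plus a readout from
    both working memories; the slow layer leakily integrates the fast layer
    on a longer timescale (the kind of multi-timescale stacking in which a
    temporal hierarchy can develop)."""
    read = W_rd @ np.concatenate([m1, m2])
    h_fast = np.tanh(W_in @ x + W_fast @ h_fast + read)
    # leaky integration: the slow layer updates with time constant tau_slow
    h_slow = (1 - 1 / tau_slow) * h_slow + (1 / tau_slow) * np.tanh(
        W_up @ h_fast + W_slow @ h_slow)
    # memory contents are written from the slow layer, loosely separating
    # content encoding (memories) from control dynamics (the stacked RNN)
    m1 = np.tanh(W_wr1 @ h_slow)
    m2 = np.tanh(W_wr2 @ h_slow)
    return h_fast, h_slow, m1, m2

# Roll the model over a short random temporal pattern
h_fast, h_slow = np.zeros(FAST), np.zeros(SLOW)
m1, m2 = np.zeros(MEM), np.zeros(MEM)
for t in range(10):
    x = rng.normal(size=IN)
    h_fast, h_slow, m1, m2 = step(x, h_fast, h_slow, m1, m2)

print(h_fast.shape, h_slow.shape, m1.shape)  # (16,) (8,) (8,)
```

In the toy above, the slow layer's time constant is what produces slower-changing states at higher levels; the paper's analysis attributes the best performance to exactly this kind of emergent temporal hierarchy combined with two memory modules and self-directed linguistic output during execution.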


