2025-06-26 Yale University
<Related information>
- https://news.yale.edu/2025/06/26/attention-scan-how-our-minds-shift-focus-dynamic-settings
- https://psycnet.apa.org/doiLanding?doi=10.1037%2Frev0000572
Adaptive computation as a new mechanism of dynamic human attention.
Belledonne, Mario; Butkus, Eivinas; Scholl, Brian J.; Yildirim, Ilker
Psychological Review, Published: 2025
DOI: https://psycnet.apa.org/doi/10.1037/rev0000572
Abstract
A key role for attention is to continually focus visual processing to satisfy our goals. How does this work in computational terms? Here we introduce adaptive computation—a new computational mechanism of human attention that bridges the momentary application of perceptual computations with their impact on decision outcomes. Adaptive computation is a dynamic algorithm that rations perceptual computations across objects on-the-fly, enabled by a novel and general formulation of task relevance. We evaluate adaptive computation in a case study of multiple object tracking (MOT)—a paradigmatic example of selection as a dynamic process, where observers track a set of target objects moving amid visually identical distractors. Adaptive computation explains the attentional dynamics of object selection with unprecedented depth. It not only recapitulates several classic features of MOT (e.g., trial-level tracking accuracy and localization error of targets), but also captures properties that have not previously been measured or modeled—including both the subsecond patterns of attentional deployment between objects, and the resulting sense of subjective effort. Critically, this approach captures such data within a framework that is in-principle domain-general, and, unlike past models, without using any MOT-specific heuristic components. Beyond this case study, we also look to the future, discussing how adaptive computation may apply more generally, providing a new type of mechanistic model for the dynamic operation of many forms of visual attention. (PsycInfo Database Record (c) 2025 APA, all rights reserved)
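The abstract describes adaptive computation as a dynamic algorithm that rations perceptual computations across objects on the fly, driven by a general notion of task relevance. As a rough, hypothetical sketch (not the paper's actual model — the relevance score, refinement step, and per-frame budget below are all illustrative assumptions), the loop spends a fixed compute budget one unit at a time on whichever tracked object is currently most task-relevant, re-scoring after every unit:

```python
import random
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """One object in a multiple-object-tracking (MOT) display."""
    true_pos: float      # latent position (drifts each frame)
    est_pos: float       # observer's current estimate
    uncertainty: float   # grows over time unless refined
    is_target: bool      # targets matter; distractors mostly don't

def task_relevance(obj: TrackedObject) -> float:
    """Hypothetical relevance score: targets whose estimates have
    degraded the most demand the most computation."""
    weight = 1.0 if obj.is_target else 0.1
    return weight * obj.uncertainty

def refine(obj: TrackedObject) -> None:
    """One unit of perceptual computation: pull the estimate toward
    a noisy observation and shrink the uncertainty."""
    observation = obj.true_pos + random.gauss(0.0, 0.2)
    obj.est_pos += 0.5 * (observation - obj.est_pos)
    obj.uncertainty *= 0.6

def adaptive_step(objects: list[TrackedObject], budget: int) -> None:
    """Ration `budget` computation units on the fly: after each unit
    is spent, re-score all objects and give the next unit to the
    currently most relevant one."""
    for _ in range(budget):
        refine(max(objects, key=task_relevance))

def simulate(n_targets=4, n_distractors=4, n_frames=50, budget=6) -> float:
    objects = [
        TrackedObject(true_pos=random.uniform(0.0, 10.0), est_pos=5.0,
                      uncertainty=1.0, is_target=(i < n_targets))
        for i in range(n_targets + n_distractors)
    ]
    for _ in range(n_frames):
        for obj in objects:                 # world dynamics
            obj.true_pos += random.gauss(0.0, 0.1)
            obj.uncertainty += 0.2
        adaptive_step(objects, budget)      # attentional dynamics
    targets = [o for o in objects if o.is_target]
    return sum(abs(o.true_pos - o.est_pos) for o in targets) / len(targets)

if __name__ == "__main__":
    print(f"mean target localization error: {simulate():.3f}")
```

The property this toy loop mirrors is that allocation is re-decided after every computation, so processing can shift between objects within a single frame rather than being fixed in advance — the kind of subsecond deployment pattern the paper reports measuring and modeling.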