Statistically learned visual representations support visual working memory
Date
2021
Publisher
University of Delaware
Abstract
Visual statistical learning (VSL) refers to the human ability to unintentionally extract statistical information from our environment. VSL has been proposed to have wide-reaching influence on many aspects of cognition. However, direct evidence that statistically learned information is used by other cognitive processes is lacking. While most prior VSL research has focused on factors that influence the learning itself, less is known about how statistically learned information transfers across task contexts, despite claims that depend on VSL generalizing widely. In this dissertation, motivated by commonalities of “chunking” concepts used in the VSL and visual working memory (VWM) literatures, I seek evidence that memory representations formed by VSL can support subsequent VWM performance by employing the same regularities across tasks. In a series of experiments, I examined the extent to which representations formed during a VSL training task generalize to a VWM test context. I established that such transfer is possible, but that its likelihood depends strongly on the similarity between training and test contexts. The limits of this transfer suggest that previous measures may overstate the generalizability of VSL. Nevertheless, the ability of VSL to support VWM demonstrates one way in which incidental learning can support cognition.
Keywords
Chunking, Transfer, Visual statistical learning, Visual working memory