Reinstating Verbal Memories With Virtual Contexts: Myth Or Reality?

PLOS ONE (2019)

Abstract
When learning new information, contextual information about the encoding situation is stored in addition to the focal memory content. Later, these pieces of extra information can help retrieve the learned content, as demonstrated by experiments in which contextual cues from an encoding situation facilitate remembering and improve memory performance when reinstated during retrieval. This context-dependent memory effect has been investigated over the course of several decades and has been demonstrated with many different types of contexts. Based on this, the widely held belief is that context-dependent memory is a strong and robust effect, with transferable substance for everyday learning and potential clinical applications. Here we report the results of a multi-study design investigating the influence of reinstated visual contexts on memory performance. Data from 120 participants were included in three studies comprising a variety of visual cues. We show convincingly that even rich, salient, and fully surrounding visual contexts provided by virtual reality are not sufficient to induce context-dependency effects in a free recall memory task. We also investigated contextual modulation of oscillatory brain activity in order to test the effect of reinstated neural contexts, which failed to evoke a robust effect when re-tested in an internal conceptual replication study. Moreover, a Bayesian sequential statistical analysis revealed moderate to strong evidence against the hypothesis that reinstatement of visual contexts benefits free recall memory tasks, indicating that the effects are small and may not be suitable for transfer into everyday learning.