Intervention studies are time-consuming and have high dropout rates, so it can take years to collect data from a large enough sample. In 2019, we submitted what we thought was a tour de force to Nature Human Behaviour, with data from over 280 individuals who had completed a working memory training study. Each participant completed 13 experimental sessions in the lab, totalling over 3,500 hours of participant time (not counting the countless hours of development and research time needed to design and administer these studies). Two independent datasets replicated the finding that “far transfer” of working-memory training to a measure of fluid intelligence is mediated by whether “near transfer” to an untrained working memory task was observed. We were excited by this submission, as it had the potential to reconcile substantial controversies in the field of brain training with a simple explanation: if one didn’t learn the trained task, transfer to untrained tasks is unlikely to occur.
Upon receiving reviews, we were pleased to have the opportunity to revise and resubmit; however, we were asked to conduct yet another replication of the finding, this time pre-registered, and with an active control group. While we agreed that this would be a valuable addition to the paper, the request coincided with the start of the COVID-19 pandemic. So, aside from the years of work required to conduct another study involving thousands of hours of participant testing and training, we had to find a way to conduct our research with participants who were confined to their own homes. For a large-scale three-week intervention study, this was quite a technical and logistical challenge, to say the least. The process proved to be a great learning experience1, allowing us to conduct research remotely to the extent that we are now struggling to get participants (and research assistants) to return to the lab for in-person research. The journal editors were very accommodating and approved extension requests, so that almost two years later, and with over 250 new participants, our findings replicated and the paper was resubmitted. Five years after initial data collection, the article is now published, and the disruptions caused by the COVID-19 pandemic appear to be subsiding.
Now that you know the often-untold story of the journey from data collection to publication, let's take a more detailed look at what the article is about.
We alluded above to the turbulent path of brain training over the past 20 years: it holds out the possibility of improving core cognitive processes such as inhibitory control or working memory, but it has also generated controversy about its effectiveness. Training-induced changes in cognitive processing are thought to be driven by neuroplasticity, the brain’s ability to adapt in function and structure in response to experience2. The effectiveness of brain training is measured by examining how performance changes on the trained task and on tasks that were not part of the training program. In other words, researchers look for evidence of transfer beyond the trained domain.
If effective, brain training could have a tremendous positive impact on society, for example, for patient populations who have temporary or permanent deficits in cognitive function, and to mitigate or delay age-related cognitive decline. You might guess that there's a "but" coming now. You are right: promising results have been reported, but inconsistent findings across studies have led to heated debates among scientists. While meta-analyses show evidence of small to moderate positive effects on tasks similar to those used during training (so-called near transfer)3,4, evidence for improvements in tasks substantially different from those trained (so-called far transfer), or even transfer to daily activities, is sparse and not consistently observed5. The controversy is fueled by a multi-million-dollar industry that capitalizes on people's need for mentally stimulating activities that could "sharpen" their minds.
We have argued that one of the reasons for these contradictory findings is that individuals likely respond differently to the same training intervention and therefore exhibit different transfer effects6,7. As pointed out by our group in Scientific American and by others8, these individual differences cannot be accounted for by examining average group effects. Although there are many important factors that might be considered, we first asked ourselves the simple question of whether the extent of improvement in near transfer might mediate far transfer.
Our intervention was designed to train working memory, the ability to mentally store and process information. We collected data in our laboratories at UC Irvine and UC Riverside.
The data were analyzed using a mediation model. Because this model was applied to experimental data (random assignment to a training or control group), we were able to make theoretical predictions about the causal mechanisms of transfer to untrained tasks. To assess near transfer, we used an N-back task that consisted of a different set of objects from those presented during training. If a person cannot show improvement on an untrained version of the trained task, they likely relied on highly specific strategies during training that are not useful outside the training context. In that case, we cannot expect transfer to other task types and cognitive domains. Although the premise is simple, the extent of transfer to untrained objects cannot be estimated in most studies because transfer to untrained tasks is often not reported, so information about what participants learned is missing.
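To give a sense of what a mediation analysis involves, here is a minimal sketch using simulated data and the classic product-of-coefficients approach (regress the mediator on group assignment, then the outcome on group plus mediator). The variable names, effect sizes, and simulation are entirely hypothetical illustrations, not the paper's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical simulated data: group (0 = control, 1 = training),
# near-transfer gain as the mediator, far-transfer gain as the outcome.
group = rng.integers(0, 2, n)
near = 0.5 * group + rng.normal(0, 1, n)              # path a: training -> near transfer
far = 0.4 * near + 0.0 * group + rng.normal(0, 1, n)  # path b; no direct effect here

def ols(predictors, y):
    """Least-squares coefficients, with an intercept prepended."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols([group], near)[1]            # effect of training on the mediator
coefs = ols([group, near], far)
c_prime, b = coefs[1], coefs[2]      # direct effect and mediator effect
indirect = a * b                     # mediated (indirect) effect

print(f"a = {a:.2f}, b = {b:.2f}, indirect = {indirect:.2f}, direct = {c_prime:.2f}")
```

In a real analysis the indirect effect would also need a significance test (e.g. a bootstrap confidence interval), which this sketch omits.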
In two cohorts that were part of the Retrospective study, we found that the extent to which individuals improved on an untrained N-back task (near transfer) mediated the extent to which they improved on a measure of abstract reasoning (far transfer). We also used an inhibitory control task as a control measure in our analyses.
The results of the Replication study confirmed the original finding that near transfer mediates far transfer on a measure of abstract reasoning. Nonetheless, we observed inconsistent intervention effects on far transfer, meaning that even when the mediator was taken into account, no reliable improvement in abstract reasoning was observed in some cohorts. While this may seem counterintuitive, mediated effects can be statistically significant even in the absence of overall intervention effects9. The results also showed that near transfer mediated transfer to a working memory composite, which showed a small but significant intervention effect.
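The statistical point that an indirect effect can be significant while the total intervention effect is not can be illustrated with a toy simulation (all numbers hypothetical): when a direct path opposes the mediated path, the two can cancel in the total effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Hypothetical data: the direct and indirect effects point in opposite directions.
group = rng.integers(0, 2, n)
near = 0.5 * group + rng.normal(0, 1, n)                # path a = 0.5
far = 0.5 * near - 0.25 * group + rng.normal(0, 1, n)   # path b = 0.5, direct c' = -0.25

def ols(predictors, y):
    """Least-squares coefficients, with an intercept prepended."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

total = ols([group], far)[1]          # total effect c = c' + a*b, here ~0 (paths cancel)
a = ols([group], near)[1]
b = ols([group, near], far)[2]
indirect = a * b                      # mediated effect, here ~0.25

print(f"total = {total:.2f}, indirect = {indirect:.2f}")
```

The total effect hovers near zero even though the mediated pathway is clearly present, which is the pattern O'Rourke and MacKinnon describe.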
Overall, our findings underscore the need to better understand how, for whom, and why transfer effects occur. Understanding these factors will help maximize the effectiveness of brain training interventions, which could inform the development of personalized training approaches. To this end, we have launched a new study that aims to recruit 30,000 volunteers for memory training. Click here to sign up or to learn more about this study.
- Collins, C. L., Pina, A., Carrillo, A., Ghil, E., Smith-Peirce, R. N., Gomez, M., Okolo, P., Chen, Y., Pahor, A., Jaeggi, S. M. & Seitz, A. R. Video-Based Remote Administration of Cognitive Assessments and Interventions: a Comparison with In-Lab Administration. J. Cogn. Enhanc. 1–11 (2022). doi:10.1007/s41465-022-00240-z
- Nguyen, L., Murphy, K. & Andrews, G. Cognitive and neural plasticity in old age: A systematic review of evidence from executive functions cognitive training. Ageing Res. Rev. 53, 100912 (2019).
- Karbach, J. & Verhaeghen, P. Making working memory work: a meta-analysis of executive-control and working memory training in older adults. Psychol. Sci. 25, 2027–2037 (2014).
- Melby-Lervåg, M., Redick, T. S. & Hulme, C. Working Memory Training Does Not Improve Performance on Measures of Intelligence or Other Measures of “Far Transfer”: Evidence From a Meta-Analytic Review. Perspect. Psychol. Sci. 11, 512–534 (2016).
- Soveri, A., Antfolk, J., Karlsson, L., Salo, B. & Laine, M. Working memory training revisited: A multi-level meta-analysis of n-back training studies. Psychon. Bull. Rev. 24, 1077–1096 (2017).
- Jaeggi, S. M., Buschkuehl, M., Jonides, J. & Shah, P. Short- and long-term benefits of cognitive training. Proc. Natl. Acad. Sci. USA 108, 10081–10086 (2011).
- Jaeggi, S. M., Buschkuehl, M., Shah, P. & Jonides, J. The role of individual differences in cognitive training and transfer. Mem. Cognit. 42, 464–480 (2014).
- Smid, C. R., Karbach, J. & Steinbeis, N. Toward a science of effective cognitive training. Curr. Dir. Psychol. Sci. (2020). doi:10.1177/0963721420951599
- O’Rourke, H. P. & MacKinnon, D. P. Reasons for testing mediation in the absence of an intervention effect: A research imperative in prevention and intervention research. J. Stud. Alcohol Drugs 79, 171–181 (2018).
Cover art created by Yvette Chen.