Recommendation

Can retrieval practice prevent the negative impact of acute stress on memory performance?

Based on reviews by Chris Hartgerink (they/them) and Adrien Fillon
A recommendation of:

Does retrieval practice protect memory against stress? A meta-analysis [Stage 1 Registered Report]

Submission: posted 16 February 2023
Recommendation: posted 03 April 2024, validated 10 April 2024
Cite this recommendation as:
Evans, T. (2024) Can retrieval practice prevent the negative impact of acute stress on memory performance? Peer Community in Registered Reports. https://rr.peercommunityin.org/articles/rec?id=406

Recommendation

There are a number of broad assumptions about memory which have penetrated societal understanding and mostly reflect supporting academic evidence, e.g. that acute stress can compromise memory performance (Shields et al., 2017) and that practicing recall of critical information can help retain that knowledge (Moreira et al., 2019). The evidence base is less consistent on whether retrieval practice can protect against the negative effects of acute stress on memory, even though it is highly important for educators to know whether this specific strategy for supporting memorisation is especially effective under stressful conditions. A rigorous review of this mixed evidence base could provide the basis for developments in memory theory and research practice, with potential for direct educational applications.
 
Meta-analyses can play a critical role in furthering our understanding of complex cognitive mechanisms where the evidence base includes a wide range of methods, factors and effect size estimates. Furthermore, there is a lack of rigorous meta-analyses that prioritise open and reproducible processes (Topor et al., 2022) which help role-model good practice. In the current Registered Report, Mihaylova et al. (2024) have proposed a rigorous meta-analysis to systematically review and synthesise the evidence on the effects of retrieval practice for memory performance under acute stress. The work looks to be especially valuable for a) informing future research directions through a structured risk of bias evaluation, and b) generating theoretical developments through a range of confirmatory moderators (including stressor types, memory strategies, time of delay and task type). The findings of the planned analyses are expected to be of immediate interest to educational and occupational domains where memory recall is a priority.
 
The Stage 1 manuscript was evaluated over two rounds of in-depth review. Based on detailed responses to the reviewers' comments, the recommender judged that the manuscript met the Stage 1 criteria and therefore awarded in-principle acceptance (IPA).
 
URL to the preregistered Stage 1 protocol: https://osf.io/pkrzb
 
Level of bias control achieved: Level 3. At least some data/evidence that will be used to answer the research question has been previously accessed by the authors (e.g. downloaded or otherwise received), but the authors certify that they have not yet observed ANY part of the data/evidence.
 
List of eligible PCI RR-friendly journals:
 
 
References
 
1. Mihaylova, M., Kliegel, M., & Rothen, N. (2024). Does retrieval practice protect memory against stress? A meta-analysis. In principle acceptance of Version 3 by Peer Community in Registered Reports. https://osf.io/pkrzb
 
2. Moreira, B. F. T., Pinto, T. S. S., Starling, D. S. V., & Jaeger, A. (2019). Retrieval practice in classroom settings: A review of applied research. In Frontiers in Education (Vol. 4, p. 5). Frontiers Media SA. https://doi.org/10.3389/feduc.2019.00005 
 
3. Shields, G. S., Sazma, M. A., McCullough, A. M., & Yonelinas, A. P. (2017). The effects of acute stress on episodic memory: A meta-analysis and integrative review. Psychological Bulletin, 143, 636–675. https://doi.org/10.1037/bul0000100 
 
4. Topor, M. K., Pickering, J. S., Mendes, A. B., Bishop, D., Büttner, F., Elsherif, M. M., ... & Westwood, S. (2022). An integrative framework for planning and conducting Non-Intervention, Reproducible, and Open Systematic Reviews (NIRO-SR). Meta-Psychology. https://osf.io/preprints/metaarxiv/8gu5z
Conflict of interest:
The recommender in charge of the evaluation of the article and the reviewers declared that they have no conflict of interest (as defined in the code of conduct of PCI) with the authors or with the content of the article.

Evaluation round #2

DOI or URL of the report: https://osf.io/6jhbr

Version of the report: 1

Author's Reply, 24 Mar 2024


Dear Thomas Evans, 

Thank you very much for your thoughtful consideration of our manuscript and for providing such constructive feedback. We sincerely appreciate the time and effort you and the other reviewers dedicated to evaluating our work during both rounds of review.

We are delighted to hear that the revisions were effective in addressing the concerns raised by the reviewers and that you will be recommending the manuscript. We thank you for the opportunity to clear up the minor edits suggested by the second reviewer. We have also taken this opportunity to reflect further on the issue of sensitivity to heterogeneity raised by the first reviewer. We agree that preregistration of our approach at this stage is valuable, and we have amended the manuscript accordingly. Please find our responses to each point in the attached document; changes in the manuscript are marked in teal.

Once again, we sincerely appreciate your support and encouragement. We are thrilled by the prospect of your recommendation for our meta-analysis, and we look forward to the next steps! 

Best regards, 

Mariela Mihaylova for all authors 

Decision by Thomas Rhys Evans, posted 20 Mar 2024, validated 20 Mar 2024

Dear all,

Thank you so much for providing a revised manuscript and for paying such close attention to the feedback the excellent reviewers provided. As you may gather from the reviewer feedback, the revisions were effective in mitigating nearly all of the concerns raised, and we are now at a stage where I can make an executive decision on this manuscript. I would like to 'recommend' this manuscript, but I would first like to offer you the opportunity to clear up the minor edits suggested by the second reviewer, and to reflect further on the issue of sensitivity to heterogeneity. You may choose to address the latter in the Stage 2 manuscript if you believe that is the best course of action; however, I encourage you to consider beforehand whether preregistering your approach at this stage could be of value. I leave it with you to decide the best course of action; I will not be sending your revisions out for further review, and I will thoroughly look forward to writing my 'recommendation' upon submission.

Take care,

Tom

Dr Thomas Rhys Evans

Reviewed by Chris Hartgerink, 18 Mar 2024

Thank you for the invitation to re-review “Does retrieval practice protect memory against stress? A meta-analysis [Stage 1 Registered Report]”. My express philosophy for re-reviews is that I look at my comments from the previous review and see whether they are addressed (as is an identified best practice; https://doi.org/10.1101/2022.12.20.521261).

Overall, the authors' responses are satisfactory and leave little more to be desired. It is clear they took the comments seriously and made good faith efforts in revising the stage 1 report. 

My only remaining comment, for a future version, is regarding the publication bias tests. As indicated in my original review, the previously proposed methods are also sensitive to heterogeneity:

> "There seems to be an oversight here, because the included publication bias tests are also sensitive to heterogeneity. I include references below from which that follows"

The authors did not substantively alter their methods, from what I gather, as the original publication bias section remains largely unchanged, save a few additions. I would be remiss not to point this out in an otherwise fantastic revision. I strongly encourage the authors to revisit this as previously indicated; at the very least, discuss this in detail in the limitations if there is heterogeneity in the actual data.

All in all, I am grateful to see all points (minus 1) addressed and can recommend the manuscript to move forward. Thank you!

Reviewed by Adrien Fillon, 23 Feb 2024

This is the second round of review regarding the meta-analysis “Does retrieval practice protect memory against stress?”

I will only provide some proofreading advice, since the authors resolved all of the comments I raised.

Page 9, Retrieval Practice main effect: the 'm' and 'e' of 'main effect' could be capitalized.

Page 9, first section: all sentences begin with “this” (as does the last sentence of page 8). Reading could be improved by rephrasing.

Page 14: 

The date last searched was _____. At Stage 2, we reran the searches at least twice to ensure all literature was up-to-date. The date last searched was ___. The outcome was a total of YY prospective articles. After deletion of doubles, we had a total of XX articles (Figure 1).

I don’t understand this sentence. The search should be conducted between Stage 1 and Stage 2, so why explicitly say that you will conduct the search at Stage 2?

Also, I think it is better to say “duplicates” than “doubles”. Two consecutive sentences also begin with “After”; you could find a synonym.

“This enabled us to ensure full coverage”: you can delete the “enabled us to”.

I would suggest rephrasing “study leader” as “main contributor” in the text.

Page 24: “we first illustrated” seems odd. You “display a funnel plot to check for the existence of publication bias”. “Illustrated a publication bias” would rather mean that you drew a picture of the bias.

Besides these minor fixes, I don’t see any problems and would be happy to see the results of the meta-analysis. I wish the authors the best in this process.


Evaluation round #1

DOI or URL of the report: https://osf.io/e4fhj

Version of the report: 1

Author's Reply, 10 Feb 2024

Dear Thomas Evans,  

We thank you very much for the positive feedback on our meta-analysis and for the opportunity to revise and resubmit our manuscript. We would also like to immensely thank the reviewers and yourself for the time and effort you have all put into reviewing the manuscript at Stage 1. The comments received have been incredibly helpful and insightful and, we believe, have greatly improved the manuscript.  

We are very happy to resubmit our revised registered report today, February 10, 2024. Please find the revised draft attached on the PCI RR platform, along with our detailed responses to the reviewer feedback. We hope all changes have been implemented to your and the reviewers’ satisfaction. We remain at your disposal for any further clarification or changes.

We look forward to hearing back from you soon with a decision and next steps! 

Best wishes, 
Mariela Mihaylova (for all authors) 

Decision by Thomas Rhys Evans, posted 14 Jun 2023, validated 14 Jun 2023

Dear authors,

Thanks for submitting your work to PCI RR for review; I have sincerely enjoyed reading your work and reflecting upon the feedback provided by the kind reviewers. Firstly, I’d like to acknowledge that this is not my primary area of expertise, but acting as a recommender for your work I hope I can provide methodological insight, support and guidance throughout the process. Given that it is not my field, I thought the subject content was clearly presented. Secondly, I’d like to apologize for the delay in getting back to you with this decision: we had a reviewer who started a review but didn’t complete it, and in waiting for a full review there was a slip in timing. If they do get back to me I’ll forward you the feedback, but in the meantime, we will get stuck in!

I believe the core focus of the work is clear and has been communicated clearly for the non-expert. The proposed study design is clearly established with some good use of open tools and a high level of detail in most areas. The potential impact of the work is strong so I see real value in a meta-analysis in this area. Whilst the manuscript has been well-prepared, there are a number of areas of repetition, and sometimes reading flow is interrupted by frequent use of acronyms. 

The reviewers have done a good job of systematically assessing the work and have provided many comments, including numerous useful suggestions which could strengthen the work, some of which I too noted. One additional suggestion I would add is to remove any ‘exploratory analyses’ (p27) to make the distinction between preregistered and non-preregistered analyses clearer. As such, I would like to offer you the opportunity to revise your manuscript in line with the reviewer feedback to further sharpen the clarity and contribution of the work proposed. Please provide a response-to-reviewers document where you respond to each comment, and please make any subsequent changes to the manuscript through tracked changes to help speed up the next stage of the process.

I do hope this feedback is of benefit to you, and I sincerely look forward to reading the revised version of your proposal.

Stay safe and take care,

Dr Thomas Rhys Evans

Reviewed by Chris Hartgerink, 06 Mar 2023

Thank you for the kind invitation to review “Does retrieval practice protect memory against stress? A meta-analysis [Stage 1 Registered Report]”. It is my first time reviewing for PCI RR, so please let me know if this fits your expectations.

My philosophy for reviews is:

  • To help build up the authors, not to tear them down.
  • Provide honest feedback that the authors can decide what to do with.

Overall, I think this meta-analysis is set up in a meticulous way, and provides a great basis for a study with important findings regardless of outcomes. My review highlights areas that went well, and those that could still improve.

1. What went well

  1. I am not a memory expert, and the introduction was extremely well written. It really helped set the stage for me - thank you! 🙌
  2. Your predictions are clearly articulated and solid. Makes me very happy to see good predictions 😊
  3. The commitment to documenting all conversions and coding decisions is really helpful. When I did this it taught me a lot and served as a helpful reference, so I was pleased to see it here too.
  4. I am very glad to see risk of bias judgments included - thank you!

2. What could have gone better

  1. The OSF project has not been registered and can be amended at any time (as of right now). Preregistration is not complete without explicit registration (whether it has been assigned a DOI is a decent heuristic here). Could you explain what I missed here?
  2. Database search
    1. You mention you use MeSH terms, but these are for medical subject headings. I do not understand which would apply here. It would also help to explain to the reader what these are as these are not necessarily part of the common language. Could you maybe add a bit more of an explanation here?
    2. Could you explicitly mention the databases you use for what you call the grey literature search?
    3. Table 1 indicates “Variation of the above” for JSTOR and OSF preprints. If at all possible, could you be explicit about the search terms at this stage (or say why they can’t be explicated as such)?
  3. Results
    1. It is confusing that results are included already. I am going to assume for now that this is just a placeholder, but can the authors confirm this? The manuscript does not say whether these are simulated data so it is confusing while reviewing.
    2. If I understand correctly, in the section “Statistical Power”, you seem to be hinting at post-hoc power, which is not a fruitful measure. Could you expand on whether this understanding is correct? Post-hoc power means that you’ll have high power if your meta-analysis finds an effect, and low power if there is none. The only statistical power you can calculate is a priori.
    3. Moderator analyses
      1. Stressor type - could you please add the statistics of the moderation effect results into the text (or where to find that information)?
      2. Same issue for the other learning strategies - mentioned results are best to be included in either the text directly or in the supporting materials (as you do so well for the rest!).
    4. You mention several tests for publication bias, and subsequently eliminate others because they wouldn’t work well in heterogeneous settings (e.g., p-uniform). There seems to be an oversight here, because the included publication bias tests are also sensitive to heterogeneity. I include references below from which that follows, primarily because these methods test/correct for small-study effects which can also be caused by a relation between heterogeneity and the size of the studies.
      1. Egger’s test: Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. British Medical Journal, 315(7109), 629–634.https://doi.org/10.1136/bmj.315.7109.629
      2. Trim + Fill: Terrin, N., Schmid, C. H., Lau, J., & Olkin, I. (2003). Adjusting for publication bias in the presence of heterogeneity. Statistics in Medicine, 22(13), 2113–2126. https://doi.org/10.1002/sim.1461
      3. PET-PEESE: Stanley, T. D. (2017). Limitations of PET-PEESE and other meta-analysis methods. Social Psychological and Personality Science, 8(5), 581–591. https://doi.org/10.1177/1948550617693062
    5. A very minor point: It would help if you could label the results in relation to the predictions you made, to clarify which result relates to which (H1, H2, H3).

3. What to watch out for

  1. You use the PsycNET, PubMed (Central?) and JSTOR databases. Did you look into any other databases? From my own experience with meta-analyses in psychology, you might risk losing coverage by restricting to these databases only. I understand that many databases are paywalled (e.g., Scopus, Web of Science) and it may not always be a willing choice, so any additional information on the choice would be helpful.
  2. You call preprints grey literature; according to my understanding grey literature are e.g., blogs. Preprints are still part of the scholarly record. This is of course an ambiguous concept so it is up for the authors to decide what to do.
  3. You choose random effects for the main effects, but fixed effects for the moderator analysis. Could you expand as to why? This seems rather arbitrary considering you just argued for including a distribution of effects a few paragraphs before (specifically referring to p25, first paragraph).

Summary

In my review, I found this manuscript to be succinct, to the point, and clear. I think there are no fundamental flaws or things that cannot be resolved by the authors. Thank you for doing this important work!

Reviewed by Adrien Fillon, 09 May 2023
