Chris Hartgerink (they/them)

  • CEO, Liberate Science GmbH, Berlin, Germany
  • Social sciences

Recommendations:  0

Reviews:  2

Website https://chjh.nl
Areas of expertise
Meta-research, publishing, statistics, meta-analysis, library and information sciences

Reviews:  2

11 Apr 2024
STAGE 1

Does retrieval practice protect memory against stress? A meta-analysis [Stage 1 Registered Report]

Can retrieval practice prevent the negative impact of acute stress on memory performance?

Recommended based on reviews by Chris Hartgerink (they/them) and Adrien Fillon
A number of broad assumptions about memory have penetrated societal understanding and mostly reflect supporting academic evidence, e.g. that acute stress can compromise memory performance (Shields et al., 2017) and that practising recall of critical information can help retain that knowledge (Moreira et al., 2019). The evidence base is less consistent on whether retrieval practice can protect against the negative effects of acute stress on memory, even though it matters greatly to educators whether this specific memorisation strategy can be shown to be especially effective under stressful conditions. A rigorous review of this mixed evidence base could provide the basis for developments in memory theory and research practice, with potential for direct educational applications.
 
Meta-analyses can play a critical role in furthering our understanding of complex cognitive mechanisms where the evidence base includes a wide range of methods, factors and effect size estimates. Furthermore, there is a lack of rigorous meta-analyses that prioritise open and reproducible processes (Topor et al., 2022) which help role-model good practice. In the current Registered Report, Mihaylova et al. (2024) have proposed a rigorous meta-analysis to systematically review and synthesise the evidence on the effects of retrieval practice for memory performance under acute stress. The work looks to be especially valuable for a) informing future research directions through a structured risk of bias evaluation, and b) generating theoretical developments through a range of confirmatory moderators (including stressor types, memory strategies, time of delay and task type). The findings of the planned analyses are expected to be of immediate interest to educational and occupational domains where memory recall is a priority.
 
The Stage 1 manuscript was evaluated over two rounds of in-depth review. Based on detailed responses to the reviewers' comments, the recommender judged that the manuscript met the Stage 1 criteria and therefore awarded in-principle acceptance (IPA).
 
URL to the preregistered Stage 1 protocol: https://osf.io/pkrzb
 
Level of bias control achieved: Level 3. At least some data/evidence that will be used to answer the research question has been previously accessed by the authors (e.g. downloaded or otherwise received), but the authors certify that they have not yet observed ANY part of the data/evidence.
 
List of eligible PCI RR-friendly journals:
 
 
References
 
1. Mihaylova, M., Kliegel, M., & Rothen, N. (2024). Does retrieval practice protect memory against stress? A meta-analysis. In principle acceptance of Version 3 by Peer Community in Registered Reports. https://osf.io/pkrzb
 
2. Moreira, B. F. T., Pinto, T. S. S., Starling, D. S. V., & Jaeger, A. (2019). Retrieval practice in classroom settings: A review of applied research. Frontiers in Education, 4, 5. https://doi.org/10.3389/feduc.2019.00005
 
3. Shields, G. S., Sazma, M. A., McCullough, A. M., & Yonelinas, A. P. (2017). The effects of acute stress on episodic memory: A meta-analysis and integrative review. Psychological Bulletin, 143, 636–675. https://doi.org/10.1037/bul0000100 
 
4. Topor, M. K., Pickering, J. S., Mendes, A. B., Bishop, D., Büttner, F., Elsherif, M. M., ... & Westwood, S. (2022). An integrative framework for planning and conducting Non-Intervention, Reproducible, and Open Systematic Reviews (NIRO-SR). Meta-Psychology. https://osf.io/preprints/metaarxiv/8gu5z
11 Sep 2023
STAGE 1

Finding the right words to evaluate research: An empirical appraisal of eLife’s assessment vocabulary

Understanding the validity of standardised language in research evaluation

Recommended based on reviews by Chris Hartgerink (they/them), Veli-Matti Karhulahti, Štěpán Bahník and Ross Mounce
In 2023, the journal eLife ended the practice of making binary accept/reject decisions following peer review, instead sharing peer review reports (for manuscripts that are peer-reviewed) and brief “eLife assessments” representing the consensus opinions of editors and peer reviewers. As part of these assessments, the journal draws language from a "common vocabulary" to linguistically rank the significance of findings and strength of empirical support for the article's conclusions. In particular, the significance of findings is described using an ordinal scale of terms from "landmark" → "fundamental" → "important" → "valuable" → "useful", while the strength of support is ranked across six descending levels from "exceptional" down to "inadequate".
 
In the current study, Hardwicke et al. (2023) question the validity of this taxonomy, noting a range of linguistic ambiguities and counterintuitive characteristics that may undermine the communication of research evaluations to readers. Given the centrality of this common vocabulary to the journal's policy, the authors propose a study to explore whether the language used in the eLife assessments is interpreted as intended by readers. Using a repeated-measures experimental design, they will tackle three aims: first, to understand the extent to which people share similar interpretations of phrases used to describe scientific research; second, to reveal the extent to which people's implicit rankings of such phrases align with one another and with the intended ranking; and third, to test whether phrases used to describe scientific research have overlapping interpretations. The proposed study has the potential to make a useful contribution to metascience, as well as being a valuable source of information for other journals potentially interested in following the novel path taken by eLife.
 
The Stage 1 manuscript was evaluated over one round of in-depth review. Based on detailed responses to the reviewers' comments, the recommender judged that the manuscript met the Stage 1 criteria and therefore awarded in-principle acceptance (IPA).
 
URL to the preregistered Stage 1 protocol: https://osf.io/mkbtp
 
Level of bias control achieved: Level 6. No part of the data or evidence that will be used to answer the research question yet exists and no part will be generated until after IPA.
 
List of eligible PCI RR-friendly journals:
 
References
 
1. Hardwicke, T. E., Schiavone, S., Clarke, B. & Vazire, S. (2023). Finding the right words to evaluate research: An empirical appraisal of eLife’s assessment vocabulary. In principle acceptance of Version 2 by Peer Community in Registered Reports. https://osf.io/mkbtp