
SILVERSTEIN Priya
- Psychological Science Accelerator, Ashland University, Ashland, United States
- Social sciences
Recommendations: 0
Reviews: 2
Website
http://www.priyasilverstein.com
Areas of expertise
psychology, metascience, open science
Reviews: 2
23 Jan 2025
STAGE 1

Mapping methodological variation in experience sampling research from design to data analysis: A systematic review
Methodological Variation in the Experience Sampling Methods: Can We Do ESM Better?
Recommended by Thomas Evans based on reviews by Priya Silverstein and 1 anonymous reviewer
The replication crisis/credibility revolution has driven a vast number of changes to our research environment (Korbmacher et al., 2023), including a much-needed spotlight on issues surrounding measurement (Flake & Fried, 2020). As general understanding and awareness have increased of the 'garden of forking paths' or 'researcher degrees of freedom' (Simmons et al., 2011), that is, the various decisions made during the scientific process that could affect the conclusions drawn from it, so too should our interest grow in meta-research that tells us more about the methodological processes we follow and how discretionary decisions may influence the design, analysis, and reporting of a project.
Peeters et al. (2025) have proposed a systematic literature review of this nature, mapping methodological variation in experience sampling methods (ESM) from the design stage all the way to dissemination. The review begins by mapping how ESM studies vary in design, considering factors such as sample size, number of measurements, and sampling scheme. It also evaluates reporting quality and the rationales provided for methodological decisions, and captures the extent of open science practices adopted. By covering many parts of the research process that are often assumed, unreported, or otherwise unjustified, the proposed work looks set to springboard an important body of work that can tell us more effectively how to design, implement, and report ESM studies.
The Stage 1 submission was reviewed over one round of in-depth review with two reviewers. Based on detailed responses to reviewers’ feedback, the recommender judged that the manuscript met the Stage 1 criteria and therefore awarded in-principle acceptance.
URL to the preregistered Stage 1 protocol: https://osf.io/ztvn3
Level of bias control achieved: Level 1. At least some of the data/evidence that will be used to answer the research question has been accessed and observed by the authors, including key variables, but the authors certify that they have not yet performed any of their preregistered analyses, and in addition they have taken stringent steps to reduce the risk of bias.
List of eligible PCI RR-friendly journals:
- Advances in Methods and Practices in Psychological Science *pending editorial consideration of disciplinary fit
- Peer Community Journal
- PeerJ
- Swiss Psychology Open
References
1. Flake, J. K., & Fried, E. I. (2020). Measurement schmeasurement: Questionable measurement practices and how to avoid them. Advances in Methods and Practices in Psychological Science, 3, 456-465. https://doi.org/10.1177/2515245920952393
2. Korbmacher, M., Azevedo, F., Pennington, C. R., Hartmann, H., Pownall, M., Schmidt, K., ... & Evans, T. (2023). The replication crisis has led to positive structural, procedural, and community changes. Communications Psychology, 1, 3. https://doi.org/10.1038/s44271-023-00003-2
3. Peeters, L., Van Den Noortgate, W., Blanchard, M. A., Eisele, G., Kirtley, O., Artner, R., & Lafit, G. (2025). Mapping Methodological Variation in ESM Research from Design to Data Analysis: A Systematic Review. In principle acceptance of Version 3 by Peer Community in Registered Reports. https://osf.io/ztvn3
4. Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359-1366. https://doi.org/10.1177/0956797611417632
02 Jun 2024
STAGE 1

Mapping Cross-Disciplinary Perspectives on Responsible Conduct of Research: A Delphi Study
Capturing Perspectives on Responsible Research Practice: A Delphi Study
Recommended by Charlotte Pennington and Maanasa Raghavan based on reviews by Moin Syed, Veli-Matti Karhulahti, Thomas Evans, Priya Silverstein and Sean Grant
The responsible conduct of research (RCR) is crucial for the health of the research ecosystem: high-quality research should lead to more credible findings and increase public trust. However, the dimensions and responsibilities that make up RCR differ across disciplines, which can nonetheless learn from one another to ensure rigorous, transparent, and reliable research and to foster a healthier research culture.
Bridging this gap, in their Stage 1 Registered Report, Field and colleagues (2024) outline their plans for a large-scale Delphi study to evaluate academics' perceived levels of importance of the most crucial elements of RCR and how these align and differ across disciplines. First, they plan to assemble a Delphi panel of RCR experts across multiple disciplines, who will evaluate a list of RCR dimensions and suggest any additions. Then, these same panellists will judge each RCR dimension on its importance within their discipline of expertise, with iterative rounds of ratings until stability is reached. In this latter phase, the goal is to probe which items are more broadly appreciated by the sample (i.e., those that are perceived as a universally valuable RCR practice), versus which might be more discipline-specific. The findings will present the median importance ratings and categories of response agreement across the entire panel and between different disciplines. Finally, to contextualise these findings, the team will analyse qualitative findings from open-ended text responses with a simple form of thematic analysis. From this, the team will develop a framework, using the identified RCR dimensions, that reflects the needs of the academic community.
By mapping a broader multidisciplinary perspective on RCR, this research will fill the gap between the two extremes that existing conceptualisations of RCR tend to fall under: high-level frameworks designed to be universally applicable across all disciplines (e.g., the Singapore Statement on Research Integrity) and prescriptive guides tailored to the practical instruction of researchers within a specific discipline or field (e.g., RCR training designed for members of a university department). The hope is that this will stimulate a more nuanced understanding and discussion of cross-disciplinary conceptions of RCR.
Five expert reviewers with field expertise assessed the Stage 1 manuscript over two rounds of in-depth review. Based on detailed and informed responses to the reviewers' comments, the recommenders judged that the manuscript met the Stage 1 criteria and therefore awarded in-principle acceptance (IPA).
URL to the preregistered Stage 1 protocol: https://osf.io/xmnu5
Level of bias control achieved: Level 6. No part of the data or evidence that will be used to answer the research question yet exists and no part will be generated until after IPA.
List of eligible PCI RR-friendly journals:
- Advances in Methods and Practices in Psychological Science
- Collabra: Psychology
- In&Vertebrates
- Meta-Psychology
- Peer Community Journal
- PeerJ
- Studia Psychologica
References
Field, S. M., Thompson, J., van Drimmelen, T., Ferrar, J., Penders, B., de Rijcke, S., & Munafò, M. R. (2024). Mapping Cross-Disciplinary Perspectives on Responsible Conduct of Research: A Delphi Study. In principle acceptance of Version 3 by Peer Community in Registered Reports. https://osf.io/xmnu5