
Understanding how applied researchers address open scholarship, feedback and climate change in their work

By S. Field, based on reviews by Crystal Steltenpohl, Lisa Hof and Jay Patel
A recommendation of:

Open Scholarship and Feedback in Applied Research/Understanding the Role of Climate Change in Applied Research: A Qualitative Registered Report

Submission: posted 28 March 2024
Recommendation: posted 15 October 2024, validated 16 October 2024
Cite this recommendation as:
Field, S. (2024) Understanding how applied researchers address open scholarship, feedback and climate change in their work. Peer Community in Registered Reports. https://rr.peercommunityin.org/PCIRegisteredReports/articles/rec?id=752

Recommendation

This recommendation concerns the plan of two studies that are intended to be conducted simultaneously, using the same data collection approach, and to result in two manuscripts that will be submitted for assessment at Stage 2. The Stage 1 manuscript containing these protocols was submitted via the programmatic track.
 
Protocol 1 concerns “Open Scholarship and Feedback in Applied Research: A Qualitative Registered Report”. With this study, the authors aim to explore how applied researchers integrate feedback processes into their work, in relation to transparency and rigor in particular. They will investigate whether their sample are aware of and use feedback mechanisms from the open science movement, such as registered reports, which makes this study nicely metascientific. Through interviews with 50 applied researchers from various fields, the study will examine current feedback practices. The authors intend to use the findings of this first study to inform recommendations on how open science practices can be incorporated into research workflows.
 
Protocol 2 concerns “Understanding the Role of Climate Change in Applied Research: A Qualitative Registered Report”. This study aims to explore how applied researchers address climate change in their work, including the ways their practices are influenced by and respond to climate challenges. It also addresses how their approaches may evolve, and the authors plan to look into the barriers and opportunities climate change presents in practice. Interviews with 50 applied researchers will be analysed to help understand these dynamics. The authors aim to provide recommendations to help applied researchers and their employers adjust their priorities to align with the urgency of climate action. One reviewer did not comment on this second protocol, as the content was outside of their own research area. Although it would have been ideal to find a reviewer who specializes directly in this area, I could still rely on the other two reviewers and my own knowledge to assess this protocol.
 
General comments: As I mentioned in my initial assessment text, these studies were well planned from the get-go and the protocol nicely articulated those plans. The use of different colour highlighting clearly helped the reviewers target different elements of the protocol and give direct feedback on specific parts. It also helped prevent me from getting lost in all the details! Well done, once again, to the authors for making the distinction between the two studies so clear. I was also pleased at how well they balanced the information between the two protocols – this made it easier to spot any deficiencies in either one.

Finally, I loved that the authors considered reflexivity. One suggestion from me is that the authors might consider providing a collective positionality statement to go with the trainees’ reflexivity statements in the final studies (if this is already in the plan and I missed it, please forgive the oversight!), even if only as part of an appendix. This is because the open science movement and climate change can both be controversial, and because of the nature of the qualitative approach I would like to understand a little of the stance the group takes towards these issues collectively, if the authors think it’s appropriate. I understand that with a big group that might be difficult or impossible, but if it is possible I would like to see it. I would also like to see initials used in the manuscripts to indicate who was responsible for which analysis elements where possible. This allows for accountability and attributes interpretation to the specific individuals involved in the data analysis. Alternatively, contributions can be attributed in a statement at the end of each manuscript, which serves the same purpose and is less awkward in the text. If this won't work for some reason, please motivate that decision.
 
The three reviewers who took the time to go through the reports nevertheless had useful comments, most of which have contributed to strengthening the plan and minimizing problematic bias later on. The authors took these comments seriously, and thoughtfully (cheerfully, even) responded to each. In my estimation, each of the reviewers’ suggestions was satisfied by the authors’ response-to-reviews letter. Other than my earlier comment about the positionality statement, I have no further comments for the Stage 1 protocol, and I wish the authors all the best with running the studies and writing up Stage 2 for each.
 
URL to the preregistered Stage 1 protocol: https://osf.io/jdh32
 
Level of bias control achieved: Level 6. No part of the data or evidence that will be used to answer the research question yet exists and no part will be generated until after in-principle acceptance (IPA).
 
List of eligible PCI RR-friendly journals:
 
 
References
 
Evans, T. R. et al. (2024). Open Scholarship and Feedback in Applied Research/Understanding the Role of Climate Change in Applied Research: A Qualitative Registered Report. In principle acceptance of Version 2 by Peer Community in Registered Reports. https://osf.io/jdh32
Conflict of interest:
The recommender in charge of the evaluation of the article and the reviewers declared that they have no conflict of interest (as defined in the code of conduct of PCI) with the authors or with the content of the article.

Reviews

Evaluation round #1

DOI or URL of the report: https://osf.io/preprints/psyarxiv/p8hb4

Version of the report: 1

Author's Reply, 23 Sep 2024

Decision by S. Field, posted 04 Sep 2024, validated 04 Sep 2024

Dear authors, 

First, I apologise for how long this has taken to get through even the initial review stage. Thank you for your patience. After inviting no fewer than 30 reviewers, we finally have 3 thoughtful reviews for you to use to refine your plan for the two components of this programmatic RR. As you will see, each reviewer has a different approach to the content of the RR(s), and the comments vary in terms of their specificity and scope, though most comments are light touch and all reviewers are positive about the plan. 

After reading the reviews and forming my own opinion, I will start by saying that I think the plan is already strong. It's already quite detailed, and it's fantastic that you have clearly marked the content that is subject to change as we move through the RR process. While there are of course some small improvements to be made, much of what the reviewers comment on concerns minor details, framing and wording. I think that almost everything they've mentioned will genuinely improve the protocol. As such, I encourage your team to address as much as you can in the revision.

Good luck with your revision - I look forward to seeing the next version of this protocol! I doubt I will have to send it to the reviewers again, so I imagine the next round of revisions will be relatively trouble-free compared to this first round. 

Reviewed by , 27 Jun 2024

Thank you for your submission to PCI RR. I am reviewing "Understanding the Role of Climate Change in Applied Research: A Qualitative Registered Report," which as I understand it is the green section of this double manuscript. I hope my comments are helpful as the authors hone their registered report.

INTRODUCTION

The introduction is generally well written; if there's one improvement I would suggest, it would be to elaborate specifically on how applied researchers have actually influenced policy or practice. One or two examples would probably suffice, especially if there are any relevant to climate change. There's some general discussion of this early on, but I struggled to make the connection between what applied researchers can do and the authors' suggestion that there is evidence researchers have actually impacted climate change practice. I would argue that increased attention is not actually change. This may be a nitpicky point, but I think it would help people have a clearer picture of what kinds of changes are being discussed. If there aren't specific examples, it may be beneficial to walk back the sentences that suggest applied researchers have influenced practice and stick to the responsibility of applied researchers moving forward.

METHOD

I just wanted to note that I appreciate the international data collection efforts! Given the burden of climate change and the relative power different countries have to affect their own and other countries' pollution levels, I think it would be good to pay attention to this aspect when analyzing the results. That is to say, I wonder whether geographic region will affect people's reporting of their experiences, actions, and impacts. I don't think it's necessary to explicitly compare regions, but it would be helpful to consider how this could be attended to.

I also appreciate the reflexivity that is being built into the process, and the balancing of privacy needs and reflection.

PARTICIPANTS

Point of clarification: If the PhD work is applied in nature, is it still considered? In other words, are the authors excluding people who are just doing research because their degree requires it, without consideration of applied benefits, or are they excluding anyone doing PhD-level research, even if it's applied in nature? I think the authors mean the former, but want to be clear.

ANALYSIS

Are there any reflexive practices that are built in for the analysts?

 

I have no other comments. I am happy to look at the manuscript again if needed, but am also fine if the recommender feels the authors have addressed my comments fully (or that they can ignore certain suggestions). I look forward to seeing how the registered report turns out!

 

Reviewed by , 21 Jul 2024

Reviewed by , 20 Aug 2024

This is my first PCI review and first time reviewing programmatic Registered Reports, though I've reviewed many other documents over the past decade across disciplines.

I found the study ideas intriguing and am most curious about the Open Scholarship Registered Reports given my interest in applied research. I'm looking forward to reading the eventual paper! My critical comments are below.

---------------------------------------------------------------------------------

Programmatic Registered Report 1
Open Scholarship

Introduction

Citation/explication needed for "However, such open scholarship practices have been nearly exclusively applied to basic and quantitative academic research, and there are many reasons why these practices may not be common or considered so favorably in applied settings..." You should find several specific citations, but here is one to get started with citation tracing:

Ferguson, J., Littman, R., Christensen, G., Paluck, E. L., Swanson, N., Wang, Z., Miguel, E., Birke, D., & Pezzuto, J.-H. (2023). Survey of open science practices and attitudes in the social sciences. Nature Communications, 14(1). https://doi.org/10.1038/s41467-023-41111-1 (https://www.nature.com/articles/s41467-023-41111-1)
 

Citation needed for each practice mentioned in "wide range of practice." Great list, by the way! This is my first encounter with pre-mortems, and they remind me of methodological review panels (Lakens: https://pubmed.ncbi.nlm.nih.gov/36596953/). You can also remove the parentheses in that paragraph mentioning the "wide range of practices" and simply italicize each practice like "premortems to try and anticipate...." This is for ease of reading.

RQ1: Remove the dual commas: ", and using," because they are not needed.

I would revise: "contemporary practices and feedback mechanisms" to read "contemporary research reform practices and study feedback mechanisms for authors..." as that is more specific and clear if readers are skimming. In general, your use of "feedback mechanisms" seems broad throughout the paper and a more specific term would help. I like "research plan feedback mechanisms", "study feedback strategies", or something similar.

pg 4, paragraph 3

"Applied research" here needs to be grounded in concrete examples across 3+ domains with vignettes. Although you defined it earlier, we need to know the details of the term to make sense of this. I interpret this to mean action research (which changes as it is being done). Please clarify this part with specific methods and examples.

RQ2

This is a question that I am mulling over myself in my institution (informally). I am curious to read your findings later. I expect that these feedback processes will be informal and operate through lab meetings, colloquia, conferences, and listservs. I see great potential for systematically investigating this topic and using the results to support the diffusion of author feedback mechanisms. I hope that you make a distinction between formal and informal feedback mechanisms in your papers.

Can you write a brief section on why RR1 is so much shorter than RR2? According to the PCI website, the kind of detailed information you included in RR2 is also recommended here: "Authors of a programmatic Stage 1 RR should ensure that all the usual criteria for a RR are met, including detailed specification (where applicable) of theory, hypotheses, procedures, and analysis plans (see review criteria in Section 2.1)." The method and analysis plan are the most critical parts for RR1.

-----------------------------------------------

Programmatic Registered Report 2

Understanding the Role of Climate Change in Applied Research: A Qualitative Registered Report

Are "Occupational Psychologists" synonymous with "Industrial and Organizational Psychologists?" The latter term is more familiar to me (US resident). If so, it might be good to mention that. If not, please define the term.

RQs 1 and 2 are interesting, though they should be introduced a bit earlier if possible.

Good justification for analysis plan: content analysis

pg. 15: I recommend avoiding violin plots for readability and using density plots instead.
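
For illustration only, here is a minimal Python sketch (using seaborn and simulated 1-5 ratings rather than the authors' data) of the kind of density plot I have in mind as an alternative to a violin plot:

import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns

# Simulated 1-5 ratings purely for illustration; not the authors' data.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=50)

# A single density curve of the rating distribution, in place of a violin plot.
sns.kdeplot(x=ratings, bw_adjust=1.5, fill=True)
plt.xlabel("Rating (1-5)")
plt.ylabel("Density")
plt.tight_layout()
plt.show()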

Regarding sampling, the variety of settings (virtual) seems sensible. You could also supplement with visits to conferences and making special requests if needed. I don't have a sense of the difficulty of recruiting applied researchers, but the venues you listed seem very open, and I foresee enthusiasm to participate. Sampling success could be a nice topic to include in a Registered Report by brainstorming reasons for success and failure. What do studies with similar samples struggle with? Financial incentives?

Appendix A:

Looks good, though at some point a distinction could be made between formal feedback mechanisms (preregistrations and Registered Reports) and informal ones (social media, nearby colleagues).

Appendix B:

Looks good and seems helpful to include.

Appendix C:

1-to-10-point rating scales are too granular according to modern psychometric research. A 1-to-5 scale would be sufficient, and participants would find a simplified scale easier to process.

You may also want to visualize that scale here and add labels for each value in the scale for clarity.

All nine prompts in Appendix C are positively framed and may lead to biased results, so you should find a way to word them neutrally or negatively. This is often advised by survey methodologists. Please note for readers whether this will be administered verbally or via computer. Given the "Open Text Box" line at the end, I assume this will be on a computer. So then you should just create nine additional negatively worded prompts like "I did not enjoy contributing to this project" to ensure that you don't skew responses to the positive end of the rating scale. Then you can perform some simple arithmetic to create sum scores per theme (e.g., enjoyed contributing, learned a lot, developed my research skills, etc.). Additional information can be found in textbooks like Survey Methodology 2nd Edition by Robert M. Groves, Floyd J. Fowler Jr., Mick P. Couper, etc.
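
To make the arithmetic concrete, here is a minimal Python sketch (with hypothetical column names and made-up responses, assuming the 1-5 scale suggested above) of how reverse-scoring the negatively worded prompts and summing per theme could work:

import pandas as pd

# Hypothetical responses on a 1-5 scale: each theme has one positively
# and one negatively worded prompt (column names are illustrative only).
responses = pd.DataFrame({
    "enjoyed_contributing_pos": [5, 4, 3],
    "enjoyed_contributing_neg": [1, 2, 3],   # e.g. "I did not enjoy contributing..."
    "learned_a_lot_pos":        [4, 5, 2],
    "learned_a_lot_neg":        [2, 1, 4],
})

SCALE_MAX = 5  # 1-5 scale, as suggested above

# Reverse-score the negatively worded items so that higher = more positive.
neg_cols = [c for c in responses.columns if c.endswith("_neg")]
responses[neg_cols] = (SCALE_MAX + 1) - responses[neg_cols]

# Sum per theme (positive item + reverse-scored negative item).
themes = ["enjoyed_contributing", "learned_a_lot"]
sum_scores = pd.DataFrame({
    t: responses[[f"{t}_pos", f"{t}_neg"]].sum(axis=1) for t in themes
})
print(sum_scores)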