Recommendation

What are the barriers and facilitators to open science practices for researchers, policy makers and media representatives in Slovakia?

Based on reviews by Crystal Steltenpohl, Peter Branney, Andrea E. Abele-Brehm, Emma Norris and 1 anonymous reviewer
A recommendation of:

Barriers and facilitators to the adoption and promotion of Open Science practices in psychology. The case of Slovakia

Submission: posted 29 April 2024
Recommendation: posted 28 August 2024, validated 06 September 2024
Cite this recommendation as:
Pennington, C. (2024) What are the barriers and facilitators to open science practices for researchers, policy makers and media representatives in Slovakia? Peer Community in Registered Reports. https://rr.peercommunityin.org/articles/rec?id=778

Recommendation

Open science practices (OSPs; e.g., preregistration, open materials, code and data) aim to enhance the transparency, integrity, and reproducibility of research. Recent work, however, has highlighted various facilitators and barriers that researchers perceive in implementing such practices, which can either support or hinder their adoption. Little is known about these barriers in the context of Slovakia, and such perceptions are rarely investigated among policy makers and media representatives, who are also embedded in the research ecosystem.
 
In their Stage 1 Registered Report, Marcel Martončik and colleagues aim to map the perceptions and experiences of barriers and facilitators of OSPs that are unique to different stakeholder groups in Slovakia. They will conduct both semi-structured interviews and focus groups with a diverse sample of postgraduate students, researchers, policymakers, and media representatives from the field of psychology. Reflexive thematic analysis will identify overarching themes regarding such barriers and facilitators which will provide valuable insights into the support required to make OSPs normative across different stakeholder groups.
 
Four expert reviewers assessed the Stage 1 manuscript across two rounds of in-depth review. Based on the authors' detailed and informed responses to the reviewers' comments, the recommender judged that the manuscript met the Stage 1 criteria and awarded in principle acceptance (IPA).
 
URL to the preregistered Stage 1 protocol: https://osf.io/n86um
 
Level of bias control achieved: Level 6. No part of the data or evidence that will be used to answer the research question yet exists and no part will be generated until after IPA.
 
List of eligible PCI RR-friendly Journals:
 
 
References
 
Martončik, M., Adamkovič, M., Baník, G., Fedáková, D., Issmailová, S., Kačmár, P., Kentoš, M., Majdáková, V., Papcunová, J., & Vargová, L. (2024). Barriers and facilitators to the adoption and promotion of Open Science practices in psychology. The case of Slovakia. In principle acceptance of Version 1.1 by Peer Community in Registered Reports. https://osf.io/n86um
Conflict of interest:
The recommender in charge of the evaluation of the article and the reviewers declared that they have no conflict of interest (as defined in the code of conduct of PCI) with the authors or with the content of the article.

Reviews

Evaluation round #2

DOI or URL of the report: https://osf.io/268rt

Version of the report: 1.1

Author's Reply, 19 Aug 2024


Dear Charlotte,

I apologize if I made any mistakes while uploading the file. I have uploaded the manuscript with tracked changes in odt format to OSF at https://osf.io/dyfwp. Additionally, I have attached it in pdf format. I hope this works. Thank you for your understanding, and I apologize for any inconvenience.

Sincerely yours,

Marcel

Decision by Charlotte Pennington, posted 15 Aug 2024, validated 17 Aug 2024

Dear Marcel Martončik and colleagues,

I have now received the peer reviews of three experts who also reviewed Version 1 of your Stage 1 Registered Report protocol. However, in order to write my final decision/recommendation, I need to be able to see the tracked-changes version of your Stage 1 report.

The document submitted is titled "manuscript_v1.1tch", suggesting tracked changes should be present. However, I see no tracked changes in either the tracked changes mode or as highlights/font-colour-change in this document.

Can you please supply the manuscript with tracked changes at your earliest convenience?

Best wishes,

Charlotte

Reviewed by , 22 Jul 2024

The proposal has considerably improved. I still think that the number of interviews is small, but this does not prevent me from recommending the proposal as it now stands.

Reviewed by , 24 Jul 2024

The authors have taken great care to address my previous comments in detail. I am happy to accept.

Reviewed by , 12 Aug 2024

Thanks for your revisions. I am focusing on the responses to my comments, but other things may come up if other changes affect how I read the new manuscript.

In short, I have no additional comments. I respect the authors' decision to separate out the positionality statements (it isn't my personal preference, as it makes them feel ancillary to the research), and it would be helpful to ensure that the information about the Slovakian context for psychology appears in the main text. Otherwise, their responses to my comments are reasonable, and as long as their decisions and the implications of those decisions are clearly described in the final manuscript, I think it'll be fine. I look forward to reading the Stage 2 manuscript, as I think this will be an interesting paper.

Evaluation round #1

DOI or URL of the report: https://osf.io/mb5wg

Version of the report: 1.0

Author's Reply, 15 Jul 2024


Dear Recommender and Reviewers,
We would like to express our sincere gratitude for your thorough and insightful comments on our manuscript. Your feedback has been invaluable in helping us improve the quality and clarity of our work.
We have carefully considered all of your suggestions and have made substantial revisions to the manuscript accordingly. As the majority of our authors primarily focus on quantitative research, your comments have provided us with an excellent opportunity to enhance our skills in qualitative research and to make our approach more rigorous. We view this as a valuable learning experience and are grateful for your expertise in guiding us towards a higher standard of qualitative research.
Thank you once again for your valuable input and guidance throughout this process.
Sincerely,

Marcel Martoncik

Decision by Charlotte Pennington, posted 16 May 2024, validated 16 May 2024

Dear Marcel Martončik and colleagues,

Thank you for submitting your Stage 1 Registered Report “Barriers and facilitators to the adoption and promotion of Open Science practices in psychology. The case of Slovakia” for consideration by PCI Registered Reports.

I have now received comments from four expert reviewers in this field. As you will see, the reviewers do find merit in your proposed study but point out several aspects that need further clarification and/or revision, particularly regarding the methodology. I agree with these reviews and am therefore inviting you to revise your manuscript. 

You may either agree or disagree with some of the reviewers' comments regarding changes in methodological approach; if the latter, please see this as an opportunity to better justify your methods in the manuscript itself (rather than just in the response to reviewers). Please also consider including your reflexivity statement in the manuscript itself rather than in the supplementary materials. One reviewer mentions changes to your referencing style, assuming APA formatting is required; it is up to you which referencing style you wish to use.

I have also undertaken an independent peer review of this manuscript myself and provide the following comments separated into what I believe to be major and minor points.

Major comments:

1.    The focus group questions place attention on the umbrella term of “open science” rather than specific OSPs (e.g., preregistration) as is the case with the individual interviews with researchers. Whilst this is understandable given that the focus group sample will be media representatives and policymakers, there is a potential risk that the questions will yield very shallow answers or that respondents will answer in very broad and generic terms. How will you mitigate against this, and will you analyse/report the focus groups in a different way than the individual interviews? The latter could be better clarified in the analysis section. 

2.    Each author of this manuscript may want to reflect on their subject discipline in the reflexivity (thank you for sharing this on the OSF): the authors come from different disciplines (e.g., Psychology vs. Arts) and may therefore have different perceptions and experiences with open science generally and different open science practices more specifically (i.e., Registered Reports may be an appropriate open science practice for a researcher in social sciences, but perhaps less so for a researcher in the Arts). 

3.       I agree with the reviewer comment by Emma Norris regarding the structure of the Introduction. I noticed that you introduce open science practices (OSPs) but then subsequently discuss how doubt has been cast on the credibility of research findings due to issues of low reproducibility and QRPs. The structure seems ‘off’ here – surely you should mention issues of reproducibility and QRPs before introducing the concept of open science which has been proposed as a corrective mechanism against these issues?

4.       The second paragraph states: “The slow adoption of OSP cannot be solely attributed to researchers.”, but there is no citation/evidence to support such a claim – how do we know it’s slow? Slow in Slovakia, or slow globally? References to bolster this point would be helpful.

5.       You state: “For instance, nearly one-third of journal editors do not deem the implementation of registered reports as important. Similarly, almost one-sixth do not prioritize the publication of null or negative results. Furthermore, approximately half of the research funders do not mandate the sharing of raw data and pre-registration of studies”. Where are each of these findings from? If these are also from the European Commission (2022) report, you should clarify this (e.g., “this report highlights that…”).

6.       In the Introduction you state “The barriers to adopting OSP vary not only between disciplines (Bouter, 2018) but very likely also between institutions, countries and cultures, and different parts of the research ecosystem”. Surely OSPs also vary between distinct methodologies (e.g., quantitative vs. qualitative). Will your methodology capture this? The Stage 1 RR does not acknowledge these different tensions (e.g., see Pownall’s 2023 and 2024 papers on tensions for qualitative research). Consider whether this needs to be included specifically within your Research Questions.

7.       Methods: You suggest that there is an “absence of generally accepted rules” for sample size guidelines when using thematic analysis. However, Braun and Clarke (2013, e.g. pp. 50) provide some guidelines for this that may bolster your sample size rationale. Reference: Braun, V., & Clarke, V. (2013). Successful qualitative research: A Practical Guide for Beginners (First Edition). SAGE Publications.

 

Minor: 

1.    The Abstract/Intro focuses on the replication crisis underscoring that the mechanism of self-correction in science may not be functioning effectively. It would be worthwhile to specify that the ‘replication crisis’ stemmed from the social sciences/psychology. 

2.    The Abstract states that the study aims to “conduct a qualitative examination of the barriers and facilitators of transparent and responsible research practices in the field of psychology in Slovakia” and then later states “Data will be collected through interviews and focus groups with a diverse sample of master’s and PhD students, researchers, policy makers, and media representatives.” It would be useful to specify here that the policy makers and media representatives will also be from a psychology background. 

3.    The opening sentence of the Introduction states: “Corrections can only be made through concerted, targeted action and collaboration among all stakeholders in the research ecosystem, including researchers, institutions, funders, publishers, and learned societies (Munafò et al., 2022).” Another relevant reference here, which states exactly this, is: https://bmcresnotes.biomedcentral.com/articles/10.1186/s13104-022-05949-w

4.    In paragraph 2 of the Introduction, you introduce the concept of “the credibility crisis”, but elsewhere you use the term “replication crisis”; please be consistent with whichever term you decide to use throughout. 

5.       Introduction: “Obviously, implementing OSP faces different barriers”; do you mean “individuals implementing OSP face different barriers?”. This sentence reads rather awkwardly. 

6.       There are some typos throughout that should be fixed (e.g., “(Directive EU 2019/1024 of the European Parliament), „Member States must adopt policies”)”. 

7.       You abbreviate to “OSPs” early on in the manuscript but then revert back to “open science practices” in the analysis plan – be consistent with this abbreviation. 

 

With best wishes,

Dr Charlotte Pennington,

PCI Recommender

Reviewed by anonymous reviewer 1, 03 May 2024

The proposal deals with an important and timely topic, i.e., barriers and facilitators of open science usage.

My comments concern mainly the methodology:

1. There should be at least 25 interviews with researchers, in order to study researchers at different levels (post-doc, assistant, professor, etc.).

2. I do not see what the student focus groups would add to the present topic. These could be omitted and - see my point 1 - more effort could be devoted to studying researchers.

A relevant article here could be: Scheffel, C., Korb, F., Dörfel, D., Eder, J., Möschl, M., Schoemann, M., & Scherbaum, S. (2023). Gute wissenschaftliche Praxis und Open Science im Empiriepraktikum: Wissenschaftlicher Kompetenzerwerb durch Replikationsstudien [Good scientific practice and Open Science in the empirical research practical: Acquiring scientific competence through replication studies]. Published online 13 Sep 2023. https://doi.org/10.1026/0033-3042/a000643

 

3. A study by Abele-Brehm et al. (Abele-Brehm, A. E., Gollwitzer, M., Steinberg, U., & Schönbrodt, F. (2019). Attitudes towards Open Science and public data sharing: A survey among members of the German Psychological Society. Social Psychology, 50, 252-260. https://doi.org/10.1027/1864-9335/a000384) studied attitudes towards open science, and this research might help to develop coding categories.

4. Generally, I think that the authors did not capture all the relevant literature; I think that the approach could be more of a mixture of bottom-up (as it is now) and top-down (as it could be by drawing on the relevant literature).

Reviewed by , 16 May 2024

Thank you for the opportunity to peer review this Stage 1 Registered Report exploring the barriers and facilitators to open science research in Slovakia. Below, I have structured my review according to the criteria for assessing a Registered Report at Stage 1 from PCI RR (accessed 2023-07-13; [PCI Registered Reports (peercommunityin.org)](https://rr.peercommunityin.org/help/guide_for_reviewers#h_6720026472751613309075757))


## 1A. The scientific validity of the research question(s).
This criterion addresses the extent to which the research question is scientifically justifiable. Valid scientific questions usually stem from theory or mathematics, an intended or possible application, or an existing evidence base, and should be defined with sufficient precision as to be answerable through quantitative or qualitative research. They should also fall within established ethical norms. Note that subjective judgments about the importance, novelty or timeliness of a research question are NOT a relevant criterion at PCI RR and do not form part of this assessment.

- The research questions are justified by a) an argument for the need to adopt a range of open science practices to increase trust in science and b) a range of literature on the barriers and facilitators to their uptake. Furthermore, the registered report differentiates between bottom-up and top-down approaches to barriers and facilitators, emphasising that, for example, an individual researcher deciding whether or not to implement an open research practice is doing so within a wider system that may comprise barriers or facilitators. I would note that there are other theoretical perspectives that may be more useful, such as the COM-B, in outlining a more nuanced understanding of the adoption (or not) of open science practices. You might be interested in a similar Registered Report, although focused on data sharing only, that uses the COM-B: [Capability, Opportunity, and Motivation in Data Sharing Behaviou... (peercommunityin.org)](https://rr.peercommunityin.org/articles/rec?id=462)

 

## 1B. The logic, rationale, and plausibility of the proposed hypotheses.
This criterion addresses the coherence and credibility of any a priori hypotheses. The inclusion of hypotheses is not required– a Stage 1 RR can instead propose estimation or measurement of phenomena without expecting a specific observation or relationship between variables. However, where hypotheses are stated, they should be stated as precisely as possible and follow directly from the research question or theory. A Stage 1 RR should also propose a hypothesis that is sufficiently conceivable as to be worthy of investigation. The complete evaluation of any preliminary research (and data) in the Stage 1 submission (see [**Section 2.7**](https://rr.peercommunityin.org/help/guide_for_reviewers#h_6154304112661613309500361)) is included within this criterion.

- Regarding the research questions, can you clarify your theoretical perspective? In particular, can you clarify your ontology and epistemology, as this will help the reader to assess the ability of this study to generate knowledge that is relevant to the research questions. What does it mean, for example, to identify in a qualitative study that a factor is 'most helpful'? Are you, for example, taking a naïve realist approach in which an interviewee's reports of what is helpful are treated unquestioningly as truth? Or are you doing something else? Also, could you explain how this relates to your theoretical perspective or to bottom-up and top-down approaches to barriers and facilitators (or to another theoretical approach if you adapt something like COM-B)?
- Given the range of research on barriers and facilitators you summarise, I wonder if it would be useful to synthesise it in a table or figure and clarify the potential overlap between the studies (which may prove useful in your discussion).


## 1C. The soundness and feasibility of the methodology and analysis pipeline (including statistical power analysis where applicable).
This criterion assesses the validity of the study procedures and analyses, including the presence of critical design features (e.g. internal and external validity, blinding, randomisation, rules for data inclusion and exclusion, suitability of any included pilot data) and the appropriateness of the analysis plan. For designs involving inferential statistics and hypothesis-testing, this criterion includes the rigour of the proposed sampling plan, such as a statistical power analysis or Bayesian alternative, and, where applicable, the rationale for defining any statistical priors or the smallest effect size of interest. For programmatic RRs (see [**Section 2.15**](https://rr.peercommunityin.org/help/guide_for_reviewers#h_52492857233251613309610581)), this criterion captures the assessment of whether the separate study components are sufficiently robust and substantive to justify separate Stage 2 outputs.

- Could you give a brief description of the overall population from which you are sampling? As you are likely to be writing for an international audience, readers may benefit from this brief introduction to the research landscape in Slovakia.
- Can you check if abbreviations are necessary, and if they are, please ensure all are defined. E.g. Table 1 has a few.
- Can you clarify in what language/s the interviews will be conducted?
- Can you clarify in what language/s the analysis will be conducted?
- After clarifying your theoretical and epistemological position (as mentioned above), can you ensure your analysis is consistent with it? If you continue to draw upon Braun and Clarke for your thematic analysis, can you please describe how you are conceptualising a theme according to their four dimensions (of a theme; in their 2016 paper)?
- As you are doing a reflexive thematic analysis, can you follow the APA Journal Article Reporting Standards for qualitative studies and ensure you provide a researcher description? For example, could you include a positionality statement akin to [Capability, Opportunity, and Motivation in Data Sharing Behaviou... (peercommunityin.org)](https://rr.peercommunityin.org/articles/rec?id=462) or [https://doi.org/10.31234/osf.io/5yw4z](https://doi.org/10.31234/osf.io/5yw4z)? Given the topic of this research, it would be interesting to highlight your positions in relation to the open practices you mention.
- Given the topic, can you give more details on how you are negotiating data sharing for this study? Perhaps an appendix where you 1) show how you are achieving it with relevant items in the information given to potential participants and in the consent form, 2) provide a mapping of the FAIR principles against your planned data archive (as in the example in Tables 2 and 3 in https://doi.org/10.1111/spc3.12728) and 3) perhaps offer reflections on and/or descriptions of the support you have had and/or anticipate having in sharing the data (which may need completing at Stage 2). If you look at the FAIR principles, I would have imagined a dedicated data archive, such as the UK Data Service, which you mention, would help in terms of making the data 'reusable' because of the range of standardised meta-data they request. For example, I've seen other OSF projects where the data is difficult to find, and I would question whether it would appear through a library database or Internet search. Note that I raised a similar point in reviewing the Registered Report mentioned above and you can see how they resolved it: [OSF | Henderson-etal-PCIRR-Stage1-V3-Clean.pdf](https://osf.io/bz9h6?view_only=c91a36012190462e8416cba250bdb8ed).

## 1D. Whether the clarity and degree of methodological detail would be sufficient to replicate the proposed experimental procedures and analysis pipeline.
This criterion assesses the extent to which the Stage 1 protocol contains sufficient detail to be reproducible and ensure protection against research bias, such as analytic overfitting or vague study procedures. In general, meeting this requirement will require the method section(s) of a Stage 1 protocol to be significantly longer and more detailed than in a regular manuscript, while also being clearly structured and accessible to readers. This criterion also covers the extent to which the protocol specifies precise and exhaustive links between the research question(s), hypotheses (where applicable), sampling plans (where applicable), analysis plans, and contingent interpretation given different outcomes. Authors are strongly encouraged to include a design summary table in their Stage 1 protocols that make these links clear (see [**Section 2.16**](https://rr.peercommunityin.org/help/guide_for_reviewers#h_27513965735331613309625021) for examples). Note that in some circumstances, authors may wish to propose a more general analysis plan involving a [blinded analyst](https://link.springer.com/article/10.1007/s11229-019-02456-7) rather than a precise specification of data analyses. Such submissions are acceptable and will meet this criterion provided the blinded analysis procedure is specified in reproducible detail, and provided the blinding procedure itself is sufficiently robust.

-  I think describing the epistemological position would help in understanding how the rationale for the study, research questions and data analysis link up and how they will link up with the findings in Stage 2.

## 1E. Whether the authors have considered sufficient outcome-neutral conditions (e.g. absence of floor or ceiling effects; positive controls; other quality checks) for ensuring that the results obtained are able to test the stated hypotheses.

- NA

Reviewed by , 02 May 2024

Thank you very much for sharing the Registered Report “Barriers and facilitators to the adoption and promotion of Open Science practices in psychology: The case of Slovakia” with me and asking for feedback.

The report presents a qualitative study to assess current barriers and facilitators of responsible research practices in psychology within Slovakia.

The report is relatively clearly structured but various points of clarification are required. My overall review is positive.

 

Important points

1.      The Introduction requires a clearer structure: it currently switches between discussions of transparency, what OS practices are and why they are important, and the extent to which they are carried out. Sub-headings may support a clearer structure. Paragraphs are extremely long in places, e.g., page 4.

2.      The Slovakia-specific context of Open Science is not discussed in the Introduction. For example, data on the extent to which Open Science practices are implemented within Slovakia are not clear. Can you pull in examples to illustrate the extent to which this is an issue? What institutional structures exist to facilitate Open Science, e.g., the Slovak Reproducibility Network? It is not sufficient to provide a table of Slovak Open Science initiatives as a supplementary document – please summarise it within the Introduction to add important context.

3.      It would be useful to distinguish qualitative research that has investigated Open Science practices from quantitative (e.g., survey) research. What has been learned from qualitative studies? Why is a qualitative approach important here? How have these qualitative studies specifically informed this study?

4.      The rationales for individual interviews (‘researchers’, ‘PhD students’) and focus groups (‘students’, ‘policy makers’ and ‘media’) are not clearly justified. What are the theoretical and pragmatic reasons for these methodological decisions?

5.      Sample size estimations by group seem relatively arbitrary and are not justified. Why are the same number of ‘researchers’ to be recruited as ‘students’ (undergraduate and master’s students), for example?

6.      The summary of your measures (interview and focus group schedules) is vague. What Open Science practices are asked about? What informed the development of your interview/focus group schedules? Were these informed by previous qualitative studies – if so, how?

 

Minor points

7.      Ensure in-text references are presented in alphabetical order e.g (Armeni et al., 2021; Nosek et al., 2015; Obels et al., 2020) rather than (Obels et al., 2020; Nosek et al., 2015; Armeni et al., 2021).

8.      Clear aims and objectives of this study are missing from the end of the Introduction.

9.      Research questions should be reformulated to refer to Slovakia specifically.

10.  How do you distinguish between ‘challenges’ and ‘barriers’? If these are used to mean the same, please use just one throughout.

11.  Ethical approval number is missing (pg 5). Note that amendments may be required following any subsequent protocol changes following Stage 1 review.

12.  A codebook is not described as being developed as part of the data analysis process. This would be useful supplementary material.

Reviewed by , 13 May 2024

I would like to thank the authors for their submission to PCI RR. While my review focuses largely on my questions and suggestions (for the sake of time), I would like to say that I think this is an interesting study with the potential to produce some useful insight, and I hope the authors find my comments helpful as they refine their stage 1 RR. I would be happy to review another version of this if the recommender requests it; I am traveling quite a bit this summer but would be happy to prioritize this as much as I can.

INTRODUCTION

  • It would be helpful to break the text so that there's one main idea per paragraph - there are a couple of times where a paragraph covers most if not all of the page, and it's a bit difficult to follow.
  • It's unfortunate that the positionality statement is inserted as an appendix, and without much discussion about how the authors' understanding of (open) science affects how they are approaching this work. I think it would be incredibly beneficial, both for the authors and for the audience, if the authors dig a bit deeper into their epistemological stances and how those might influence how they define open science and what practices "count" as open science - for instance, they mention open data, materials, etc., in their interview protocols, but what of member checking, participatory research, stakeholder/participant involvement? It would be great to see what, if any, strategies they are employing to highlight those perspectives and/or ensure that they do not become myopic in their approach. It would also be helpful for this to be integrated into the main text.

PARTICIPANTS

  • I think the sampling strategy is largely fine, but it is worth noting that psychology is a very broad field. It may be helpful for the authors to include more thick description about the state of psychology within Slovakia so the audience can understand what subfields are most common in Slovakia, and/or describe how the authors will ensure that there will be diversity in each of the researcher, PhD student, and other student samples. If the state of the field of psychology in Slovakia is as broad as it might be elsewhere (I imagine it is), the authors may wish to either narrow the subfields they draw from, or sample more participants from those three groups in particular. (It's fine for the sample sizes to be different across groups.)

METHODS

  • A few of the questions in the interview protocols are close-ended, e.g., "Have you encountered open science practices in the course of your work?", "Is open science and its role sufficiently visible in public discourse?", etc. It would be good to reword these to encourage elaboration, e.g., "How have you encountered open science practices in the course of your work?", "How do you see open science manifest in public discourse, if at all?" It may also be helpful to consider probes to assist interviewers in digging deeper on certain responses. It would also be helpful to consider how someone who supports open science might respond to each question, and how someone who is more ambivalent or antagonistic toward certain (or all) open science practices might respond, as a few questions may come across as assuming the respondent's perspective.
  • I love that the authors have included both debriefing and reflection opportunities. It would be great to read about how these will be used in the analysis of their interviews.

ANALYSIS

  • The authors state that they will use both inductive and deductive strategies, but it's not clear what their inductive codes are.
  • Will the codebooks be shared? If so, what will they include (e.g., code, definition, inclusion and exclusion criteria, exemplar quotes)?
  • Are the authors using a consensus approach to coding, then (rather than calculating kappa, etc.)? I think this is a good approach, but it's not directly stated.
  • The authors mention that they are going to try to map out potential causal relationships. Who is determining the "potential" for the causal relationship? The respondents? The authors? Can this be expanded upon some?
  • How will transcripts be de-identified? A resource that may be helpful, at least as a model for creating and describing a de-identification process (not that I think the transcripts will cover traumatic material - I just really love how Campbell and colleagues mapped their process out):
    • Campbell, R., Javorka, M., Engleton, J., Fishwick, K., Gregory, K., & Goodman-Williams, R. (2023). Open-science guidance for qualitative research: An empirically validated approach for de-identifying sensitive narrative data. Advances in Methods and Practices in Psychological Science, 6(4), 25152459231205832.
  • I think it would be helpful for the authors to state some of the things they think they will find. This will be immensely beneficial as they analyze their results - did they only find what they expected to find? If so, is this indicative of anything? If not, what have they learned? Will they be looking for negative cases? (If so, what does a negative case look like here?)
  • Can we link this back to the research questions a bit more? Are the authors looking to describe? Are they looking to generalize? If so, to what extent (to what population)? How will they know if they've answered their questions?
  • I think there's some opportunity to triangulate here. The authors have described a few methods - there's also the Center for Open Science's Open Scholarship Survey (https://osf.io/nsbr3/). Will the authors triangulate any, and if so, how?

OTHER NOTES

I'd like to recommend a few pieces from the qualitative realm (still heavily, though not entirely, psychology) that could be helpful - the authors absolutely do not need to reference any/all of them, but they cut across a couple of epistemic discussions that the authors may find interesting and wish to include. These may also inspire them to dig a bit deeper in their positionality statements and reflexivity practices generally around the perspectives they are bringing and reinforcing through their work, either now or as they work on this project. I think it's fine to consider largely reproducibility and replicability in open science, but it's worth noting that this is not the only way to view open science (and might not even be what most people are actually motivated by when engaging with open science).

  • Bahn, S., & Weatherill, P. (2013). Qualitative social research: A risky business when it comes to collecting ‘sensitive’ data. Qualitative Research, 13(1), 19-35.
  • Bennett, E. A. (2021). Open science from a qualitative, feminist perspective: Epistemological dogmas and a call for critical examination. Psychology of Women Quarterly, 45(4), 448-456.
  • Class, B., de Bruyne, M., Wuillemin, C., Donzé, D., & Claivaz, J. B. (2021). Towards open science for the qualitative researcher: From a positivist to an open interpretation. International Journal of Qualitative Methods, 20, 16094069211034641.
  • Field, S. M., van Ravenzwaaij, D., Pittelkow, M. M., Hoek, J. M., & Derksen, M. (2021). Qualitative Open Science–Pain Points and Perspectives.
  • Humphreys, L., Lewis, N. A., Sender, K., & Won, A. S. (2021). Integrating qualitative methods and open science: Five principles for more trustworthy research. Journal of Communication, 71(5), 855-874.
  • Jacobs, A. M. (2020). Pre-registration and results-free review in observational and qualitative research. The production of knowledge: Enhancing progress in social science, 221-264.
  • Makel, M. C., Meyer, M. S., Simonsen, M. A., Roberts, A. M., & Plucker, J. A. (2022). Replication is relevant to qualitative research. Educational Research and Evaluation, 27(1-2), 215-219.
  • Pownall, M. (2024). Is replication possible in qualitative research? A response to Makel et al. (2022). Educational Research and Evaluation, 1-7.
  • Pownall, M., Talbot, C. V., Kilby, L., & Branney, P. (2023). Opportunities, challenges and tensions: Open science through a lens of qualitative social psychology. British Journal of Social Psychology.
  • Stegenga, S. M., Steltenpohl, C. N., Renbarger, R., Lee, L. E., Standiford Reyes, L., Lustick, H., & Meyer, M. S. (2023, September 5). Open Science Practices in Early Childhood Special Education Research: A Systematic Review and Conceptual Replication. https://doi.org/10.31234/osf.io/8gbjp
  • Steltenpohl, C. N., Lustick, H., Meyer, M. S., Lee, L. E., Stegenga, S. M., Reyes, L. S., & Renbarger, R. (2022). Rethinking transparency and rigor from a qualitative open science perspective. Journal of Trial and Error.
  • Talkad Sukumar, P., & Metoyer, R. (2019). Replication and transparency of qualitative research from a constructivist perspective.
  • Tuval-Mashiach, R. (2021). Is replication relevant for qualitative research? Qualitative Psychology, 8(3), 365. 

FINAL THOUGHTS

I think this is an interesting project with the potential to garner insight into the workflows, working conditions, and perspectives of Slovakian researchers. This is incredibly exciting to me, and I hope my comments have been helpful. Again, I'm happy to take another look at this and will do my best to prioritize reviewing any revision(s) if they do come across my digital desk.