
Understanding how visual cues influence extraretinal representation of planar symmetry

Grace Edwards and Zoltan Dienes based on reviews by Guillaume Rousselet and 2 anonymous reviewers
A recommendation of:

Putting things into perspective: Which visual cues facilitate automatic extraretinal symmetry representation?

Submission: posted 03 June 2024
Recommendation: posted 13 November 2024, validated 13 November 2024
Cite this recommendation as:
Edwards, G. and Dienes, Z. (2024) Understanding how visual cues influence extraretinal representation of planar symmetry. Peer Community in Registered Reports, 100799. 10.24072/pci.rr.100799

Recommendation

Visual symmetry is critical to our interaction with the environment: when detected, symmetry automatically produces a neural marker in the form of an event-related potential (ERP) called the Sustained Posterior Negativity (SPN). However, when symmetry is presented to the visual system slanted away from the viewer, the SPN is reduced, a reduction termed the perspective cost.
 
Considering objects are rarely presented front-on (or frontoparallel) in our natural environment, Karakashevska et al. (2024) examined the extent of the perspective cost when visual cues were added to facilitate extraretinal representation of visual symmetry. The authors recorded electroencephalography (EEG) from 120 participants while they performed a luminance task on symmetrical and asymmetrical stimuli. The authors hypothesized that the perspective cost would be reduced by three perspective cues: 1) monocular viewing, eliminating binocular cue conflict; 2) a static frame surrounding the symmetrical stimulus, adding a depth cue; and 3) a moving frame presented prior to symmetry onset, providing a structure-from-motion 3D cue. If the SPN had been equivalent during frontoparallel and slanted presentation in a cue condition, the authors would have concluded that extraretinal representation can be automatic when sufficient visual cues are available. The experiment was powered to detect a relatively small difference between perspective cue conditions.
 
The authors found that there was no impact of different visual cues on the perspective cost, as measured using the SPN. Perspective cost was consistent across all conditions, contrary to the pre-registered hypotheses. Karakashevska and colleagues conclude that the three perceptual cues tested in their design do not reduce perspective cost. The study prompts future research into the nature of the extraretinal representations of planar symmetry.  
 
The Stage 2 manuscript was evaluated over four rounds by three expert reviewers, two of whom had reviewed the Stage 1 manuscript and one of whom was new to the submission. Following in-depth review and responses from the authors, the recommenders determined that the Stage 2 criteria were met and awarded a positive recommendation.
 
URL to the preregistered Stage 1 protocol: https://osf.io/yzsq5
 
Level of bias control achieved: Level 6. No part of the data or evidence that was used to answer the research question was generated until after IPA.
 

References
  
1. Karakashevska, E., Bertamini, M. & Makin, A. D. J. (2024). Putting things into perspective: Which visual cues facilitate automatic extraretinal symmetry representation? [Stage 2]. Acceptance of Version 5 by Peer Community in Registered Reports. https://doi.org/10.31234/osf.io/z9c28
Conflict of interest:
The recommender in charge of the evaluation of the article and the reviewers declared that they have no conflict of interest (as defined in the code of conduct of PCI) with the authors or with the content of the article.

Reviews

Evaluation round #4

DOI or URL of the report: https://osf.io/preprints/psyarxiv/z9c28

Version of the report: 5

Author's Reply, 13 Nov 2024


Dear Dr Edwards and Dr Dienes,

Thank you for your feedback and for the positive reviews. I’m pleased to hear that the reviewers support the acceptance of the Stage 2 submission.

I’ve addressed your final request and added the date to the relevant footnotes on Pages 7 and 13, as specified. 

Please let me know if there is anything further required.

Best regards,

Elena

Decision by Grace Edwards and Zoltan Dienes, posted 12 Nov 2024, validated 12 Nov 2024

Dear Dr. Karakashevska,

We have received three positive reviews that recommend acceptance and commend your ambitious project.

We want to note that Dr. Rousselet’s comment on normality assumptions is on point; however, as the Stage 1 was approved with the statements regarding testing for normality, there is nothing for you to edit with regard to normality testing.

We have one final request before we move to accept your Stage 2: could you add a date to your footnotes indicating when we agreed on the minor edits of the Stage 1 content? Specifically, we’re referring to the wording of how the SPN was calculated on Page 7 (the difference between symmetry versus asymmetry waves rather than the originally reported symmetry versus random waves) and the use of the dominant eye rather than the left eye on Page 13.

Once the edits have been resubmitted we will move forward with the acceptance of your Stage 2.

Best,

Grace & Zoltan

Reviewed by anonymous reviewer 1, 23 Oct 2024

In the manuscript entitled "Putting things into perspective: Which visual cues facilitate automatic extraretinal symmetry representation?", by Elena Karakashevska, Marco Bertamini, and Alexis D. J. Makin, the authors have addressed the issues raised by the reviewers. I do not have any further concerns.

I recommend its acceptance.

Reviewed by Guillaume Rousselet, 09 Oct 2024

Thank you for the detailed reply and engaging with all the queries and comments. The changes are appropriate and have addressed all my points. Congratulations on completing your ambitious project. 

The new figures are great and will help set higher standards in the field. This is a very useful contribution that should help widen the impact of your work.

Also great job with the equivalence testing -- that's really the way to do it given how you carefully thought about a minimum effect size of interest for your design. That will be another reason for me to cite your work as an example of good practice.

About the normality assumptions: In practice, nothing we measure in psychology and neuroscience is normally distributed, but the extent to which this is a problem when applying parametric tests is an empirical issue. There are plenty of relatively simple alternatives to t-tests on means:
https://currentprotocols.onlinelibrary.wiley.com/doi/full/10.1002/cpz1.719
For instance, by default for ERP data I would recommend making inferences about 20% trimmed means, to decrease the impact of the tails, which could be contaminated by outliers. But ultimately mixed-effects models will become the norm.
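As an illustration of the trimmed-means suggestion above, a 20% trimmed-mean comparison can be run with standard tools. This is a minimal sketch, assuming SciPy; the condition labels and simulated amplitudes are hypothetical, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant SPN amplitudes (µV) for two conditions
frontoparallel = rng.normal(-1.5, 0.8, size=40)
slanted = rng.normal(-1.0, 0.8, size=40)
slanted[:3] += 4.0  # simulate outlier contamination in one tail

# A 20% trimmed mean discards the most extreme 20% of values in each tail
tm_front = stats.trim_mean(frontoparallel, proportiontocut=0.2)
tm_slant = stats.trim_mean(slanted, proportiontocut=0.2)

# Yuen's t-test on 20% trimmed means (the `trim` argument of ttest_ind)
t, p = stats.ttest_ind(frontoparallel, slanted, trim=0.2)
print(f"trimmed means: {tm_front:.2f} vs {tm_slant:.2f}, t = {t:.2f}, p = {p:.3f}")
```

Note this sketch uses an independent-samples comparison for simplicity; for within-subject SPN contrasts one would instead trim the distribution of per-participant difference scores.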

Reviewed by anonymous reviewer 2, 03 Nov 2024

I have no further suggestions.  The manuscript does an excellent job motivating the experiment, the methods and results are described very clearly and comprehensively, and the conclusions are appropriate.

Evaluation round #3

DOI or URL of the report: https://osf.io/preprints/psyarxiv/z9c28

Version of the report: version marked post recommenders

Author's Reply, 07 Oct 2024


Please find the marked changes file on OSF here https://osf.io/pqdf2 as it was too large to upload below. 

Decision by Grace Edwards and Zoltan Dienes, posted 23 Aug 2024, validated 26 Aug 2024

Dear Dr. Karakashevska,

We have received reviews from two of our original reviewers and one new reviewer. Each reviewer carefully examined the Stage 2 submission against the Stage 1 IPA and felt the authors adhered to the original study design from the Stage 1 well.

Dr. Guillaume Rousselet had some suggestions on the reporting of the data. Specifically, Dr. Rousselet requested that the behavioral results reported on Page 21 be plotted using scatterplots, revealing the distribution of individual effects. He provided an editorial to illustrate graphical representations of such data. For Figures 8, 10, and 11, Dr. Rousselet also requested that the authors plot the individual-subject data, revealing the data in its entirety. We agree that the variability in the data is not currently illustrated and support this suggestion. Dr. Rousselet has outlined suggestions for how this might be done for each figure. Finally, we also wanted to address Dr. Rousselet’s suggestion of removing the inferential tests in the exploratory analysis section. At PCI-RR it is standard that inferential tests are included in the exploratory analyses. Their evidential weakness compared to the pre-registered analyses is made apparent by being listed explicitly as exploratory. Further, the abstract and discussion of the manuscript should focus on the preregistered analyses.

Anonymous reviewers 1 and 2 were content with the authors’ representation of their findings and provided some grammatical suggestions. Anonymous reviewer 1 also requested that the authors revisit the paragraph starting “Given the current results…” on Page 27 to clarify their conclusion about the existence/absence of extraretinal representations and how it relates to the data in the Stage 2.

Please submit a point-by-point reply to our reviewers and revise your manuscript accordingly.

Best,

Grace & Zoltan

Reviewed by anonymous reviewer 1, 12 Jul 2024

In the manuscript entitled "Putting things into perspective: Which visual cues facilitate automatic extraretinal symmetry representation?", by Elena Karakashevska, Marco Bertamini, and Alexis D. J. Makin, the authors have addressed the issues raised by the reviewers.
This manuscript is a Stage 2 registered report. The authors collected data, analyzed the data, and reported the results following the registered procedure. They also conducted some additional exploratory analyses; these are well reasoned and are clearly separated from the planned analyses in the manuscript.

I also found that the reported results and the authors’ discussion on the results are interesting. I only have a few very-minor issues.

 

P. 20. > In each block, we obtained obtain …

I think "obtained obtain" is a typo.

 

P. 22. Figure 8.

The description of the left two columns of the figure is missing.

 

P. 25. > … differences significantly less that our …

Less than?

 

P. 27. > Given the current results, we considered whether the brain …

This paragraph is not very clear. It would be great if the authors could revise it to fill the leaps of logic. From which part of the reported results did the authors infer the existence/absence of “extraretinal representations” in the visual system? The relationship between the ecological approach and the invariants is missing. The sentence about the “perceptual experiences check” is important for the conclusion of this paragraph, but it is not very clear.

Reviewed by Guillaume Rousselet, 06 Aug 2024

Reviewed by anonymous reviewer 2, 19 Jul 2024

Past work suggests that frontoparallel symmetry is detected automatically (an ERP difference [the 'Sustained Posterior Negativity', or SPN] between symmetric and asymmetric displays is observed both when symmetry is task-relevant, and when symmetry is task-irrelevant).  On the other hand, when the same planar displays are rotated so that they are seen from an oblique perspective angle, this 'extraretinal symmetry' is not detected automatically (an SPN ERP difference is observed only when symmetry is task-relevant).  However, given the importance and apparent effortlessness of detecting extraretinal symmetry, the authors ran a high-powered study to test how adding different depth cues might aid the recovery of task-irrelevant extraretinal symmetry, as indexed by the SPN.


Specifically, the authors predicted that the perspective cost (difference in SPN for frontoparallel and extraretinal displays) would be greatest when displays were viewed binocularly,  and lower in blocks of trials in which the displays were viewed (1) monocularly (which reduces conflicting disparity cues), (2) in a surrounding frame (which provides additional perspective cues), and (3) in a moving frame (providing motion parallax depth cues prior to the onset of the pattern).  Given that the moving frame block had the most depth information, they also predicted that in this block the perspective cost will be lower than in the static frame block, and possibly not different from 0 (as determined by one-sided equivalence testing).


Contrary to predictions, the authors observed a statistically equivalent perspective cost in all blocks of trials.  Directly viewed displays produced a larger SPN than obliquely viewed displays in all four blocks (and this difference was of about the same magnitude across all four blocks).


Overall, I was impressed by this proposed work.  The experiment was well motivated by the past literature.  The study design was clear and efficient for answering these research questions.  The  statistical tests were appropriate, and the sample size was well justified.  And the authors carried out the work as promised.  It is surprising that the perspective cost was not reduced by removing conflicting depth cues or by adding depth cues which should have been helpful for detecting extraretinal symmetry.  The authors discuss possible reinterpretations of past work based on their results, and possible manipulations which could be tried in the future (e.g. really rotating the computer monitor).

 


To answer the specific questions posed by the review request:


2A. Yes, the data are able to test the authors' proposed hypotheses by passing the approved outcome-neutral criteria.
 
2B. Yes, the introduction, rationale and stated hypotheses are the same as the approved Stage 1 submission. 
 
2C. Yes, the authors adhered precisely to the registered study procedures.  (In one minor case they departed from registered study procedures, but they mentioned this in a footnote.)

2D. The additional exploratory analyses and visualizations (which start on p. 25) are generally informative. I certainly think it was helpful to confirm the unpredicted null effects with equivalence testing. For each of the correlation matrices, the authors might write a little more about their reason for making it. E.g. for each matrix, was there a different way that it could have looked, which would have suggested a different reason for the hypothesized effects' not occurring?


2E. Yes, the authors' conclusions are justified by the evidence.


Minor comments:


P. 7: Change "within subject’s" to "within subjects"


P. 12, second to last line: Replace "SD = 0.561 t (123)" with "SD = 0.561, t (123)"


P. 13: Change "Similar stimuli have previously been found generate" to "Similar stimuli have previously been found to generate"


P. 17: Change "The participants were first presented with the instructions for the experiment and were informed that task was to classify dot element luminance." to "The participants were first presented with the instructions for the experiment and were informed that the task was to classify dot element luminance."


P. 27: Change "This indicates that they constructed internal visual representation, rather than solely picking up optic invariants." to "This indicates that they constructed internal visual representations, rather than solely picking up optic invariants."


P. 33, last line: Replace "We will referrer" with "We will refer"

Evaluation round #2

DOI or URL of the report: https://osf.io/preprints/psyarxiv/z9c28

Version of the report: 2

Author's Reply, 29 Jun 2024


Dear recommenders, 

Thank you for the invitation to resubmit. We have now replaced the four participants who fell short of the behavioural criteria set, as well as the two participants who had missing behavioural data. All the results have been edited to reflect the replacement of the participants with behavioural data issues. The updated manuscript can be found here: https://osf.io/preprints/psyarxiv/z9c28.

Thank you for the suggestions to move the perceptual experience check to a Supplementary Material section. We have done this and it can now be found in the Supplementary Material folder on the OSF page (https://osf.io/9pmrh/). 

Best, 

 

Elena

Decision by Grace Edwards and Zoltan Dienes, posted 14 Jun 2024, validated 14 Jun 2024

Thank you for addressing our initial comments and editing your Stage 2. Regarding the four participants who fell short of the quality check in the Stage 1, we feel it is important to replace these data to bring you to the 120 participants outlined as your target sample size in the Stage 1. This is a criterion of registered reports and, as stated in your reply, we must uphold the standard.

Thank you also for the clarification regarding the unregistered questions administered to eight of the participants, but we still feel these data should be moved to an appendix and only referenced briefly in the discussion. Without preregistration, a priori hypotheses, and a sample size calculation, these analyses and their discussion should be treated as secondary to the registered analyses.

We invite you to revise and resubmit; with the updated manuscript, we will reach out to our four Stage 1 reviewers.

Best,

Grace & Zoltan

 

Evaluation round #1

DOI or URL of the report: https://osf.io/preprints/psyarxiv/z9c28

Version of the report: 1

Author's Reply, 12 Jun 2024

Decision by Grace Edwards and Zoltan Dienes, posted 07 Jun 2024, validated 07 Jun 2024

Dear Dr. Karakashevska,

Thank you for submitting your Stage 2 manuscript to PCI-RR, we’re looking forward to working with you again! The Stage 2 is well-prepared according to PCI-RR guidelines, however before we send the manuscript out for review, there are a few comments we’d like you to address. Looking forward to hearing from you soon.

Best,

Grace & Zoltan

 

The comments below are in-line with PCI-RR’s review criteria of a Stage 2: 

2A. Whether the data are able to test the authors’ proposed hypotheses (or answer the proposed research question) by passing the approved outcome-neutral criteria, such as absence of floor and ceiling effects or success of positive controls or other quality checks. For the most part you have clearly fulfilled your plans for quality checks; however, the edit on Page 20 referring to the behavioral performance needs to be reinstated as it was written in the IPA. The IPA stated that “Prior to analysis of the ERP data, we will check the behavioural performance on the luminance discrimination task. Any participants whose performance is below 80% on any block will be replaced. Given the results of Karakashevska et al. (forthcoming), it is likely that most participants will be over 95% correct on this task.” The Stage 2 currently states “Prior to analysis of the ERP data, the behavioural performance on the luminance discrimination task was checked to ensure participants were able to do the task without difficulty.” It is not clear from this edit whether you upheld the quality check of the behavioral data outlined in the IPA. It is also not clear from the results section what the variance was within the behavioral data. Did any participant perform below 80% on any block? Furthermore, with the one participant whose behavioral data was not collected, was their data included in the EEG analysis? If their data is included, how did you confirm this participant was engaging with the task in the manner you registered in the IPA?

2B. Whether the introduction, rationale and stated hypotheses (where applicable) are the same as the approved Stage 1 submission. Aside from changes in tense (e.g. future tense to past tense), correction of typographic and grammatical errors, and correction of clear factual errors, the introduction, rationale and hypotheses of the Stage 2 submission seem to have remained identical to those in the approved Stage 1 manuscript. The one change we noted which would justify a footnote is the change in terminology when stating how the SPN will be calculated on Page 7 under Study Aims and Hypotheses (and later in the methods). You previously stated in the Stage 1 IPA that the SPN would be calculated as the difference between the symmetry and random waves; however, you now state in the Stage 2 that the SPN will be calculated as the difference between the symmetry and asymmetry waves. It seems that the stimuli have not been changed from the Stage 1 to the Stage 2, therefore this is likely a terminology change. Please could you add a clarifying footnote accordingly.

2C. Whether the authors adhered precisely to the registered study procedures. This criterion assesses compliance with protocol. Alongside the two issues mentioned in the above comments, the authors should also add a footnote to explain the edit on Page 13 under Apparatus, which states in the Stage 2 that the non-dominant eye will be covered for the monocular condition. In the IPA, it was originally written that the left eye would be covered. As you state under Participants on Page 9 of the IPA manuscript that the preferred sighted eye will be determined for the monocular viewing condition, this edit seems to be the correction of a mistake in the IPA. Please clarify this discrepancy in a footnote.

2D. Where applicable, whether any unregistered exploratory analyses are justified, methodologically sound, and informative. In our opinion, the exploratory analyses seem justified and address questions relevant to the unexpected findings. However, the addition of the extra behavioral experiment to assess the perceptual experience of the participants is beyond the scope of exploratory analyses. At most we suggest that you move this experiment to an appendix, and ensure you only reference it briefly in the discussion with the main conclusion following from the approved registered study.

2E. Whether the authors’ conclusions are justified given the evidence. Please see the comment above.