Recommendation

Replicating the "lure of choice" phenomenon

Based on reviews by Hu Chuan-Peng and Gakuto Chiba
A recommendation of:

Lure of choice revisited: Replication and extensions Registered Report of Bown et al. (2003)

Submission: posted 15 November 2023
Recommendation: posted 21 February 2024, validated 26 February 2024
Cite this recommendation as:
Savage, P. (2024) Replicating the "lure of choice" phenomenon. Peer Community in Registered Reports. https://rr.peercommunityin.org/PCIRegisteredReports/articles/rec?id=595

Recommendation

The "lure of choice" refers to the idea that we prefer to preserve the option to choose even when the choice is not helpful. In a classic study cited hundred of times, Bown et al. (2003) reported evidence for the lure of choice from a series of studies involving choices between competing options of night clubs, bank savings accounts, casino spinners, and the Monty Hall door choice paradigm. In all cases, participants tended to prefer to choose an option when paired with a "lure", even when that lure was objectively inferior (e.g., same probability of winning but lower payoff).
 
The lure of choice phenomenon applies to many situations we face in our daily lives, and Bown et al.'s findings have influenced the way organizations present choices to prospective users. Despite their theoretical and practical impact, Bown et al.'s findings have not previously been directly replicated, even as the importance of replication studies has become increasingly acknowledged (Nosek et al., 2022).
 
Here, Chan & Feldman (2024) outline a close replication of Bown et al. (2003) that will replicate and extend their original design. By unifying Bown et al.'s multiple studies into a single paradigm with which they will collect data from approximately 1,000 online participants via Prolific, they will have substantially greater statistical power than the original study to detect the predicted effects. They will follow LeBel et al.’s (2019) criteria for evaluating replicability, such that it will be considered a successful replication depending on how many of the 4 scenarios show a signal in the same direction as Bown et al.’s original results (at least 3 out of 4 scenarios = successful replication; no scenarios = failed replication; 1 or 2 scenarios = mixed results replication). They have also added additional controls including a neutral baseline choice without a lure, further ensuring the the validity and interpretability of their eventual findings.
 
One of the goals in creating Peer Community In Registered Reports (PCI RR) was to increase the availability of publishing venues for replication studies, so PCI RR is well suited to the proposed replication. Feldman's lab has also pioneered the use of PCI RR for direct replications of previous studies (e.g., Zhu & Feldman, 2023), and the current submission uses an open-access template he developed (Feldman, 2023). This experience, combined with PCI RR's efficient scheduled review model, meant that the full Stage 1 protocol progressed from initial submission, through detailed peer review by two experts, to in-principle acceptance (IPA) of the revised submission in less than one month.
 
URL to the preregistered Stage 1 protocol: https://osf.io/8ug9m
 
Level of bias control achieved: Level 6. No part of the data or evidence that will be used to answer the research question yet exists and no part will be generated until after IPA.
 
List of eligible PCI RR-friendly journals:
 
 
References
 
Bown, N. J., Read, D. & Summers, B. (2003). The lure of choice. Journal of Behavioral Decision Making, 16(4), 297–308. https://doi.org/10.1002/bdm.447
 
Chan, A. N. Y. & Feldman, G. (2024). The lure of choice revisited: Replication and extensions Registered Report of Bown et al. (2003) [Stage 1]. In principle acceptance of Version 2 by Peer Community In Registered Reports. https://osf.io/8ug9m
 
Feldman, G. (2023). Registered Report Stage 1 manuscript template. https://doi.org/10.17605/OSF.IO/YQXTP
 
LeBel, E. P., Vanpaemel, W., Cheung, I. & Campbell, L. (2019). A brief guide to evaluate replications. Meta-Psychology, 3. https://doi.org/10.15626/MP.2018.843
 
Nosek, B. A., Hardwicke, T. E., Moshontz, H., Allard, A., Corker, K. S., Dreber, A., ... & Vazire, S. (2022). Replicability, robustness, and reproducibility in psychological science. Annual Review of Psychology, 73(1), 719–748. https://doi.org/10.1146/annurev-psych-020821-114157
 
Zhu, M. & Feldman, G. (2023). Revisiting the links between numeracy and decision making: Replication Registered Report of Peters et al. (2006) with an extension examining confidence. Collabra: Psychology, 9(1). https://doi.org/10.1525/collabra.77608
Conflict of interest:
The recommender in charge of the evaluation of the article and the reviewers declared that they have no conflict of interest (as defined in the code of conduct of PCI) with the authors or with the content of the article.

Evaluation round #1

DOI or URL of the report: https://osf.io/vytde

Version of the report: 1

Author's Reply, 21 Feb 2024


Revised manuscript:  https://osf.io/f48ku

All revised materials uploaded to: https://osf.io/e47jh/ , updated manuscript under sub-directory "PCIRR Stage 1\PCI-RR submission following R&R"

Decision by the recommender, posted 16 Feb 2024, validated 16 Feb 2024

This Stage 1 protocol has now been reviewed by two experts with experience in PCI-RR and open science. Both reviewers are enthusiastic about the study and recommend only relatively minor changes. I agree, and am optimistic that I could recommend in-principle acceptance without further peer review for an appropriately revised version that addresses their concerns. In particular, please ensure that you:

1) are explicit about the limitations on generalizability from your proposed online recruitment, and

2) clarify Figs. 1–3 and the Introduction so that the manuscript summarizes earlier what Bown et al.'s key "nightclub", "bank", and "casino" studies actually involved. (Readers should not have to wait until the Methods section to learn these details for the first time.)

Reviewed by Gakuto Chiba, 12 Feb 2024

Reviewed by Hu Chuan-Peng, 15 Feb 2024

PCI-RR #595 is a close replication of Bown et al. (2003). I highly appreciate the authors' effort to replicate a published paper. I am positive toward this manuscript because the rationale for choosing Bown et al. (2003) as a target study to replicate is sound, the details are rich, and the materials and scripts are all open. My only suggestion is that the authors need to be cautious about the generalizability of the results to populations in non-Western cultures, as their participants will be sampled from online platforms. Below are my answers to the five questions that PCI-RR recommends reviewers address.
 
1A. The scientific validity of the research question(s)
My comment: The research question is scientifically valid and is justified by the original publication and its citations and influence in the field of decision-making.

1B. The logic, rationale, and plausibility of the proposed hypotheses (where a submission proposes hypotheses)
My comment: As a replication, the hypotheses are logically derived from the target study and remain plausible. 

1C. The soundness and feasibility of the methodology and analysis pipeline (including statistical power analysis or alternative sampling plans where applicable)
My comment: The methodology and analysis pipeline are robust and feasible, bolstered by the team’s extensive experience in replicating decision-making studies. Their previous publications and well-maintained resources provide a solid foundation for the current study. The verification of scripts using simulated data further ensures the pipeline’s feasibility.

1D. Whether the clarity and degree of methodological detail is sufficient to closely replicate the proposed study procedures and analysis pipeline and to prevent undisclosed flexibility in the procedures and analyses
My comment: Yes, all materials and scripts are open and available, and they provide sufficient detail for a close replication.

1E. Whether the authors have considered sufficient outcome-neutral conditions (e.g. absence of floor or ceiling effects; positive controls; other quality checks) for ensuring that the obtained results are able to test the stated hypotheses or answer the stated research question(s).
My comment: The authors added baseline conditions as extensions of the original study, which further ensures that the obtained results will effectively test the hypotheses.
 
Signed
Hu Chuan-Peng
