TY - JOUR
T1 - Ordering effects in discrete choice experiments: a systematic literature review across domains
AU - Boxebeld, Sander
N1 - Publisher Copyright: © 2024
PY - 2024/6
Y1 - 2024/6
N2 - Discrete choice experiments (DCEs) are increasingly used in several scientific domains. Since their results may inform governmental decision-making, it is important that the validity of the method is continuously scrutinized. An often-studied design artefact is the impact of the presentation order of alternatives, attributes, and choice sets on the results of a DCE. No systematic review of the literature on ordering effects existed until now, and many applied studies using a DCE do not explicitly consider the role of ordering effects. In this study, I conducted a systematic review of the literature on ordering effects. Using a three-step snowball sampling strategy, 85 studies were identified across various scientific domains. The majority of included studies documented statistically significant ordering effects. Alternative and attribute ordering effects are primarily caused by lexicographic behaviours, while choice set ordering effects seem to be caused by respondent learning, fatigue, or anchoring. Although ordering effects may not always occur, the fact that the majority of studies found statistically significant effects warrants the use of mitigation methods. An overview of potential mitigation methods for the applied DCE literature is presented, including randomization of presentation orders, advance disclosure of DCE core elements, and inclusion of alternative-specific constants (ASCs), attribute level overlap, and an instructional choice set (ICS). Finally, several directions for future methodological research on this topic are provided, particularly regarding heterogeneity in ordering effects by study design traits and respondent characteristics, and interactions between ordering effects. Insights into these aspects would further our understanding of respondents’ processing of DCEs.
UR - http://www.scopus.com/inward/record.url?scp=85190446640&partnerID=8YFLogxK
U2 - 10.1016/j.jocm.2024.100489
DO - 10.1016/j.jocm.2024.100489
M3 - Review article
AN - SCOPUS:85190446640
SN - 1755-5345
VL - 51
JO - Journal of Choice Modelling
JF - Journal of Choice Modelling
M1 - 100489
ER -