Lie for a Dime: When most prescreening responses are honest but most study participants are imposters.

J Chandler, Gabriele Paolacci

Research output: Contribution to journal › Article › Academic › peer-review

126 Citations (Scopus)

Abstract

The Internet has enabled recruitment of large samples with specific characteristics. However, when researchers rely on participant self-report to determine eligibility, data quality depends on participant honesty. Across four studies on Amazon Mechanical Turk, we show that a substantial number of participants misrepresent theoretically relevant characteristics (e.g., demographics, product ownership) to meet eligibility criteria that are stated explicitly in the studies, inferred from a previous exclusion from the study, or inferred from previous experiences with similar studies. When rare populations are recruited, a large proportion of responses can come from impostors. We provide recommendations, applicable to a wide variety of data collection efforts that rely on self-report, for ensuring that ineligible participants are excluded.
Original language: English
Pages (from-to): 500-508
Number of pages: 9
Journal: Social Psychological and Personality Science
Volume: 8
Issue number: 5
DOIs
Publication status: Published - 2017
