The ChatGPT Artificial Intelligence Chatbot: How Well Does It Answer Accounting Assessment Questions?

David A. Wood, Muskan P. Achhpilia, Mollie T. Adams, Maximilian Margolin, et al.

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

ChatGPT, a language-learning model chatbot, has garnered considerable attention for its ability to respond to users’ questions. Using data from 14 countries and 186 institutions, we compare ChatGPT and student performance for 28,085 questions from accounting assessments and textbook test banks. As of January 2023, ChatGPT provides correct answers for 56.5 percent of questions and partially correct answers for an additional 9.4 percent of questions. When considering point values for questions, students significantly outperform ChatGPT with a 76.7 percent average on assessments compared to 47.5 percent for ChatGPT if no partial credit is awarded and 56.5 percent if partial credit is awarded. Still, ChatGPT performs better than the student average for 15.8 percent of assessments when we include partial credit. We provide evidence of how ChatGPT performs on different question types, accounting topics, class levels, open/closed assessments, and test bank questions. We also discuss implications for accounting education and research.
Original language: English
Pages (from-to): 81-108
Number of pages: 28
Journal: Issues in Accounting Education
Volume: 38
Issue number: 4
DOIs
Publication status: Published - 2023

Bibliographical note

Publisher Copyright:
© 2023, American Accounting Association. All rights reserved.
