City Research Online

Beware of botshit: How to manage the epistemic risks of generative chatbots

Hannigan, T. R., McCarthy, I. P. & Spicer, A. (2024). Beware of botshit: How to manage the epistemic risks of generative chatbots. Business Horizons, 67(5), pp. 471-486. doi: 10.1016/j.bushor.2024.03.001

Abstract

Advances in large language model (LLM) technology enable chatbots to generate and analyze content for our work. Generative chatbots do this work by predicting responses rather than knowing the meaning of their responses. In other words, chatbots can produce coherent-sounding but inaccurate or fabricated content, referred to as hallucinations. When humans uncritically use this untruthful content, it becomes what we call botshit. This article focuses on how to use chatbots for content generation work while mitigating the associated epistemic risks (i.e., risks to the process of producing knowledge). Drawing on risk management research, we introduce a typology framework that guides how chatbots can be used based on two dimensions: response veracity verifiability and response veracity importance. The framework identifies four modes of chatbot work (authenticated, autonomous, automated, and augmented), each with an associated botshit-related risk (ignorance, miscalibration, routinization, and black boxing). We describe and illustrate each mode and offer advice to help chatbot users guard against the botshit risks that come with each mode.

Publication Type: Article
Additional Information: © 2024. This manuscript version is made available under the CC-BY-NC-ND 4.0 license https://creativecommons.org/licenses/by-nc-nd/4.0/
Publisher Keywords: Chatbots, Bullshit, Botshit, Artificial intelligence, Natural language processing
Subjects: H Social Sciences > HN Social history and conditions. Social problems. Social reform
Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Departments: Bayes Business School
Text - Accepted Version
This document is not freely accessible until 20 March 2027 due to copyright restrictions.
Available under License Creative Commons Attribution Non-commercial No Derivatives.




