Why Are People's Decisions Sometimes Worse with Computer Support?
Alberdi, E., Strigini, L., Povyakalo, A. A. & Ayton, P. (2009). Why are people's decisions sometimes worse with computer support? Computer Safety, Reliability, and Security, Proceedings, 5775, pp. 18-31. doi: 10.1007/978-3-642-04468-7_3
Abstract
In many applications of computerised decision support, a recognised source of undesired outcomes is operators' apparent over-reliance on automation. For instance, an operator may fail to react to a potentially dangerous situation because a computer fails to generate an alarm. However, the very use of terms like "over-reliance" betrays possible misunderstandings of these phenomena and their causes, which may lead to ineffective corrective action (e.g. training or procedures that do not counteract all the causes of the apparently "over-reliant" behaviour). We review relevant literature in the area of "automation bias" and describe the diverse mechanisms that may be involved in human errors when using computer support. We discuss these mechanisms with reference to errors of omission when using "alerting systems", illustrated by novel, counterintuitive findings from our case study in a health-care application and by further examples from the literature.
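The kind of effect the abstract describes can be made concrete with a toy probability model (not taken from the paper; all numbers below are hypothetical). Suppose an operator detects a true event with probability `d_unaided` when working alone. With an alerting system of sensitivity `sensitivity`, detection improves to `d_cued` when the alarm fires, but over-reliance may depress vigilance to `d_uncued` when the alarm stays silent. The sketch shows how support can raise average performance while making the operator worse on exactly the cases the computer misses, which is the pattern of omission errors discussed in the paper.

```python
# Toy sketch of "automation bias" in an alerting system.
# All probabilities are hypothetical illustrative values, not data from the paper.

sensitivity = 0.90   # probability the alerting system flags a true event
d_unaided   = 0.80   # human detection rate with no computer support
d_cued      = 0.95   # human detection rate when the alarm fires
d_uncued    = 0.50   # human detection rate when the alarm is silent (over-reliance)

# Law of total probability over whether the alarm fires:
d_aided = sensitivity * d_cued + (1 - sensitivity) * d_uncued

print(f"overall aided detection:        {d_aided:.3f}")   # 0.905 > 0.800: aid helps on average
print(f"detection on the aid's misses:  {d_uncued:.3f}")  # 0.500 < 0.800: omission errors rise
```

Under these assumed numbers the aid improves the overall detection rate, yet conditional on the computer failing to alarm, the human detects fewer events than they would unaided; averaged figures can therefore hide harm to a subset of cases.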
Publication Type: Article
Publisher Keywords: decision support; computer aided decision making; alerting systems; human-machine diversity; omission errors; automation bias; human intervention; self-confidence; detection task; trust; systems; aids; mammography; warnings; humans
Subjects: B Philosophy. Psychology. Religion > BF Psychology; Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Departments: School of Health & Psychological Sciences > Psychology; School of Science & Technology > Computer Science > Software Reliability