
Automation error bias, trust, and dependence behaviors in a simulated drone collision avoidance task

Jackson, Austin
Sato, Tetsuya
Glassman, Jeffrey
Politowicz, Michael S.
Chancey, Eric T.
Yamani, Yusuke
Issue Date
2026
Type
Article
Keywords
Advanced air mobility, Automation dependency, Compliance, Human-automation interaction, Reliability, Reliance, Trust in automation, Warning systems
Citation
Jackson, A., Sato, T., Glassman, J., Politowicz, M. S., Chancey, E. T., & Yamani, Y. (2026). Automation Error Bias, Trust, and Dependence Behaviors in a Simulated Drone Collision Avoidance Task. Human Factors: The Journal of the Human Factors and Ergonomics Society, 0(0).
Abstract
Objective: This experiment examined how the error bias of an imperfect automated decision aid affected trust and dependence behaviors in a simulated drone collision avoidance task.

Background: Prior work on human-automation interaction indicates asymmetrical effects of the two error biases, misses and false alarms, on compliance and reliance. Yet it is unclear whether this asymmetry arises from unbalanced perceptual salience of the automation errors or from operators' trust in the automated system.

Method: Sixty-eight participants performed a drone monitoring task with the assistance of a collision avoidance aid whose error bias varied (miss-prone vs. false-alarm-prone). Participants' automation trust ratings and dependence behaviors (compliance and reliance) were measured.

Results: With the two error types made equally salient, participants showed similar decreases in trust across multiple factors of automation trust when interacting with unreliable aids. Compliance rates were higher with the miss-prone system than with the false-alarm-prone system, whereas reliance rates showed the opposite pattern.

Conclusion: Error bias systematically determines compliance and reliance behaviors. Salience-matched false alarms and misses by automation degrade trust, potentially undermining the development of performance-based trust.

Application: Designers of automated systems should consider how different error types systematically affect dependence behaviors in order to create transparent systems that properly calibrate trust to the capability of the automation. © 2026 Human Factors and Ergonomics Society
Description
Click on the DOI link to access this article at the publisher's website (may not be free).
Publisher
SAGE Publications Inc.
Journal
Human Factors
ISSN
0018-7208