Abstract
Automation reliability and transparency are key factors in trust calibration and, as such, can have distinct effects on human reliance behaviour and mission performance. One question that remains unexplored is: what are the implications of reliability and transparency for trust calibration in human-swarm interaction? We investigate this question in the context of swarm systems, which are becoming more popular for their robustness and versatility. Thirty-two participants performed swarm-based tasks under different reliability and transparency conditions. The results indicate that trust, whether reliability- or transparency-based, is associated with higher reliance rates and shorter response times. Reliability-based trust is negatively correlated with correct rejection rates, while transparency-based trust is positively correlated with these rates. We conclude that reliability and transparency have distinct effects on trust calibration.

Practitioner Summary: Reliability and transparency have distinct effects on trust calibration. Findings from our human experiments suggest that transparency is a necessary design requirement if and when humans need to be involved in the decision loop of human-swarm systems, especially when swarm reliability is high.

Abbreviations: HRI: human-robot interaction; IOS: inter-organisational systems; LMM: linear mixed models; MANOVA: multivariate analysis of variance; UxV: heterogeneous unmanned vehicles; UAV: unmanned aerial vehicle.
Original language | English |
---|---|
Pages (from-to) | 1116-1132 |
Number of pages | 17 |
Journal | Ergonomics |
Volume | 63 |
Issue number | 9 |
DOIs | |
Publication status | Published - 1 September 2020 |
Externally published | Yes |