Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2019

Generational differences in trust in digital assistants


Abstract


Human trust in automation has been studied extensively within safety-critical domains (military, aviation, process control, etc.) because harmful consequences are associated with the improper calibration of trust in automated systems in these domains (Parasuraman & Riley, 1997). As such, researchers have worked to identify important factors that help humans build trust in such systems (Hoff & Bashir, 2015). With the explosion of AI in consumer technologies, it is becoming equally critical to understand how humans interact with everyday devices. This study investigated how factors that have been identified to impact trust in automation in safety-critical domains influence the trust and use of popular digital assistants (Siri, Cortana, Bixby, or Google Now). We conducted an online survey with 278 regular users of digital assistants across three generations (GenX, GenY, and GenZ). The results demonstrate that, even after controlling for dispositional factors (i.e., individual characteristics such as age, culture, and gender), GenZ exhibited higher trust in digital assistants than GenX. More interestingly, linear regression analyses revealed a different set of predictors of trust for each generation. Results from this survey have implications for the design of digital assistants.

Volume 63
Pages 206 - 210
DOI 10.1177/1071181319631029
Language English
Journal Proceedings of the Human Factors and Ergonomics Society Annual Meeting