Proceedings of the 1st International Conference on Conversational User Interfaces | 2019

From sex and therapy bots to virtual assistants and tutors: how emotional should artificially intelligent agents be?

 

Abstract


The question of whether intelligent agents should have an emotional capacity has been rehearsed for over 20 years, and in that time the debate has moved in an affirming direction, from "should we?" to "how will we?". Less clear, however, is the process for developing emotional systems: how do we characterise levels of emotion; how do we relate emotion to an agent's intended function; and who should make decisions about emotional sufficiency? Categorising the discussion into emotional detection, emotional intelligence, the ability to emote, and the generation of feelings provides a basic structure against which to consider how central developers and engineers are to decision making about emotional sufficiency in conversational interfaces. It is also essential for empowering a wider community to understand the use (and potential misuse) of emotional capacity in AI.

DOI 10.1145/3342775.3342807
Language English
Journal Proceedings of the 1st International Conference on Conversational User Interfaces