Emily C. Collins
University of Sheffield
Publications
Featured research published by Emily C. Collins.
Human Factors in Computing Systems | 2017
Anna Zamansky; A. L. Roshier; Clara Mancini; Emily C. Collins; Carol Hall; Katie Grillaert; Ann Morrison; Steve North; Hanna Wirman
Animal-Computer Interaction (ACI) is a new and quickly developing discipline, which is closely related to HCI and draws on some of its theoretical frameworks and research methodologies. The first edition of the Workshop on Research Methods in ACI (RM4ACI) was co-located with the Third International Conference on Animal-Computer Interaction, held in Milton Keynes, UK, in November 2016. This paper presents an overview of the workshop, including insights from discussions on some of the challenges the ACI community faces as it works to develop ACI as a discipline, and on important opportunities for cross-fertilization between HCI and ACI that the HCI community could consider.
Conference Towards Autonomous Robotic Systems | 2015
James Law; Jonathan M. Aitken; Luke Boorman; David Cameron; Adriel Chua; Emily C. Collins; Samuel Fernando; Uriel Martinez-Hernandez; Owen McAree
In this paper we describe a novel scenario in which an assistive robot is required to use a lift, and present results from a preliminary investigation into floor determination using readily available information. The aim is to create an assistive robot that can integrate naturally into existing infrastructure.
Advances in Computer Entertainment Technology | 2015
Emily C. Collins; Tony J. Prescott; Ben Mitchinson; Sebastian Conran
Here we present MIRO, a companion robot designed to engage users in science and robotics via edutainment. MIRO is biomimetic in aesthetics, morphology, behaviour, and control architecture. In this paper, we review how these design choices affect its suitability for a companionship role. In particular, we consider how MIRO's emulation of familiar mammalian body language, as one component of a broader biomimetic expressive system, provides effective communication of emotional state and intent. We go on to discuss how these features contribute to MIRO's potential in other domains such as healthcare, education, and research.
Conference Towards Autonomous Robotic Systems | 2016
David Cameron; Samuel Fernando; Abigail Millings; Michael Szollosy; Emily C. Collins; Roger K. Moore; Amanda J. C. Sharkey; Tony J. Prescott
This paper explores three fundamental attributes of the Robokind Zeno-R25 (its status as person or machine, its ‘gender’, and the intensity of its simulated facial expressions) and their impact on children’s perceptions of the robot, using a one-sample study design. Results from a sample of 37 children indicate that the robot is perceived as a mix of person and machine, but also strongly as a male figure. Children could label the emotions of the robot’s simulated facial expressions, but the perceived intensities of these expressions varied. The findings demonstrate the importance of establishing fundamentals of user views towards social robots to support more advanced arguments about social human-robot interaction.
European Conference on Mobile Robots | 2015
Owen McAree; Jonathan M. Aitken; Luke Boorman; David Cameron; Adriel Chua; Emily C. Collins; Samuel Fernando; James Law; Uriel Martinez-Hernandez
Robotic assistants operating in multi-floor buildings are required to use lifts to transition between floors. To reduce the need for environments to be tailored to suit robots, and to make robot assistants more widely applicable, it is desirable that they make use of existing navigational cues and interfaces designed for human users. In this paper, we examine a scenario in which a guide robot uses a lift to transition between floors in a building. We describe an experiment in combining multiple data sources, available to a typical robot with simple sensors, to determine which floor of the building it is on. We show the robustness of this approach in realistic scenarios in a busy working environment.
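The abstract does not spell out how the data sources are combined, but one way to picture fusing several noisy cues into a single floor estimate is a discrete Bayes filter over floor hypotheses. The sketch below is illustrative only: the four-floor building, the sensor cues, and the noise weights are assumptions for the example, not details taken from the paper.

```python
# Illustrative sketch: a discrete Bayes filter over floor hypotheses.
# All sensor models and numbers here are assumptions, not the paper's method.
import numpy as np

N_FLOORS = 4  # assumed building size


def normalise(p):
    return p / p.sum()


def predict(belief, moved_floors):
    """Shift belief by the (noisy) number of floors the lift is thought
    to have moved, leaking some probability to neighbouring floors."""
    new = np.zeros_like(belief)
    for floor, p in enumerate(belief):
        target = floor + moved_floors
        for offset, weight in ((-1, 0.1), (0, 0.8), (1, 0.1)):  # motion noise
            t = target + offset
            if 0 <= t < N_FLOORS:
                new[t] += weight * p
    return normalise(new)


def update(belief, likelihood_per_floor):
    """Weight each floor hypothesis by how well it explains a sensor
    reading, e.g. a barometric altitude or a recognised floor display."""
    return normalise(belief * likelihood_per_floor)


# Example: start on the ground floor, ride the lift up two floors,
# then fuse a hypothetical barometer reading that favours floor 2.
belief = normalise(np.array([1.0, 0.0, 0.0, 0.0]))
belief = predict(belief, moved_floors=2)
barometer_likelihood = np.array([0.05, 0.15, 0.7, 0.1])
belief = update(belief, barometer_likelihood)
print("Most likely floor:", int(belief.argmax()))  # -> 2
```

The appeal of this kind of fusion is that each cue only needs to be roughly informative; the combined posterior can still identify the floor reliably even when any single sensor is ambiguous.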
Conference on Biomimetic and Biohybrid Systems | 2014
Samuel Fernando; Emily C. Collins; Armin Duff; Roger K. Moore; Paul F. M. J. Verschure; Tony J. Prescott
The Expressive Agents for Symbiotic Education and Learning (EASEL) project will explore human-robot symbiotic interaction (HRSI) with the aim of developing an understanding of symbiosis over long-term tutoring interactions. The EASEL system will be built upon an established and neurobiologically grounded architecture, Distributed Adaptive Control (DAC). Here we present the design of an initial experiment in which our facially expressive humanoid robot will interact with children at a public exhibition. We discuss the range of measurements we will employ to explore the effects our robot’s expressive ability has on interaction with children during HRSI, with the aim of contributing optimal robot personality parameters to the final EASEL model.
Conference on Biomimetic and Biohybrid Systems | 2017
David Cameron; Samuel Fernando; Emily C. Collins; Abigail Millings; Michael Szollosy; Roger K. Moore; Amanda J. C. Sharkey; Tony J. Prescott
Social robots are becoming more sophisticated; in many cases they offer complex, autonomous interactions, responsive behaviours, and biomimetic appearances. These features may have a significant impact on how people perceive and engage with robots; young children may be particularly influenced due to their developing ideas of agency. Young children are considered to hold naive beliefs of animacy and a tendency to mis-categorise moving objects as being alive, but, with development, they can demonstrate a biological understanding of animacy. We experimentally explore the impact of children’s age and a humanoid’s movement on children’s perceptions of its animacy.
Conference on Biomimetic and Biohybrid Systems | 2016
David Cameron; Samuel Fernando; Abigail Millings; Michael Szollosy; Emily C. Collins; Roger K. Moore; Amanda J. C. Sharkey; Tony J. Prescott
The Expressive Agents for Symbiotic Education and Learning (EASEL) project explores human-robot symbiotic interaction with the aim of understanding how symbiosis develops over long-term tutoring interactions. The final EASEL system will be built upon the neurobiologically grounded architecture Distributed Adaptive Control (DAC). In this paper, we present the design of an interaction scenario to support development of DAC in the context of a synthetic tutoring assistant. Our humanoid robot, capable of life-like simulated facial expressions, will interact with children in a public setting to teach them about exercise and energy. We discuss the range of measurements used to explore children’s responses during, and experiences of, interaction with a social, expressive robot.
University of Sheffield Engineering Symposium | 2016
Jonathan M. Aitken; Owen McAree; Luke Boorman; David Cameron; Adriel Chua; Emily C. Collins; Samuel Fernando; James Law; Uriel Martinez-Hernandez
This work presents the safety and verification arguments for the development of an autonomous robot platform capable of leading humans around a building. It uses Goal Structuring Notation (GSN) to develop a pattern, a re-usable GSN fragment, that can form part of the safety case for a mobile guide robot interacting with humans, serving to: record the decisions taken during the design phase, ensure safe operation around humans, and identify where mitigation must be introduced.
Conference on Biomimetic and Biohybrid Systems | 2014
Emily C. Collins; Tony J. Prescott
Contemporary robot design is influenced both by task domain (e.g., industrial manipulation versus social interaction) and by classification differences among humans (e.g., therapy patients versus museum visitors). As the breadth of robot use increases, we ask how people will respond to the ever-increasing number of intelligent artefacts in their environment. Using the Paro robot as our case study, we propose an analysis of individual differences in HRI to highlight the consequences that individual characteristics have for robot performance. We discuss to what extent human-human interactions are a useful model of HRI.