
Publication


Featured research published by Augusto Esteves.


User Interface Software and Technology | 2015

Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements

Augusto Esteves; Eduardo Velloso; Andreas Bulling; Hans Gellersen

We introduce Orbits, a novel gaze interaction technique that enables hands-free input on smart watches. The technique relies on moving controls to leverage the smooth pursuit movements of the eyes and detect whether, and at which control, the user is looking. In Orbits, controls include targets that move in a circular trajectory on the face of the watch and can be selected by following the desired one with the eyes for a short amount of time. We conducted two user studies to assess the technique's recognition and robustness, which demonstrated that Orbits is robust against false positives triggered by natural eye movements and that it presents a hands-free, high-accuracy way of interacting with smart watches using off-the-shelf devices. Finally, we developed three example interfaces built with Orbits: a music player, a notifications face plate and a missed call menu. Despite relying on moving controls -- very unusual in current HCI interfaces -- these were generally well received by participants in a third and final study.
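
For illustration, a minimal sketch of the matching idea described above, assuming gaze and target positions arrive as synchronized (x, y) samples; the class, window size and threshold below are illustrative, not the authors' implementation:

```python
# Sketch of smooth-pursuit matching: correlate a sliding window of gaze
# samples against the known positions of one orbiting target and fire the
# control once both axes correlate strongly.
from collections import deque
import math
import statistics


def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den if den else 0.0


class OrbitMatcher:
    def __init__(self, window=30, threshold=0.8):
        self.gaze = deque(maxlen=window)    # recent (x, y) gaze estimates
        self.target = deque(maxlen=window)  # matching (x, y) target positions
        self.threshold = threshold

    def update(self, gaze_xy, target_xy):
        """Feed one synchronized sample pair; return True when the control fires."""
        self.gaze.append(gaze_xy)
        self.target.append(target_xy)
        if len(self.gaze) < self.gaze.maxlen:
            return False
        gx, gy = zip(*self.gaze)
        tx, ty = zip(*self.target)
        # Require both axes to correlate so a stationary gaze does not trigger.
        return pearson(gx, tx) > self.threshold and pearson(gy, ty) > self.threshold
```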


Designing Interactive Systems | 2016

AmbiGaze: Direct Control of Ambient Devices by Gaze

Eduardo Velloso; Markus Wirth; Christian Weichel; Augusto Esteves; Hans-Werner Gellersen

Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to provide users with direct control of devices by gaze alone, through smooth pursuit tracking. We propose a design space of ways to expose functionality through movement and illustrate the concept through four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.
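
A hypothetical sketch of how such an environment could expose device functions as animated targets with distinct orbit parameters; all device names and parameter values below are invented for illustration and are not from the paper:

```python
# Each controllable function is represented by an orbiting target whose
# radius, period, phase and direction make it distinguishable by a pursuit
# matcher (such as the OrbitMatcher sketched above).
import math
import time

DEVICES = {
    "lamp_on":   {"radius": 60, "period_s": 2.0, "phase": 0.0,     "direction": +1},
    "lamp_off":  {"radius": 60, "period_s": 2.0, "phase": math.pi, "direction": +1},
    "tv_vol_up": {"radius": 80, "period_s": 3.0, "phase": 0.0,     "direction": -1},
}


def target_position(spec, t, centre=(0.0, 0.0)):
    """Position of an orbiting target at time t (seconds)."""
    angle = spec["phase"] + spec["direction"] * 2 * math.pi * t / spec["period_s"]
    return (centre[0] + spec["radius"] * math.cos(angle),
            centre[1] + spec["radius"] * math.sin(angle))


# Example: sample each target once per frame and feed it, together with the
# current gaze estimate, to a per-target pursuit matcher.
for name, spec in DEVICES.items():
    print(name, target_position(spec, time.time() % spec["period_s"]))
```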


Human Factors in Computing Systems | 2015

Beats: Tapping Gestures for Smart Watches

Ian Oakley; DoYoung Lee; Md. Rasel Islam; Augusto Esteves

Interacting with smartwatches poses new challenges. Although capable of displaying complex content, their extremely small screens poorly match many of the touchscreen interaction techniques dominant on larger mobile devices. Addressing this problem, this paper presents beating gestures, a novel form of input based on pairs of simultaneous or rapidly sequential and overlapping screen taps made by the index and middle finger of one hand. Distinguished simply by their temporal sequence and relative left/right position, these gestures are designed explicitly for the very small screens (approx. 40mm square) of smartwatches and to operate without interfering with regular single-touch input. This paper presents the design of beating gestures and a rigorous empirical study that characterizes how users perform them -- in a mean of 355ms and with an error rate of 5.5%. We also derive thresholds for reliably distinguishing between simultaneous (under 30ms) and sequential (under 400ms) pairs of screen touches or releases. We then present five interface designs and evaluate them in a qualitative study in which users report valuing the speed and ready availability of beating gestures.
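
The reported thresholds suggest a simple classifier for a pair of touch timestamps; the sketch below is an illustration of that logic, not the authors' code:

```python
# Classify a left/right touch pair by its temporal ordering, using the
# 30ms (simultaneous) and 400ms (sequential) thresholds from the paper.
SIMULTANEOUS_MS = 30   # touches closer than this count as simultaneous
SEQUENTIAL_MS = 400    # touches further apart than this are unrelated taps


def classify_pair(t_left_ms, t_right_ms):
    """Classify a pair of touch timestamps (milliseconds)."""
    delta = t_right_ms - t_left_ms
    if abs(delta) <= SIMULTANEOUS_MS:
        return "simultaneous"
    if abs(delta) <= SEQUENTIAL_MS:
        return "left-then-right" if delta > 0 else "right-then-left"
    return "unrelated taps"


assert classify_pair(1000, 1020) == "simultaneous"
assert classify_pair(1000, 1250) == "left-then-right"
assert classify_pair(1400, 1000) == "right-then-left"
assert classify_pair(1000, 1600) == "unrelated taps"
```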


Ubiquitous Computing | 2016

TraceMatch: a computer vision technique for user input by tracing of animated controls

Christopher Clarke; Alessio Bellino; Augusto Esteves; Eduardo Velloso; Hans-Werner Gellersen

Recent works have explored the concept of movement correlation interfaces, in which moving objects can be selected by matching the movement of the input device to that of the desired object. Previous techniques relied on a single modality (e.g. gaze or mid-air gestures) and specific hardware to issue commands. TraceMatch is a computer vision technique that enables input by movement correlation while abstracting from any particular input modality. The technique relies only on a conventional webcam to enable users to produce matching gestures with any given body parts, even whilst holding objects. We describe an implementation of the technique for acquisition of orbiting targets, evaluate algorithm performance for different target sizes and frequencies, and demonstrate use of the technique for remote control of graphical as well as physical objects with different body parts.
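
A rough reconstruction of such a pipeline using OpenCV, assuming grayscale frames and a known target trajectory; the paper's actual algorithm and parameters may differ:

```python
# Track image features with Lucas-Kanade optical flow and correlate each
# feature's trajectory with the orbiting control's trajectory.
import cv2
import numpy as np


def feature_trajectories(frames, max_corners=50):
    """Per-feature (x, y) trajectories across a list of grayscale frames."""
    prev = frames[0]
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=10)
    trajs = [[tuple(p.ravel())] for p in pts]
    for frame in frames[1:]:
        pts, status, _ = cv2.calcOpticalFlowPyrLK(prev, frame, pts, None)
        for traj, p, ok in zip(trajs, pts, status.ravel()):
            if ok:
                traj.append(tuple(p.ravel()))
        prev = frame
    return trajs


def correlates_with(traj, target_traj, threshold=0.8):
    """True if a feature trajectory matches the control's orbit on both axes."""
    n = min(len(traj), len(target_traj))
    fx, fy = np.array(traj[:n]).T
    tx, ty = np.array(target_traj[:n]).T
    return (np.corrcoef(fx, tx)[0, 1] > threshold and
            np.corrcoef(fy, ty)[0, 1] > threshold)
```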


ACM Transactions on Computer-Human Interaction | 2017

Motion Correlation: Selecting Objects by Matching Their Movement

Eduardo Velloso; Marcus Carter; Joshua Newn; Augusto Esteves; Christopher Clarke; Hans Gellersen

Selection is a canonical task in user interfaces, commonly supported by presenting objects for acquisition by pointing. In this article, we consider motion correlation as an alternative for selection. The principle is to represent available objects by motion in the interface, have users identify a target by mimicking its specific motion, and use the correlation between the system's output and the user's input to determine the selection. The resulting interaction has compelling properties, as users are guided by motion feedback and only need to copy a presented motion. Motion correlation has been explored in earlier work but has only recently begun to feature in holistic interface designs. We provide a first comprehensive review of the principle, and present an analysis of five previously published works in which motion correlation underpinned the design of novel gaze and gesture interfaces for diverse application contexts. We derive guidelines for motion correlation algorithms, motion feedback, choice of modalities, and the overall design of motion correlation interfaces, and identify opportunities and challenges for future research and design.
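
The principle can be stated compactly as correlating the user's input trace with each object's motion trace and selecting the best match above a threshold; the following sketch is illustrative, with the window size and threshold left as free parameters:

```python
# Generic motion-correlation selector over several moving candidate objects.
import numpy as np


def select_by_motion(input_trace, object_traces, threshold=0.8):
    """input_trace: (N, 2) array of user input positions.
    object_traces: dict of name -> (N, 2) array of object positions.
    Returns the name of the best-matching object above the threshold, or None."""
    best_name, best_score = None, threshold
    ix, iy = np.asarray(input_trace, dtype=float).T
    for name, trace in object_traces.items():
        ox, oy = np.asarray(trace, dtype=float).T
        # Score by the weaker of the two per-axis correlations.
        score = min(np.corrcoef(ix, ox)[0, 1], np.corrcoef(iy, oy)[0, 1])
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```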


International Symposium on Wearable Computers | 2015

Orbits: enabling gaze interaction in smart watches using moving targets

Augusto Esteves; Eduardo Velloso; Andreas Bulling; Hans-Werner Gellersen

In this paper we demonstrate Orbits, a novel gaze interaction technique that accounts for both the reduced size of smart watch displays and the hands-free nature of conventional watches. Orbits combines graphical controls that display one or multiple targets moving on a circular path, with input that is provided by users as they follow any of the targets briefly with their eyes. This gaze input triggers the functionality associated with the followed target -- be it answering a call, playing a song or managing multiple notifications.


Tangible and Embedded Interaction | 2013

Supporting offline activities on interactive surfaces

Augusto Esteves; Michelle Scott; Ian Oakley

This paper argues that inherent support for offline activities -- activities that are not sensed by the system -- is one of the strongest benefits of tangible interaction over more traditional interface paradigms. By conducting two studies with single and paired users on a simple tangible tabletop scheduling application, this paper explores how tabletop interfaces could be designed to better support such offline activities. To focus its exploration, it looks at offline activities in terms of how they support cognitive work, such as aiding exploration of problem spaces or lowering task complexity. This paper concludes with insights relating to the form, size, and location of spaces that afford offline actions, and also to the design of tangible tokens themselves.


User Interface Software and Technology | 2017

SmoothMoves: Smooth Pursuits Head Movements for Augmented Reality

Augusto Esteves; David Verweij; Liza Suraiya; Rasel Islam; Youryang Lee; Ian Oakley

SmoothMoves is an interaction technique for augmented reality (AR) based on smooth pursuits head movements. It works by computing correlations between the movements of on-screen targets and the user's head while tracking those targets. The paper presents three studies. The first suggests that head-based input can act as an easier and more affordable surrogate for eye-based input in many smooth pursuits interface designs. A follow-up study grounds the technique in the domain of augmented reality, and captures the error rates and acquisition times on different types of AR devices: head-mounted (2.6%, 1965ms) and hand-held (4.9%, 2089ms). Finally, the paper presents an interactive lighting system prototype that demonstrates the benefits of using smooth pursuits head movements in interaction with AR interfaces. A final qualitative study reports on positive feedback regarding the technique's suitability for this scenario. Together, these results show that SmoothMoves is viable, efficient and immediately available for a wide range of wearable devices that feature embedded motion sensing.
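
Because Pearson correlation is invariant to linear scaling, a head-orientation trace can be matched against target motion in the same way as a gaze trace; a small sketch under that assumption (the variable names and the degrees-to-pixels gain are mine, not from the paper):

```python
# Map head (yaw, pitch) samples into a 2D trace and correlate it with a
# target trajectory, reusing the matching idea from the sketches above.
import numpy as np


def head_trace_from_orientation(yaw_pitch_deg, gain=10.0):
    """Convert (yaw, pitch) samples in degrees to 2D displacements.
    The gain is arbitrary: correlation is unaffected by linear scaling."""
    return np.asarray(yaw_pitch_deg, dtype=float) * gain


def matches_target(head_trace, target_trace, threshold=0.8):
    hx, hy = np.asarray(head_trace, dtype=float).T
    tx, ty = np.asarray(target_trace, dtype=float).T
    return (np.corrcoef(hx, tx)[0, 1] > threshold and
            np.corrcoef(hy, ty)[0, 1] > threshold)
```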


Human Factors in Computing Systems | 2011

Informing design by recording tangible interaction

Augusto Esteves; Ian Oakley

Evaluating tangible user interfaces is challenging. Despite the wealth of research describing the design of tangible systems, there is little empirical evidence highlighting the benefits they can confer. This paper presents a toolkit that logs the manipulation of tangible objects as a step towards creating specific empirical methods for the study of tangible systems. The paper argues that the data derived from the toolkit can be used in three ways. Firstly, to compare tangible interaction with other interaction paradigms. Secondly, to compare among different tangible interfaces performing the same tasks. Thirdly, via integration into a structured design process. This paper focuses on this last topic and discusses how detailed data regarding object use could be integrated into classifications and frameworks such as Shaer et al.'s TAC paradigm.
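
As an illustration only, one possible shape for such a manipulation log; the toolkit's actual schema is not described in this abstract:

```python
# Hypothetical event record for logged manipulations of tangible tokens.
from dataclasses import dataclass, field
import time


@dataclass
class TokenEvent:
    token_id: str          # which tangible object was handled
    event: str             # e.g. "placed", "moved", "lifted", "rotated"
    x: float               # position on the interactive surface (normalized)
    y: float
    timestamp: float = field(default_factory=time.time)


log: list[TokenEvent] = []
log.append(TokenEvent("token_3", "placed", 0.42, 0.17))
log.append(TokenEvent("token_3", "moved", 0.58, 0.22))
# Aggregates such as per-token handling time or move counts could then be
# compared across interface conditions or fed into a design framework.
```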


Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies | 2017

Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch

Christopher Clarke; Alessio Bellino; Augusto Esteves; Hans-Werner Gellersen

In this work we consider how users can use body movement for remote control with minimal effort and maximum flexibility. TraceMatch is a novel technique where the interface displays available controls as circular widgets with orbiting targets, and where users can trigger a control by mimicking the displayed motion. The technique uses computer vision to detect circular motion as a uniform type of input, but is highly appropriable as users can produce matching motion with any part of their body. We present three studies that investigate input performance with different parts of the body, user preferences, and spontaneous choice of movements for input in realistic application scenarios. The results show that users can provide effective input with their head, hands and while holding objects, that multiple controls can be effectively distinguished by the difference in presented phase and direction of movement, and that users choose and switch modes of input seamlessly.
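
A tiny numeric illustration, not taken from the paper, of why direction alone can separate two otherwise identical orbits: reversing the orbit direction negates one axis of the trajectory, which flips the sign of the correlation on that axis:

```python
# Two targets share radius, period and phase but orbit in opposite directions.
import numpy as np

t = np.linspace(0, 2 * np.pi, 60)
cw_y = -np.sin(t)    # clockwise target, y component
ccw_y = np.sin(t)    # counter-clockwise target, y component
user_y = cw_y + 0.05 * np.random.randn(60)  # user mimics the clockwise target

print(np.corrcoef(user_y, cw_y)[0, 1])   # close to +1: the followed target matches
print(np.corrcoef(user_y, ccw_y)[0, 1])  # close to -1: the other target is rejected
```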

Collaboration


Dive into Augusto Esteves's collaborations.

Top Co-Authors

Ian Oakley
Ulsan National Institute of Science and Technology

Saskia Bakker
Eindhoven University of Technology

David Verweij
Eindhoven University of Technology

Filipe Quintal
Madeira Interactive Technologies Institute

Vassilis-Javed Khan
Eindhoven University of Technology

Joshua Newn
University of Melbourne