Alix Goguey
University of Saskatchewan
Publications
Featured research published by Alix Goguey.
User Interface Software and Technology | 2017
Damien Masson; Alix Goguey; Sylvain Malacria; Géry Casiez
HCI researchers lack low-latency, robust systems to support the design and development of interaction techniques using finger identification. We developed a low-cost prototype using piezo-based vibration sensors attached to each finger. By combining the events from an input device with the information from the vibration sensors, we demonstrate how to achieve low-latency, robust finger identification. Our prototype was evaluated in a controlled experiment using two keyboards and a touchpad, showing recognition rates of 98.2% for the keyboard and, for the touchpad, 99.7% for single touches and 94.7% for two simultaneous touches. These results were confirmed in an additional laboratory-style experiment with ecologically valid tasks. Lastly, we present new interaction techniques made possible by this technology.
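The abstract does not describe the fusion algorithm itself. As a purely hypothetical sketch of the general idea, a contact reported by the input device could be attributed to the finger whose vibration sensor spiked closest in time, within a small latency window (function name, event format, and window size are all assumptions, not the authors' method):

```python
# Hypothetical sketch: attribute an input-device contact to the finger whose
# vibration sensor spiked nearest in time, within a tolerance window.
def identify_finger(touch_time_ms, vibration_events, window_ms=30):
    """vibration_events: dict mapping finger name -> list of spike timestamps (ms).
    Returns the best-matching finger, or None if no spike falls in the window."""
    best_finger, best_delta = None, window_ms
    for finger, spikes in vibration_events.items():
        for t in spikes:
            delta = abs(touch_time_ms - t)
            if delta <= best_delta:
                best_finger, best_delta = finger, delta
    return best_finger
```

In practice a real system would also need per-sensor calibration and debouncing; this sketch only illustrates the nearest-spike matching step.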
User Interface Software and Technology | 2014
Alix Goguey; Géry Casiez; Daniel Vogel; Fanny Chevalier; Thomas Pietrzak; Nicolas Roussel
Identifying which fingers are in contact with a multi-touch surface provides a very large input space that can be leveraged for command selection. However, the numerous possibilities enabled by such a vast space come at the cost of discoverability. To alleviate this problem, we introduce a three-step interaction pattern inspired by hotkeys that also supports feed-forward. We illustrate this interaction with three applications that allowed us to explore and adapt it in different contexts.
International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2017
Alix Goguey; Daniel Vogel; Fanny Chevalier; Thomas Pietrzak; Nicolas Roussel; Géry Casiez
Identifying which fingers are touching a multi-touch surface provides a very large input space. We describe FingerCuts, an interaction technique inspired by desktop keyboard shortcuts to exploit this potential. FingerCuts enables integrated command selection and parameter manipulation, uses feed-forward and feedback to increase discoverability, is backward compatible with current touch input techniques, and is adaptable to different touch device form factors. We implemented three variations of FingerCuts, each tailored to a different device form factor: tabletop, tablet, and smartphone. Qualitative and quantitative studies conducted on the tabletop suggest that, with some practice, FingerCuts is expressive, easy to use, and increases a sense of continuous interaction flow, and that interaction with FingerCuts is as fast as, or faster than, using a graphical user interface. A theoretical analysis of FingerCuts using the Fingerstroke-Level Model (FLM) matches our quantitative study results, justifying our use of FLM to analyse and validate performance for the other device form factors.
Interaction Homme-Machine (IHM) | 2014
Alix Goguey; Géry Casiez; Thomas Pietrzak; Daniel Vogel; Nicolas Roussel
Hotkeys are a critical factor in expert users' performance in WIMP interfaces. Multi-touch interfaces, by contrast, do not provide such efficient command shortcuts. We propose Adoiraccourcix, which leverages finger identification to introduce quick command invocation integrated with direct manipulation in this context. After presenting the underlying concept, we illustrate Adoiraccourcix in a vector drawing application and report preliminary user studies comparing it to classical user interfaces. Results suggest that, once mastered, Adoiraccourcix provides a very powerful means of interaction.
Symposium on Spatial User Interaction | 2017
Md. Sami Uddin; Carl Gutwin; Alix Goguey
Linear interface controllers such as sliders and scrollbars are primary tools for navigating through linear content such as videos or text documents. Linear control widgets provide an abstract representation of the entire document in the body of the widget, in that they map each document location to a different position of the slider knob or scroll thumb. In most cases, however, these linear mappings are visually undifferentiated - all locations in the widget look the same - and so it can be difficult to build up spatial knowledge of the document, and difficult to navigate back to locations that the user has already visited. In this paper, we examine a technique that can address this problem: artificial landmarks that are added to a linear control widget in order to improve spatial understanding and revisitation. We carried out a study with two types of content (a video, and a PDF document) to test the effects of adding artificial landmarks. We compared standard widgets (with no landmarks) to two augmented designs: one that placed arbitrary abstract icons in the body of the widget, and one that added thumbnails extracted from the document. We found that for both kinds of content, adding artificial landmarks significantly improved revisitation performance and user preference, with the thumbnail landmarks fastest and most accurate in both cases. Our study demonstrates that augmenting linear control widgets with artificial landmarks can provide substantial benefits for document navigation.
Human Factors in Computing Systems | 2018
Alix Goguey; Géry Casiez; Daniel Vogel; Carl Gutwin
Atomic interactions in touch interfaces, like tap, drag, and flick, are well understood in terms of interaction design, but less is known about their physical performance characteristics. We carried out a study to gather baseline data about finger pitch and roll orientation during atomic touch input actions. Our results show differences in orientation and range for different fingers, hands, and actions, and we analyse the effect of tablet angle. Our data provides designers and researchers with a new resource to better understand what interactions are possible in different settings (e.g. when using the left or right hand), to design novel interaction techniques that use orientation as input (e.g. using finger tilt as an implicit mode), and to determine whether new sensing techniques are feasible (e.g. using fingerprints for identifying specific finger touches).
Symposium on Spatial User Interaction | 2017
Md. Sami Uddin; Carl Gutwin; Alix Goguey
Revisiting locations within a linear document (e.g., a video or PDF) is a common and important task that requires both recall and precision from the user. Modern linear control widgets (e.g., sliders and scrollbars) provide various navigational features, including an abstract mapping of the entire document in the body of the widget. These linear mappings, however, are visually undifferentiated and often make revisitation difficult. We present two designs augmented with artificial landmarks: one that places arbitrary abstract icons in the body of the widget, and one that adds thumbnails extracted from the document; we tested both with two types of content (a video and a PDF document). Our findings demonstrate that augmenting linear control widgets with artificial landmarks can provide substantial benefits for document revisitation and navigation.
International Conference on Human-Computer Interaction | 2015
Alix Goguey; Julie Wagner; Géry Casiez
In spite of previous work showing the importance of understanding users' strategies when performing tasks, i.e. the order in which users perform actions on objects using commands, HCI researchers evaluating and comparing interaction techniques remain mainly focused on performance (e.g. time, error rate). This can be explained to some extent by the difficulty of characterizing such strategies. We propose metrics to quantify whether an interaction technique induces a rather object- or command-oriented task strategy, depending on whether users favor completing all actions on an object before moving to the next one or, in contrast, are reluctant to switch between commands. On an interactive surface, we compared Fixed Palette and Toolglass with two novel techniques that take advantage of finger identification technology: Fixed Palette using Finger Identification and Finger Palette. We validated our metrics against previous results on both existing techniques. With the novel techniques, we found that (1) minimizing the physical movement required to switch tools does not necessarily lead to more object-oriented strategies and (2) increased cognitive load to access commands can lead to command-oriented strategies.
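The paper's actual metrics are not given in the abstract. As an illustrative sketch only, assuming an action log of (object, command) pairs, counting object switches versus command switches between consecutive actions captures the distinction described above (the function and log format are assumptions, not the authors' metrics):

```python
# Hypothetical sketch of a strategy indicator over an action log.
def strategy_indicator(action_log):
    """action_log: list of (object_id, command_id) tuples in execution order.
    Returns (object_switches, command_switches) between consecutive actions.
    Few object switches with many command switches suggests an object-oriented
    strategy (finish one object before moving on); the reverse suggests a
    command-oriented strategy (apply one command across many objects)."""
    obj_sw = sum(1 for a, b in zip(action_log, action_log[1:]) if a[0] != b[0])
    cmd_sw = sum(1 for a, b in zip(action_log, action_log[1:]) if a[1] != b[1])
    return obj_sw, cmd_sw
```

For example, a log that finishes each object before moving on yields few object switches, while applying one command to every object before changing commands yields few command switches.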
International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2019
Andy Cockburn; D. Masson; Carl Gutwin; Philippe A. Palanque; Alix Goguey; Marcus Yung; C. Gris; Catherine Trask
Incorporating touchscreen interaction into cockpit flight systems offers several potential advantages to aircraft manufacturers, airlines, and pilots. However, vibration and turbulence are challenges to reliable interaction. We examine the design space of braced touch interaction, which allows users to mechanically stabilise selections by bracing multiple fingers on the touchscreen before completing a selection. Our goal is to enable fast and accurate target selection during high levels of vibration, without impeding interaction performance when vibration is absent. Three variant methods of braced touch are evaluated, using double-tap, dwell, or a force threshold in combination with heuristic selection criteria to discriminate intentional selections from concurrent braced contacts. We carried out an experiment to test the performance of these methods in both abstract selection tasks and more realistic flight tasks. The study results confirm that bracing improves performance during vibration, and show that double-tap was the best of the tested methods.
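The selection heuristics themselves are not specified in the abstract. As a hypothetical sketch of the double-tap idea, two contacts close in both time and space can be treated as an intentional selection, while bracing fingers, which rest continuously, never produce a second quick down-event (the function, thresholds, and event format are illustrative assumptions):

```python
# Hypothetical sketch: classify the two most recent contact-down events as
# a double-tap selection if they are close in time and space. Braced fingers
# stay down, so they do not generate the second down-event this test needs.
def is_doubletap_selection(taps, max_interval_ms=300, max_dist_px=20):
    """taps: list of (time_ms, x, y) contact-down events, in time order."""
    if len(taps) < 2:
        return False
    (t1, x1, y1), (t2, x2, y2) = taps[-2], taps[-1]
    return (t2 - t1 <= max_interval_ms
            and abs(x2 - x1) <= max_dist_px
            and abs(y2 - y1) <= max_dist_px)
```

A real cockpit implementation would need additional criteria (e.g. rejecting taps that land on a braced contact), which this sketch omits.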
Human Factors in Computing Systems | 2018
Alix Goguey; Géry Casiez; Andy Cockburn; Carl Gutwin
Touch interactions are now ubiquitous, but few tools are available to help designers quickly prototype touch interfaces and predict their performance. For rapid prototyping, most applications only support visual design. For predictive modelling, tools such as CogTool generate performance predictions but do not represent touch actions natively and do not allow exploration of different usage contexts. To combine the benefits of rapid visual design tools with underlying predictive models, we developed the Storyboard Empirical Modelling tool (StEM) for exploring and predicting user performance with touch interfaces. StEM provides performance models for mainstream touch actions, based on a large corpus of realistic data. Our tool provides new capabilities for exploring and predicting touch performance, even in the early stages of design. This is the demonstration accompanying our full paper.
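StEM's fitted models are not given in the abstract. As a generic illustration of the kind of prediction such a tool makes, a Fitts'-law style estimate of tap time from target distance and width might look like the following (the coefficients here are arbitrary placeholders, not StEM's corpus-derived values):

```python
import math

# Illustrative Fitts'-law style prediction of tap time. The coefficients
# a (intercept, s) and b (slope, s/bit) are placeholder values, not the
# parameters StEM derives from its touch-interaction corpus.
def predicted_tap_time(distance_mm, width_mm, a=0.2, b=0.15):
    """Predicted movement time (s) using the Shannon formulation of Fitts' law:
    MT = a + b * log2(D / W + 1)."""
    return a + b * math.log2(distance_mm / width_mm + 1)
```

A modelling tool would fit a and b per action type (tap, drag, flick) and per usage context from empirical data rather than using fixed constants.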