David Ledo
University of Calgary
Publications
Featured research published by David Ledo.
human computer interaction with mobile devices and services | 2012
Sebastian Boring; David Ledo; Xiang ‘Anthony’ Chen; Nicolai Marquardt; Anthony Tang; Saul Greenberg
Modern mobile devices allow a rich set of multi-finger interactions that combine modes into a single fluid act, for example, one finger for panning blending into a two-finger pinch gesture for zooming. Such gestures require the use of both hands: one holding the device while the other is interacting. While on the go, however, only one hand may be available to both hold the device and interact with it. This mostly limits interaction to a single touch (i.e., the thumb), forcing users to switch between input modes explicitly. In this paper, we contribute the Fat Thumb interaction technique, which uses the thumb's contact size as a form of simulated pressure. This adds a degree of freedom, which can be used, for example, to integrate panning and zooming into a single interaction. Contact size determines the mode (i.e., panning with a small size, zooming with a large one), while thumb movement performs the selected mode. We discuss nuances of the Fat Thumb based on the thumb's limited operational range and motor skills when that hand also holds the device. We compared Fat Thumb to three alternative techniques in which people had to precisely pan and zoom to a predefined region on a map, and found that Fat Thumb compared well to the existing techniques.
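The core mechanism can be sketched in a few lines: contact size selects the mode, and thumb movement then performs it. The threshold value, the field names, and the zoom mapping below are illustrative assumptions, not the paper's implementation.

```python
# Sketch of Fat Thumb's mode selection (threshold and mappings are assumptions).
CONTACT_THRESHOLD = 0.6  # normalized contact size separating pan from zoom


def select_mode(contact_size):
    """Small contact -> pan; large, pressed-down ('fat') contact -> zoom."""
    return "zoom" if contact_size >= CONTACT_THRESHOLD else "pan"


def apply_thumb_move(view, contact_size, dx, dy):
    """Update a simple view dict {'x', 'y', 'scale'} from one thumb movement."""
    if select_mode(contact_size) == "pan":
        view["x"] += dx
        view["y"] += dy
    else:
        view["scale"] *= 1.0 + 0.01 * dy  # vertical movement drives zoom here
    return view
```

Because mode and movement are read from the same continuous touch, panning can blend into zooming without an explicit mode switch.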
interactive tabletops and surfaces | 2011
Nicolai Marquardt; Johannes Kiemer; David Ledo; Sebastian Boring; Saul Greenberg
Recent work in multi-touch tabletop interaction introduced many novel techniques that let people manipulate digital content through touch. Yet most only detect touch blobs. This ignores richer interactions that would be possible if we could identify (1) which part of the hand, (2) which side of the hand, and (3) which person is actually touching the surface. Fiduciary-tagged gloves were previously introduced as a simple but reliable technique for providing this information. The problem is that their low-level programming model hinders developers from rapidly exploring new kinds of user- and handpart-aware interactions. We contribute the TouchID toolkit to solve this problem. It allows rapid prototyping of expressive multi-touch interactions that exploit the aforementioned characteristics of touch input. TouchID provides an easy-to-use event-driven API as well as higher-level tools that facilitate development: a glove configurator to rapidly associate particular glove parts with handparts; and a posture configurator and gesture configurator for registering new hand postures and gestures for the toolkit to recognize. We illustrate TouchID's expressiveness by showing how we developed a suite of techniques that exploits knowledge of which handpart is touching the surface.
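A sketch of what an event-driven, handpart-aware touch API might look like, in the spirit of the abstract above. The class names, event fields, and `on`/`dispatch` methods are assumptions for illustration, not the toolkit's actual API.

```python
# Hypothetical handpart-aware touch events with per-handpart callbacks.
class TouchEvent:
    def __init__(self, user, handpart, side, x, y):
        self.user = user          # which person is touching
        self.handpart = handpart  # e.g. "index_finger", "palm"
        self.side = side          # "front" or "back" of the hand
        self.x, self.y = x, y     # touch position on the surface


class TouchDispatcher:
    def __init__(self):
        self._handlers = {}

    def on(self, handpart, handler):
        """Register a callback for touches made with a given handpart."""
        self._handlers.setdefault(handpart, []).append(handler)

    def dispatch(self, event):
        """Route an incoming touch to the handlers for its handpart."""
        for handler in self._handlers.get(event.handpart, []):
            handler(event)
```

With such an API, `dispatcher.on("palm", erase)` could route palm touches to an eraser tool while fingertip touches keep drawing.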
tangible and embedded interaction | 2012
David Ledo; Miguel A. Nacenta; Nicolai Marquardt; Sebastian Boring; Saul Greenberg
In the real world, touch-based interaction relies on haptic feedback (e.g., grasping objects, feeling textures). Unfortunately, such feedback is absent in current tabletop systems. The previously developed Haptic Tabletop Puck (HTP) aims at supporting experimentation with and development of inexpensive tabletop haptic interfaces in a do-it-yourself fashion. The problem is that programming the HTP (and haptics in general) is difficult. To address this problem, we contribute the Haptictouch toolkit, which enables developers to rapidly prototype haptic tabletop applications. Our toolkit is structured in three layers that enable programmers to: (1) directly control the device, (2) create customized combinable haptic behaviors (e.g., softness, oscillation), and (3) use visuals (e.g., shapes, images, buttons) to quickly make use of these behaviors. In our preliminary exploration we found that programmers could use our toolkit to create haptic tabletop applications in a short amount of time.
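The middle layer's idea of combinable behaviours can be sketched as small functions whose force outputs compose. The names, signatures, and additive combination rule below are assumptions, not the Haptictouch API.

```python
import math

# Hypothetical combinable haptic behaviours, each returning a force value.
def softness(depth, stiffness=0.5):
    """Resisting force grows linearly with how far the puck is pressed."""
    return stiffness * depth


def oscillation(t, amplitude=0.2, freq=5.0):
    """A vibration component as a function of time (seconds)."""
    return amplitude * math.sin(2.0 * math.pi * freq * t)


def soft_vibrating_surface(depth, t):
    """Combine behaviours by summing their force contributions."""
    return softness(depth) + oscillation(t)
```

A visual layer could then bind `soft_vibrating_surface` to a particular on-screen shape, so the behaviour is felt only when the puck is over it.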
human computer interaction with mobile devices and services | 2015
David Ledo; Saul Greenberg; Nicolai Marquardt; Sebastian Boring
Remote controls facilitate interactions at-a-distance with appliances. However, the complexity, diversity, and increasing number of digital appliances in ubiquitous computing ecologies make it increasingly difficult to: (1) discover which appliances are controllable; (2) select a particular appliance from the large number available; (3) view information about its status; and (4) control the appliance in a pertinent manner. To mitigate these problems we contribute proxemic-aware controls, which exploit the spatial relationships between a person's handheld device and all surrounding appliances to create a dynamic appliance control interface. Specifically, a person can discover and select an appliance by orienting the mobile device around the room, and then progressively view the appliance's status and control its features in increasing detail by simply moving towards it. We illustrate proxemic-aware controls of assorted appliances through various scenarios. We then provide a generalized conceptual framework that informs future designs of proxemic-aware controls.
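The progressive-disclosure step can be sketched as a mapping from device–appliance distance to a level of detail. The zone boundaries, labels, and the distance-based stand-in for orientation-based selection are illustrative assumptions.

```python
# Hypothetical proxemic zones: closer distance exposes more detail.
def control_detail(distance_m):
    if distance_m > 3.0:
        return "awareness"     # appliance is merely discoverable
    if distance_m > 1.0:
        return "status"        # show the appliance's current state
    return "full_control"      # expose its detailed controls


def nearest_appliance(device_pos, appliances):
    """Pick the closest appliance (a simplification of pointing/orientation)."""
    return min(appliances, key=lambda a: abs(a["pos"] - device_pos))
```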
human factors in computing systems | 2015
Bon Adriel Aseniero; Tiffany Wun; David Ledo; Guenther Ruhe; Anthony Tang; Sheelagh Carpendale
Software is typically developed incrementally and released in stages. Planning these releases involves deciding which features of the system should be implemented for each release. This is a complex planning process involving numerous trade-offs-constraints and factors that often make decisions difficult. Since the success of a product depends on this plan, it is important to understand the trade-offs between different release plans in order to make an informed choice. We present STRATOS, a tool that simultaneously visualizes several software release plans. The visualization shows several attributes about each plan that are important to planners. Multiple plans are shown in a single layout to help planners find and understand the trade-offs between alternative plans. We evaluated our tool via a qualitative study and found that STRATOS enables a range of decision-making processes, helping participants decide on which plan is most optimal.
human factors in computing systems | 2013
David Ledo; Bon Adriel Aseniero; Saul Greenberg; Sebastian Boring; Anthony Tang
Video conferencing commonly employs a video portal metaphor to connect individuals from remote spaces. In this work, we explore an alternate metaphor, a shared depth-mirror, where video images of two spaces are fused into a single shared, depth-corrected video space. We realize this metaphor in OneSpace, where the space respects virtual spatial relationships between people and objects as if all parties were looking at a mirror together. We report preliminary observations of OneSpace's use, noting that it encourages cross-site, full-body interactions, and that participants employed the depth cues in their interactions. Based on these observations, we argue that the depth mirror offers new opportunities for shared video interaction.
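Depth-corrected fusion can be sketched per pixel: keep whichever site's sample is closer to the virtual mirror, so local and remote bodies occlude one another consistently. Flat lists stand in for images here; all names are illustrative assumptions, not OneSpace's implementation.

```python
# Hypothetical depth-mirror compositing: nearer sample wins at each pixel.
def fuse(pixels_a, depth_a, pixels_b, depth_b):
    """Merge two registered color+depth frames into one depth-corrected frame."""
    return [pa if da <= db else pb
            for pa, da, pb, db in zip(pixels_a, depth_a, pixels_b, depth_b)]
```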
human factors in computing systems | 2013
David Ledo; Saul Greenberg
Computing technologies continue to advance rapidly, yet appliances have remained a comparatively stagnant class of technology. They are restricted by physical and cost limitations while still aiming to provide substantial functionality. This leads to limited capabilities of input (multiple buttons and button combinations) and output (LEDs, small screens). We introduce the notion of mobile proxemic awareness and control, whereby a mobile device is used as a medium to reveal information regarding awareness of presence, state, content and control as a function of proxemics. We explore a set of concepts that exploit different proximal distances and levels of information and controls. We illustrate the concepts with two deliberately simple prototypes: a lamp and a radio alarm clock.
Archive | 2014
Frederik Brudy; David Ledo; Saul Greenberg
When a person interacts with a display in an open area, sensitive information becomes visible to shoulder-surfing passersby. While a person’s body shields small displays, shielding is less effective as display area increases. To mitigate this problem, we sense the spatial relationships between the passerby, the person, and the display. Awareness of onlookers is provided through visual cues: flashing screen borders, a 3D model mirroring the onlooker’s position and gaze, and an indicator that illustrates their gaze direction. The person can react with a gesture that commands the display to black out personal windows, or to collect them on one side. Alternately, the display will automatically darken screen regions visible to the onlooker, while leaving the display area shielded by the person’s body unaltered (thus allowing the person to continue their actions). The person can also invite the onlooker to collaborate with them via a gesture that reverses these protective mechanisms.
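The automatic protection step can be sketched as dimming the screen regions an onlooker can see while leaving the body-shielded span at full brightness. The 1-D column geometry and dimming factor are simplifying assumptions.

```python
# Hypothetical onlooker-aware dimming over screen columns.
DIM = 0.2  # brightness multiplier for columns the onlooker can see


def protect(brightness, shield_start, shield_end):
    """Dim every column outside the span shielded by the person's body."""
    return [b if shield_start <= i <= shield_end else b * DIM
            for i, b in enumerate(brightness)]
```

Reversing the mechanism for an invited collaborator amounts to restoring full brightness everywhere.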
user interface software and technology | 2018
David Ledo
As interactions move beyond the desktop, interactive behaviours (effects of actions as they happen, or once they happen) are becoming increasingly complex. This complexity stems from the variety of forms that objects might take, the different inputs and sensors capturing information, and the ability to create nuanced responses to those inputs. Current interaction design tools do not support much of this rich behaviour authoring. In my work I create prototyping tools that examine ways in which designers can create interactive behaviours. Thus far, I have created two prototyping tools, Pineal and Astral, which examine how to create physical forms based on a smart object's behaviour, and how to reuse existing desktop infrastructures to author different kinds of interactive behaviour. I also contribute conceptual elements, such as how to create smart objects using mobile devices, their sensors and outputs, instead of custom electronic circuits, as well as evaluation strategies used in HCI toolkit research, which directly inform my approach to evaluating my tools.
Archive | 2015
David Ledo