Publications


Featured research published by Alex Olwal.


User Interface Software and Technology | 2013

inFORM: dynamic physical affordances and constraints through shape and object actuation

Sean Follmer; Daniel Leithinger; Alex Olwal; Akimitsu Hogge; Hiroshi Ishii

Past research on shape displays has primarily focused on rendering content and user interface elements through shape output, with less emphasis on dynamically changing UIs. We propose utilizing shape displays in three different ways to mediate interaction: to facilitate by providing dynamic physical affordances through shape change, to restrict by guiding users with dynamic physical constraints, and to manipulate by actuating physical objects. We outline potential interaction techniques and introduce Dynamic Physical Affordances and Constraints with our inFORM system, built on top of a state-of-the-art shape display, which provides for variable stiffness rendering and real-time user input through direct touch and tangible interaction. A set of motivating examples demonstrates how dynamic affordances, constraints and object actuation can create novel interaction possibilities.
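To make the idea of dynamic physical affordances concrete, here is a minimal sketch of driving a pin-array shape display from a height map. The grid size, pin travel, and send_heights() interface are assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch, not inFORM's actual code: render a dynamic physical
# affordance (a raised, pressable button) as a height map for a pin array.
import numpy as np

ROWS, COLS = 30, 30       # assumed pin-grid resolution
MAX_HEIGHT_MM = 100.0     # assumed pin travel

def button_affordance(cx, cy, radius, height_mm):
    """Height map with a raised disc that acts as a physical button."""
    y, x = np.mgrid[0:ROWS, 0:COLS]
    h = np.zeros((ROWS, COLS))
    h[(x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2] = height_mm
    return np.clip(h, 0.0, MAX_HEIGHT_MM)

def send_heights(h):
    # Placeholder for the actuator interface: a real system would stream
    # per-pin height targets to the motor controllers here.
    print(h.astype(int))

send_heights(button_affordance(cx=15, cy=15, radius=4, height_mm=20.0))
```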


User Interface Software and Technology | 2012

Jamming user interfaces: programmable particle stiffness and sensing for malleable and shape-changing devices

Sean Follmer; Daniel Leithinger; Alex Olwal; Nadia Cheng; Hiroshi Ishii

Malleable and organic user interfaces have the potential to enable radically new forms of interaction and expressiveness through flexible, free-form and computationally controlled shapes and displays. This work focuses specifically on particle jamming as a simple, effective method for flexible, shape-changing user interfaces, where programmatic control of material stiffness enables haptic feedback, deformation, tunable affordances and control gain. We introduce a compact, low-power pneumatic jamming system suitable for mobile devices, and a new hydraulic-based technique with fast, silent actuation and optical shape sensing. We enable jamming structures to sense input and function as interaction devices through two contributed methods for high-resolution shape sensing: 1) index-matched particles and fluids, and 2) capacitive and electric field sensing. We explore the design space of malleable and organic user interfaces enabled by jamming through four motivational prototypes that highlight jamming's potential in HCI, including applications for tabletops, tablets and portable shape-changing mobile devices.
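As a rough illustration of programmatic stiffness control, the sketch below maps a normalized stiffness target to a vacuum setpoint for a jamming cell. The linear model and the 80 kPa figure are assumptions, not measurements from the paper.

```python
# Hypothetical sketch: map a normalized stiffness target to a vacuum
# setpoint for a particle-jamming cell. Jamming stiffness grows with
# vacuum level; a linear mapping is assumed here for simplicity.
MAX_VACUUM_KPA = 80.0   # assumed maximum vacuum of the pneumatic system

def stiffness_to_vacuum(stiffness):
    """stiffness in [0, 1] -> vacuum setpoint in kPa (0 = fully malleable)."""
    s = min(max(stiffness, 0.0), 1.0)
    return s * MAX_VACUUM_KPA

for s in (0.0, 0.5, 1.0):
    print(f"stiffness {s:.1f} -> vacuum {stiffness_to_vacuum(s):.0f} kPa")
```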


User Interface Software and Technology | 2014

Physical telepresence: shape capture and display for embodied, computer-mediated remote collaboration

Daniel Leithinger; Sean Follmer; Alex Olwal; Hiroshi Ishii

We propose a new approach to Physical Telepresence, based on shared workspaces with the ability to capture and remotely render the shapes of people and objects. In this paper, we describe the concept of shape transmission, and propose interaction techniques to manipulate remote physical objects and physical renderings of shared digital content. We investigate how the representation of users' body parts can be altered to amplify their capabilities for teleoperation. We also describe the details of building and testing prototype Physical Telepresence workspaces based on shape displays. A preliminary evaluation shows how users are able to manipulate remote objects, and we report on our observations of several different manipulation techniques that highlight the expressive nature of our system.
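A minimal sketch of the shape-transmission pipeline: capture a depth frame locally, block-average it down to the remote display's pin resolution, and stream the result each frame. The resolutions and the depth-to-height mapping are assumptions, not the paper's implementation.

```python
# Hypothetical sketch of shape transmission: block-average a captured depth
# frame down to the remote pin display's resolution before sending it.
import numpy as np

PIN_ROWS, PIN_COLS = 24, 24   # assumed remote pin-display resolution

def depth_to_pin_heights(depth, max_height_mm=100.0):
    """Downsample a depth image to pin resolution; nearer surfaces = taller pins."""
    h, w = depth.shape
    bh, bw = h // PIN_ROWS, w // PIN_COLS
    blocks = depth[:bh * PIN_ROWS, :bw * PIN_COLS]
    blocks = blocks.reshape(PIN_ROWS, bh, PIN_COLS, bw).mean(axis=(1, 3))
    norm = (blocks.max() - blocks) / max(np.ptp(blocks), 1e-6)
    return norm * max_height_mm

frame = np.random.rand(480, 640)          # stand-in for a depth camera frame
heights = depth_to_pin_heights(frame)     # would be sent to the remote site
print(heights.shape)                      # (24, 24)
```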


Human Factors in Computing Systems | 2013

Sublimate: state-changing virtual and physical rendering to augment interaction with shape displays

Daniel Leithinger; Sean Follmer; Alex Olwal; Samuel Luescher; Akimitsu Hogge; Jinha Lee; Hiroshi Ishii

Recent research in 3D user interfaces pushes towards immersive graphics and actuated shape displays. Our work explores the hybrid of these directions, and we introduce sublimation and deposition, as metaphors for the transitions between physical and virtual states. We discuss how digital models, handles and controls can be interacted with as virtual 3D graphics or dynamic physical shapes, and how user interfaces can rapidly and fluidly switch between those representations. To explore this space, we developed two systems that integrate actuated shape displays and augmented reality (AR) for co-located physical shapes and 3D graphics. Our spatial optical see-through display provides a single user with head-tracked stereoscopic augmentation, whereas our handheld devices enable multi-user interaction through video see-through AR. We describe interaction techniques and applications that explore 3D interaction for these new modalities. We conclude by discussing the results from a user study that show how freehand interaction with physical shape displays and co-located graphics can outperform wand-based interaction with virtual 3D graphics.
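The sublimation/deposition metaphor can be pictured as a per-element render-state switch, as in the hypothetical sketch below; the names and structure are illustrative, not the Sublimate system's actual architecture.

```python
# Hypothetical sketch of the sublimation/deposition metaphor: each scene
# element renders either as actuated physical shape or as co-located AR
# graphics, and can switch state at runtime.
from enum import Enum

class State(Enum):
    PHYSICAL = 1   # rendered on the shape display
    VIRTUAL = 2    # rendered as AR graphics only

def render(scene):
    for name, state in scene.items():
        if state is State.PHYSICAL:
            print(f"shape display: extrude '{name}'")           # placeholder
        else:
            print(f"AR overlay: draw '{name}' in 3D graphics")  # placeholder

scene = {"terrain": State.PHYSICAL, "annotation": State.VIRTUAL}
render(scene)

# "Sublimate" the terrain (physical -> virtual); "deposition" is the reverse.
scene["terrain"] = State.VIRTUAL
render(scene)
```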


User Interface Software and Technology | 2011

SpeckleSense: fast, precise, low-cost and compact motion sensing using laser speckle

Jan Zizka; Alex Olwal; Ramesh Raskar

Motion sensing is of fundamental importance for user interfaces and input devices. In applications where optical sensing is preferred, traditional camera-based approaches can be prohibitive due to limited resolution, low frame rates and the required computational power for image processing. We introduce a novel set of motion-sensing configurations based on laser speckle sensing that are particularly suitable for human-computer interaction. The underlying principles allow these configurations to be fast, precise, extremely compact and low cost. We provide an overview and design guidelines for laser speckle sensing for user interaction and introduce four general speckle projector/sensor configurations. We describe a set of prototypes and applications that demonstrate the versatility of our laser speckle sensing techniques.
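The principle can be illustrated with FFT phase correlation between successive frames: the speckle pattern translates with the surface, so recovering the inter-frame shift yields the motion. This is a generic reconstruction of the idea, not the paper's (much faster, lower-level) implementation.

```python
# Illustrative speckle motion estimation via phase correlation (generic
# technique, assumed here; SpeckleSense's own processing is not shown).
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Return (dy, dx): how far frame_b is shifted relative to frame_a."""
    F = np.conj(np.fft.fft2(frame_a)) * np.fft.fft2(frame_b)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap peaks past the midpoint into negative displacements.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
speckle = rng.random((64, 64))                        # stand-in speckle frame
moved = np.roll(speckle, shift=(3, -5), axis=(0, 1))  # simulated surface motion
print(estimate_shift(speckle, moved))                 # -> (3, -5)
```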


User Interface Software and Technology | 2016

proCover: Sensory Augmentation of Prosthetic Limbs Using Smart Textile Covers

Joanne Leong; Patrick Parzer; Florian Perteneder; Teo Babic; Christian Rendl; Anita Vogl; Hubert Egger; Alex Olwal; Michael Haller

Today's commercially available prosthetic limbs lack tactile sensation and feedback. Recent research in this domain focuses on sensor technologies designed to be directly embedded into future prostheses. We present a novel concept and prototype of a prosthetic-sensing wearable that offers a non-invasive, self-applicable and customizable approach for the sensory augmentation of present-day and future low- to mid-range-priced lower-limb prosthetics. From consultation with eight lower-limb amputees, we investigated the design space for prosthetic sensing wearables and developed novel interaction methods for dynamic, user-driven creation and mapping of sensing regions on the foot to wearable haptic feedback actuators. Based on a pilot study with amputees, we assessed the utility of our design in scenarios brought up by the amputees, and we summarize our findings to establish future directions for research into using smart textiles for the sensory enhancement of prosthetic limbs.
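A hypothetical sketch of the user-driven mapping concept: bindings from sensed regions on the prosthetic cover to wearable haptic actuators, with a threshold that triggers feedback. The region names, actuator names, and threshold are invented for illustration.

```python
# Hypothetical sketch: user-defined bindings from sensed regions on the
# prosthetic cover to wearable vibration actuators elsewhere on the body.
PRESSURE_THRESHOLD = 0.2   # assumed normalized trigger level

region_to_actuator = {
    "toe": "thigh_vibe_1",
    "heel": "thigh_vibe_2",
    "lateral": "waistband_vibe",
}

def route_feedback(pressures):
    """pressures: region name -> normalized reading in [0, 1]."""
    for region, value in pressures.items():
        actuator = region_to_actuator.get(region)
        if actuator and value > PRESSURE_THRESHOLD:
            # A real system would drive the motor with intensity ~ value.
            print(f"{actuator}: intensity {value:.2f} (from {region})")

route_feedback({"toe": 0.05, "heel": 0.7, "lateral": 0.35})
```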


Human Factors in Computing Systems | 2016

FlexTiles: A Flexible, Stretchable, Formable, Pressure-Sensitive, Tactile Input Sensor

Patrick Parzer; Kathrin Probst; Teo Babic; Christian Rendl; Anita Vogl; Alex Olwal; Michael Haller

In the FlexTiles demonstration, we present a flexible, stretchable, pressure-sensitive, tactile input sensor consisting of three layers of fabric. We demonstrate the implementation of FlexTiles for covering large areas, 3D objects, and deformable underlying shapes. To measure these large areas at a high frame rate, we demonstrate a simple measurement implementation. Finally, we outline the benefits of our system compared to other tactile sensing techniques.
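Row/column scanning is one standard way to read a sensor matrix of this kind at high frame rates, sketched below with placeholder hardware calls; FlexTiles' actual measurement circuit may differ.

```python
# Hypothetical sketch of row/column matrix scanning with placeholder
# hardware calls. One frame costs ROWS x COLS reads, which keeps large
# areas readable at high frame rates.
ROWS, COLS = 16, 16   # assumed sensor-matrix size

def drive_row(r):
    pass              # placeholder: energize row electrode r

def read_column(c):
    return 0.0        # placeholder: ADC reading of column electrode c

def scan_frame():
    frame = [[0.0] * COLS for _ in range(ROWS)]
    for r in range(ROWS):
        drive_row(r)
        for c in range(COLS):
            frame[r][c] = read_column(c)
    return frame

print(len(scan_frame()), "rows scanned")
```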


Human Factors in Computing Systems | 2017

StretchEBand: Enabling Fabric-based Interactions through Rapid Fabrication of Textile Stretch Sensors

Anita Vogl; Patrick Parzer; Teo Babic; Joanne Leong; Alex Olwal; Michael Haller

The increased interest in interactive soft materials, such as smart clothing and responsive furniture, means that there is a need for flexible and deformable electronics. In this paper, we focus on stitch-based elastic sensors, which have the benefit of being manufacturable with textile craft tools that have been used in homes for centuries. We contribute to the understanding of stitch-based stretch sensors through four experiments and one user study that investigate conductive yarns from textile and technical perspectives, and analyze the impact of different stitch types and parameters. The insights informed our design of new stretch-based interaction techniques that emphasize eyes-free or casual interactions. We demonstrate with StretchEBand how soft, continuous sensors can be rapidly fabricated with different parameters and capabilities to support interaction with a wide range of performance requirements across wearables, mobile devices, clothing, furniture, and toys.
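As an illustration of how a stitched stretch sensor could drive eyes-free input, the sketch below normalizes resistance to a stretch value and fires a simple pull event; the resistance range and threshold are assumed values, not measurements from the paper.

```python
# Hypothetical sketch: normalize a stretch sensor's resistance and detect
# pull events. R_REST, R_MAX and the threshold are assumed values.
R_REST = 1000.0      # assumed resistance at rest, ohms
R_MAX = 3000.0       # assumed resistance at full stretch, ohms
PULL_THRESHOLD = 0.5

def normalize(resistance):
    s = (resistance - R_REST) / (R_MAX - R_REST)
    return min(max(s, 0.0), 1.0)

def detect_pull(samples):
    """Fire once when normalized stretch rises through the threshold."""
    was_pulled = False
    for r in samples:
        s = normalize(r)
        if s > PULL_THRESHOLD and not was_pulled:
            print(f"pull event at stretch {s:.2f}")
        was_pulled = s > PULL_THRESHOLD

detect_pull([1000, 1400, 2300, 2600, 1200])
```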


User Interface Software and Technology | 2017

Grabity: A Wearable Haptic Interface for Simulating Weight and Grasping in Virtual Reality

Inrak Choi; Heather Culbertson; Mark R. Miller; Alex Olwal; Sean Follmer

Ungrounded haptic devices for virtual reality (VR) applications lack the ability to convincingly render the sensations of a grasped virtual object's rigidity and weight. We present Grabity, a wearable haptic device designed to simulate kinesthetic pad opposition grip forces and weight for grasping virtual objects in VR. The device is mounted on the index finger and thumb and enables precision grasps with a wide range of motion. A unidirectional brake creates rigid grasping force feedback. Two voice coil actuators create virtual force tangential to each finger pad through asymmetric skin deformation. These forces can be perceived as gravitational and inertial forces of virtual objects. The rotational orientation of the voice coil actuators is passively aligned with the real direction of gravity through a revolute joint, causing the virtual forces to always point downward. This paper evaluates the performance of Grabity through two user studies, finding a promising ability to simulate different levels of weight with convincing object rigidity. The first user study shows that Grabity can convey various magnitudes of weight and force sensations to users by manipulating the amplitude of the asymmetric vibration. The second user study shows that users can differentiate different weights in a virtual environment using Grabity.
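The asymmetric-vibration principle can be sketched as a waveform with a fast stroke and a slow return, whose amplitude scales the perceived weight; the exact waveform shape and parameters here are assumptions, not Grabity's.

```python
# Hypothetical sketch of an asymmetric vibration waveform: a fast stroke one
# way and a slow return deform the skin so that a net directional force is
# perceived; amplitude controls the perceived weight.
import numpy as np

def asymmetric_wave(freq_hz, amplitude, sample_rate=4000, seconds=0.05,
                    fast_fraction=0.2):
    """Cycle with a fast rising stroke in the first 20%, slow fall after."""
    t = np.arange(int(sample_rate * seconds)) / sample_rate
    phase = (t * freq_hz) % 1.0
    wave = np.where(phase < fast_fraction,
                    phase / fast_fraction,
                    1.0 - (phase - fast_fraction) / (1.0 - fast_fraction))
    return amplitude * wave

light = asymmetric_wave(freq_hz=40, amplitude=0.3)   # lighter virtual object
heavy = asymmetric_wave(freq_hz=40, amplitude=1.0)   # heavier virtual object
print(light.max(), heavy.max())
```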


Human Factors in Computing Systems | 2017

shiftIO: Reconfigurable Tactile Elements for Dynamic Affordances and Mobile Interaction

Evan Strasnick; Jackie Yang; Kesler Tanner; Alex Olwal; Sean Follmer

Currently, virtual (i.e., touchscreen) controls are dynamic, but lack the advantageous tactile feedback of physical controls. Similarly, devices may also have dedicated physical controls, but these lack the flexibility to adapt to different contexts and applications. On mobile and wearable devices in particular, space constraints further limit input and output capabilities. We propose utilizing reconfigurable tactile elements around the edge of a mobile device to enable dynamic physical controls and feedback. These tactile elements can be used for physical touch input and output, and can reposition themselves according to the application, both around the edge of the device and hidden within it. We present shiftIO, two implementations of such a system that actuate physical controls around the edge of a mobile device using magnetic locomotion. One version utilizes PCB-manufactured electromagnetic coils, and the other uses switchable permanent magnets. We perform a technical evaluation of these prototypes and compare their advantages in various applications. Finally, we demonstrate several mobile applications that leverage shiftIO to create novel mobile interactions.
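Magnetic locomotion along a coil array can be pictured as stepping the element from coil to coil, as in the hypothetical sketch below; the coil count, timing, and energize() interface are invented for illustration and do not reflect shiftIO's actual drive electronics.

```python
# Hypothetical sketch: move a tactile element by energizing coils one step
# ahead of its position, pulling its magnet along the device edge.
import time

NUM_COILS = 12   # assumed number of edge coils

def energize(coil_index, on):
    # Placeholder for driving one PCB electromagnet through an H-bridge.
    print(("on " if on else "off"), f"coil {coil_index}")

def move_element(start, target):
    step = 1 if target > start else -1
    pos = start
    while pos != target:
        nxt = pos + step
        energize(nxt % NUM_COILS, True)    # attract toward the next coil
        time.sleep(0.01)                   # assumed settling time
        energize(nxt % NUM_COILS, False)
        pos = nxt
    return pos

move_element(start=2, target=5)
```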

Collaboration


Dive into Alex Olwal's collaborations.

Top Co-Authors

Ramesh Raskar, Massachusetts Institute of Technology

Daniel Leithinger, Massachusetts Institute of Technology

Hiroshi Ishii, Massachusetts Institute of Technology

Akimitsu Hogge, Massachusetts Institute of Technology