Publication


Featured research published by Sang-won Leigh.


user interface software and technology | 2014

THAW: tangible interaction with see-through augmentation for smartphones on computer screens

Sang-won Leigh; Philipp Schoessler; Felix Heibeck; Pattie Maes; Hiroshi Ishii

In this paper, we present a novel interaction system that allows a collocated large display and small handheld devices to seamlessly work together. The smartphone acts both as a physical interface and as an additional graphics layer for near-surface interaction on a computer screen. Our system enables accurate position tracking of a smartphone placed on or over any screen by displaying a 2D color pattern that is captured using the smartphone's back-facing camera. The proposed technique can be implemented on existing devices without the need for additional hardware.


tangible and embedded interaction | 2015

THAW: Tangible Interaction with See-Through Augmentation for Smartphones on Computer Screens

Sang-won Leigh; Philipp Schoessler; Felix Heibeck; Pattie Maes; Hiroshi Ishii

The huge influx of mobile display devices is transforming computing into multi-device interaction, demanding a fluid mechanism for using multiple devices in synergy. In this paper, we present a novel interaction system that allows a collocated large display and a small handheld device to work together. The smartphone acts as a physical interface for near-surface interactions on a computer screen. Our system enables accurate position tracking of a smartphone placed on or over any screen by displaying a 2D color pattern that is captured using the smartphone's back-facing camera. As a result, the smartphone can directly interact with data displayed on the host computer, with precisely aligned visual feedback from both devices. The possible interactions are described and classified in a framework, which we exemplify on the basis of several implemented applications. Finally, we present a technical evaluation and describe how our system is unique compared to other existing near-surface interaction systems. The proposed technique can be implemented on existing devices without the need for additional hardware, promising immediate integration into existing systems.
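The core tracking idea — displaying a decodable 2D color pattern on the host screen and mapping the color seen by the phone's camera back to a screen position — can be illustrated with a minimal sketch. This is not the authors' implementation; the linear gradient encoding, channel choices, and resolutions below are assumptions for illustration only.

```python
# Sketch: encode screen position into a color gradient, then decode a
# sampled pattern color back to coordinates (up to quantization error).

def encode_color(x, y, width, height):
    """Map a screen position to an (R, G, B) color in a smooth 2D gradient."""
    r = round(255 * x / (width - 1))   # red channel encodes x
    g = round(255 * y / (height - 1))  # green channel encodes y
    return (r, g, 0)

def decode_position(color, width, height):
    """Recover the approximate screen position from a sampled pattern color."""
    r, g, _ = color
    x = round(r / 255 * (width - 1))
    y = round(g / 255 * (height - 1))
    return (x, y)

# A camera sampling the pattern near (800, 450) on a 1920x1080 screen sees a
# color that decodes back to roughly the same position (within ~1 gradient step).
c = encode_color(800, 450, 1920, 1080)
print(decode_position(c, 1920, 1080))
```

With only 8 bits per channel, one gradient step spans several pixels; the real system combines a richer pattern with camera-side processing to reach the accuracy the paper reports.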


tangible and embedded interaction | 2015

clayodor: Retrieving Scents through the Manipulation of Malleable Material

Cindy Hsin-Liu Kao; Ermal Dreshaj; Judith Amores; Sang-won Leigh; Xavier Benavides; Pattie Maes; Ken Perlin; Hiroshi Ishii

clayodor (\klei-o-dor\) is a clay-like malleable material that changes smell based on user manipulation of its shape. This work explores the tangibility of shape-changing materials to capture smell, an ephemeral and intangible sensory input. We present the design of a proof-of-concept prototype and discuss the challenges of navigating smell through form.


tangible and embedded interaction | 2015

Remnance of Form: Altered Reflection of Physical Reality

Sang-won Leigh; Asta Roseway; Ann Paradiso

Remnance of Form is an interactive installation that explores the dynamic tension between an object and its shadow. By fusing light, projection, and motion technologies, the shadow can now detach itself from its former role. This creates a new narrative that challenges our perception of reality: what's real and what's not.


The first computers | 2017

Body-Borne Computers as Extensions of Self

Sang-won Leigh; Harpreet Sareen; Hsin-Liu Cindy Kao; Pattie Maes

The opportunities for wearable technologies go well beyond always-available information displays or health sensing devices. The concept of the cyborg introduced by Clynes and Kline, along with works in various fields of research and the arts, offers a vision of what technology integrated with the body can offer. This paper identifies different categories of research aimed at augmenting humans. The paper specifically focuses on three areas of augmentation of the human body and its sensorimotor capabilities: physical morphology, skin display, and somatosensory extension. We discuss how such digital extensions relate to the malleable nature of our self-image. We argue that body-borne devices are no longer simply functional apparatus, but offer a direct interplay with the mind. Finally, we also showcase some of our own projects in this area and shed light on future challenges.


user interface software and technology | 2013

NailSense: fingertip force as a new input modality

Sungjae Hwang; Dongchul Kim; Sang-won Leigh; Kwangyun Wohn

In this paper, we propose a new interaction technique, called NailSense, which allows users to control a mobile device by hovering and slightly bending/extending fingers behind the device. NailSense provides basic interactions equivalent to touchscreen interactions: 2-D locations and binary states (i.e., touched or released) are tracked and used for input, but without any need to touch the screen. The proposed technique tracks the user's fingertip in real-time and triggers events based on color changes in the fingernail area. It works with conventional smartphone cameras, which means no additional hardware is needed. This novel technique allows users to operate mobile devices without occlusion, a crucial problem with touchscreens, and promises an extended interaction space in the air, on a desktop, or anywhere else. The technique is demonstrated with two example applications: a drawing app and a web browser.
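The sensing principle — a pressed fingertip pushes blood out of the nail bed, whitening the nail, which a camera can detect as a color change — can be sketched in a few lines. The region extraction, brightness measure, and threshold below are assumptions, not the authors' actual vision pipeline.

```python
# Sketch: classify a tracked fingernail region as pressed or relaxed by its
# average brightness (a pressed nail whitens; a relaxed one stays pinkish).

def nail_pressed(nail_pixels, whiteness_threshold=200):
    """Return True if the nail region looks 'pressed' (whitened).

    nail_pixels: iterable of (R, G, B) tuples sampled from the nail area.
    whiteness_threshold: assumed brightness cutoff (0-255 scale).
    """
    total, n = 0.0, 0
    for r, g, b in nail_pixels:
        total += (r + g + b) / 3  # mean brightness per pixel
        n += 1
    return total / n >= whiteness_threshold

# A pinkish (relaxed) nail patch vs. a whitened (pressed) one:
relaxed = [(210, 140, 150)] * 50
pressed = [(235, 225, 220)] * 50
print(nail_pressed(relaxed), nail_pressed(pressed))  # → False True
```

In practice the threshold would be calibrated per user and lighting condition, with the nail region localized by fingertip tracking as the abstract describes.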


human factors in computing systems | 2015

AfterMath: Visualizing Consequences of Actions through Augmented Reality

Sang-won Leigh; Pattie Maes

Computing technology has advanced to a point where computers demonstrate better performance and precision in some analytical tasks than humans. As a result, we see a promising potential to significantly empower our decision-making process by providing relevant information just in time and space. In this paper, we present AfterMath, a user interface concept for predicting and visualizing the consequences of a user's potential actions. We explore new possibilities through a prototype system that harnesses physics simulation as a consequence-prediction method and uses augmented reality technology to visualize potential consequences. Finally, we suggest general guidelines for designing systems utilizing this concept.
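The consequence-prediction step — forward-simulating the physics of a potential action before the user commits to it — can be sketched with a toy example. The scenario (a tossed object) and the Euler integrator below are illustrative assumptions, not the paper's simulation engine.

```python
# Sketch: predict where a tossed object would land by forward-simulating
# projectile motion, the kind of outcome an AR overlay could then visualize.

def predict_landing_x(x0, y0, vx, vy, g=9.81, dt=0.001):
    """Euler-integrate a 2D projectile until it hits the ground (y = 0)."""
    x, y = x0, y0
    while y > 0:
        x += vx * dt       # horizontal motion is unaccelerated
        vy -= g * dt       # gravity decelerates the vertical velocity
        y += vy * dt
    return x

# Tossing from 1 m height at 2 m/s horizontal, 3 m/s upward: the predicted
# landing point (analytically ~1.70 m downrange) could be rendered in AR
# before the user actually throws.
print(round(predict_landing_x(0.0, 1.0, 2.0, 3.0), 2))
```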


international conference on computer graphics and interactive techniques | 2014

Coded Lens: using coded aperture for low-cost and versatile imaging

Yusuke Sekikawa; Sang-won Leigh; Koichiro Suzuki

We propose Coded Lens, a novel system for lensless photography. The system does not require highly calibrated optics; instead, it uses a coded aperture to guide light. Compressed sensing (CS) is used to reconstruct the scene from the raw image obtained through the coded aperture. Experimenting with synthetic and real scenes, we show the applicability of the technique and also demonstrate additional functionality such as changing focus programmatically. We believe this will lead to more compact, cheaper, and more versatile imaging systems.


human factors in computing systems | 2017

Morphology Extension Kit: A Modular Robotic Platform for Customizable and Physically Capable Wearables

Sang-won Leigh; Kush Parekh; Timothy Denton; William S. Peebles; Magnus H. Johnson; Pattie Maes

Robotic and shape-changing interfaces hint at a way to incorporate physical materials as extensions of human users; however, rapidly changing environments pose a diverse set of problems that are difficult to solve with a single interface. To address this, we propose a modular hardware platform that allows users or designers to build and customize physical augmentations. The process of building an augmentation is simply to connect actuator and sensor blocks and attach the resulting wearable to the body. The current design can be easily modified to incorporate additional electronics for desired sensing capabilities. Our universal connector designs can be extended to utilize any motors within afforded power, size, and weight constraints.


tangible and embedded interaction | 2016

A Flying Pantograph: Interleaving Expressivity of Human and Machine

Sang-won Leigh; Harshit Agrawal; Pattie Maes

Drawing as a means of expression has evolved over time and through computation: since prehistoric times, humankind has drawn through a myriad of mediums that, over the years, have become increasingly computation-driven. However, these mediums largely remain constrained to human body scale and aesthetics, while computer technology now allows a more synergistic and collaborative expression between human and machine. In our installation, we engage the audience with a drone-based drawing system that applies a person's pen drawing at different scales and in different styles. The unrestricted and programmable motion of the drone proxy can introduce various artistic distortions in real-time, creating a new dynamic medium of creative expression.

Collaboration


Dive into Sang-won Leigh's collaborations. Top co-authors:

- Pattie Maes (Massachusetts Institute of Technology)
- Hiroshi Ishii (Massachusetts Institute of Technology)
- Harshit Agrawal (Massachusetts Institute of Technology)
- Philipp Schoessler (Massachusetts Institute of Technology)
- Ermal Dreshaj (Massachusetts Institute of Technology)
- Felix Heibeck (Massachusetts Institute of Technology)
- Kush Parekh (Rhode Island School of Design)