
Publication


Featured research published by Daniel Leithinger.


user interface software and technology | 2013

inFORM: dynamic physical affordances and constraints through shape and object actuation

Sean Follmer; Daniel Leithinger; Alex Olwal; Akimitsu Hogge; Hiroshi Ishii

Past research on shape displays has primarily focused on rendering content and user interface elements through shape output, with less emphasis on dynamically changing UIs. We propose utilizing shape displays in three different ways to mediate interaction: to facilitate by providing dynamic physical affordances through shape change, to restrict by guiding users with dynamic physical constraints, and to manipulate by actuating physical objects. We outline potential interaction techniques and introduce Dynamic Physical Affordances and Constraints with our inFORM system, built on top of a state-of-the-art shape display, which provides for variable stiffness rendering and real-time user input through direct touch and tangible interaction. A set of motivating examples demonstrates how dynamic affordances, constraints and object actuation can create novel interaction possibilities.
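
To make the dynamic-affordance idea concrete, here is a minimal sketch of raising and flattening a button region in a height map driven out to a pin-based shape display. The grid size, pin travel, affordance height, and API are illustrative assumptions, not the inFORM implementation.

```python
# Hypothetical sketch: rendering a dynamic button affordance into a height map
# for a pin-based shape display. Grid size, pin travel, and the interface are
# assumptions for illustration only.
import numpy as np

ROWS, COLS = 24, 24          # assumed pin grid resolution
MAX_HEIGHT_MM = 100.0        # assumed pin travel

def render_button(heights, row, col, size=4, raised=True):
    """Raise (afford 'press me') or flatten a square button region."""
    target = 40.0 if raised else 0.0   # mm; arbitrary affordance height
    heights[row:row + size, col:col + size] = target
    return heights

heights = np.zeros((ROWS, COLS))                  # flat surface
heights = render_button(heights, 10, 10, raised=True)

# When the user presses the raised region, flatten it to acknowledge the input,
# so the shape output also acts as a physical constraint on the next action.
pressed = True                                    # would come from per-pin sensing
if pressed:
    heights = render_button(heights, 10, 10, raised=False)

print(heights[10:14, 10:14])
```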


user interface software and technology | 2012

Jamming user interfaces: programmable particle stiffness and sensing for malleable and shape-changing devices

Sean Follmer; Daniel Leithinger; Alex Olwal; Nadia Cheng; Hiroshi Ishii

Malleable and organic user interfaces have the potential to enable radically new forms of interactions and expressiveness through flexible, free-form and computationally controlled shapes and displays. This work specifically focuses on particle jamming as a simple, effective method for flexible, shape-changing user interfaces where programmatic control of material stiffness enables haptic feedback, deformation, tunable affordances and control gain. We introduce a compact, low-power pneumatic jamming system suitable for mobile devices, and a new hydraulic-based technique with fast, silent actuation and optical shape sensing. We enable jamming structures to sense input and function as interaction devices through two contributed methods for high-resolution shape sensing using: 1) index-matched particles and fluids, and 2) capacitive and electric field sensing. We explore the design space of malleable and organic user interfaces enabled by jamming through four motivational prototypes that highlight jamming's potential in HCI, including applications for tabletops, tablets, and portable shape-changing mobile devices.
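
The core control idea, programmatic stiffness, can be sketched as a mapping from a normalized stiffness command to a vacuum level that packs the particles. The linear mapping, pressure range, and valve interface below are assumptions, not the paper's pneumatic or hydraulic design.

```python
# Hypothetical sketch of programmable stiffness via particle jamming: a
# requested stiffness in [0, 1] is mapped to a vacuum set point that jams the
# particles together. Values and interfaces are illustrative assumptions.

MAX_VACUUM_KPA = 80.0   # assumed maximum vacuum the pump can hold

def stiffness_to_vacuum(stiffness: float) -> float:
    """Map a normalized stiffness command to a vacuum set point (kPa)."""
    stiffness = min(max(stiffness, 0.0), 1.0)
    return stiffness * MAX_VACUUM_KPA

class JammingCell:
    """Stand-in for one jamming pouch driven by a vacuum valve."""
    def __init__(self):
        self.vacuum_kpa = 0.0

    def set_stiffness(self, stiffness: float):
        self.vacuum_kpa = stiffness_to_vacuum(stiffness)
        # A real system would command a pump/valve here and wait for the
        # pressure sensor to reach the set point.

cell = JammingCell()
cell.set_stiffness(0.0)   # soft: the surface can be freely deformed
cell.set_stiffness(1.0)   # stiff: the surface holds its current shape
print(cell.vacuum_kpa)
```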


tangible and embedded interaction | 2010

Relief: a scalable actuated shape display

Daniel Leithinger; Hiroshi Ishii

Relief is an actuated tabletop display, which is able to render and animate three-dimensional shapes with a malleable surface. It allows users to experience and form digital models like geographical terrain in an intuitive manner. The tabletop surface is actuated by an array of 120 motorized pins, which are controlled with a low-cost, scalable platform built upon open-source hardware and software tools. Each pin can be addressed individually and senses user input like pulling and pushing.
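
The individually addressable pin array can be summarized in a short sketch: each pin holds a commanded height and reports user push or pull as the difference between its commanded and sensed position. The 12 by 10 layout matches the 120-pin count; the motor and sensor interface, units, and dead band are assumptions.

```python
# Minimal sketch in the spirit of Relief: an individually addressable pin array
# where each pin renders a target height and reports user push/pull input.
# The hardware interface is assumed, not the actual open-source platform.

class Pin:
    def __init__(self):
        self.target_mm = 0.0
        self.sensed_mm = 0.0   # would come from the motorized slider's sensor

    def user_displacement(self) -> float:
        """Positive when the user pulls the pin up, negative when pushed down."""
        return self.sensed_mm - self.target_mm

class PinArray:
    def __init__(self, rows=12, cols=10):
        self.pins = [[Pin() for _ in range(cols)] for _ in range(rows)]

    def set_height(self, row, col, height_mm):
        pin = self.pins[row][col]
        pin.target_mm = height_mm
        pin.sensed_mm = height_mm  # assume the motor reaches the target instantly
        # A real implementation would send the command to the pin's motor
        # controller and update sensed_mm from its position sensor.

    def read_input(self):
        """Collect (row, col, displacement) for pins the user has moved."""
        events = []
        for r, row in enumerate(self.pins):
            for c, pin in enumerate(row):
                d = pin.user_displacement()
                if abs(d) > 2.0:     # assumed dead band in mm
                    events.append((r, c, d))
        return events

terrain = PinArray()
terrain.set_height(5, 5, 30.0)   # raise one pin to start sculpting a hill
print(terrain.read_input())      # empty until the user pushes or pulls a pin
```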


user interface software and technology | 2014

Physical telepresence: shape capture and display for embodied, computer-mediated remote collaboration

Daniel Leithinger; Sean Follmer; Alex Olwal; Hiroshi Ishii

We propose a new approach to Physical Telepresence, based on shared workspaces with the ability to capture and remotely render the shapes of people and objects. In this paper, we describe the concept of shape transmission, and propose interaction techniques to manipulate remote physical objects and physical renderings of shared digital content. We investigate how the representation of users' body parts can be altered to amplify their capabilities for teleoperation. We also describe the details of building and testing prototype Physical Telepresence workspaces based on shape displays. A preliminary evaluation shows how users are able to manipulate remote objects, and we report on our observations of several different manipulation techniques that highlight the expressive nature of our system.
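
A minimal sketch of the shape-transmission loop helps make the concept concrete: capture a depth image on one side, downsample it to the remote display's pin grid, and serialize the resulting height map for transmission. The resolutions, working volume, and transport format are assumptions for illustration, not the system's actual pipeline.

```python
# Hypothetical shape-transmission sketch: depth image -> pin height map ->
# serialized frame for the remote shape display. All constants are assumed.
import json
import numpy as np

PIN_ROWS, PIN_COLS = 24, 24       # assumed remote pin grid
MAX_PIN_MM = 100.0                # assumed pin travel

def depth_to_heightmap(depth_mm: np.ndarray) -> np.ndarray:
    """Convert a depth image (mm from camera) into pin heights."""
    # Closer objects (hands, objects on the table) become taller pins.
    near, far = 500.0, 1200.0                       # assumed working volume
    clipped = np.clip(depth_mm, near, far)
    normalized = 1.0 - (clipped - near) / (far - near)
    # Block-average down to the pin grid resolution.
    h, w = normalized.shape
    rh, rw = h // PIN_ROWS, w // PIN_COLS
    blocks = normalized[:rh * PIN_ROWS, :rw * PIN_COLS]
    blocks = blocks.reshape(PIN_ROWS, rh, PIN_COLS, rw).mean(axis=(1, 3))
    return blocks * MAX_PIN_MM

def encode_frame(heights: np.ndarray) -> bytes:
    """Serialize one height-map frame for the remote site."""
    return json.dumps(heights.round(1).tolist()).encode()

fake_depth = np.full((240, 240), 1200.0)     # empty scene
fake_depth[100:140, 100:140] = 600.0         # a hand hovering over the table
frame = encode_frame(depth_to_heightmap(fake_depth))
print(len(frame), "bytes per frame")
```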


user interface software and technology | 2011

Direct and gestural interaction with relief: a 2.5D shape display

Daniel Leithinger; Dávid Lakatos; Anthony DeVincenzi; Matthew Blackshaw; Hiroshi Ishii

Actuated shape output provides novel opportunities for experiencing, creating and manipulating 3D content in the physical world. While various shape displays have been proposed, a common approach utilizes an array of linear actuators to form 2.5D surfaces. Through identifying a set of common interactions for viewing and manipulating content on shape displays, we argue why input modalities beyond direct touch are required. The combination of freehand gestures and direct touch provides additional degrees of freedom and resolves input ambiguities, while keeping the locus of interaction on the shape output. To demonstrate the proposed combination of input modalities and explore applications for 2.5D shape displays, two example scenarios are implemented on a prototype system.
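
The argument that touch and freehand gestures cover different degrees of freedom can be illustrated with a small input-routing sketch: direct touch deforms the surface where it lands, while an in-air pinch scales the whole model, something touch alone cannot express without ambiguity. The event format and values are assumptions, not the prototype's input system.

```python
# Illustrative routing of two input modalities on a 2.5D shape display:
# touch edits the surface locally, freehand gestures transform the model.
# Event structure and thresholds are assumed.

def handle_event(event, model):
    """Route one input event to a surface edit or a model transform."""
    if event["type"] == "touch":
        # Direct touch: push the surface down at the touched pin.
        r, c = event["pin"]
        model["heights"][r][c] = max(0.0, model["heights"][r][c] - 5.0)
    elif event["type"] == "pinch_drag":
        # Freehand gesture above the surface: uniform scale of the model.
        model["scale"] *= 1.0 + event["delta"]
    return model

model = {"heights": [[20.0] * 4 for _ in range(4)], "scale": 1.0}
model = handle_event({"type": "touch", "pin": (1, 1)}, model)
model = handle_event({"type": "pinch_drag", "delta": 0.25}, model)
print(model["heights"][1][1], model["scale"])
```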


international conference on artificial reality and telexistence | 2006

Shared design space: sketching ideas using digital pens and a large augmented tabletop setup

Michael Haller; Peter Brandl; Daniel Leithinger; Jakob Leitner; Thomas Seifried; Mark Billinghurst

Collaborative Augmented Reality (AR) setups are becoming increasingly popular. We have developed a collaborative tabletop environment that is designed for brainstorming and discussion meetings. Using a digital pen, participants can annotate not only virtual paper, but also real printouts. By integrating both physical and digital paper, we combine virtual and real 2D drawings and digital data, which are overlaid into a single information space. In this paper, we describe why we have integrated these devices in this way and how they can be used efficiently during a design process.


ieee international workshop on horizontal interactive human computer systems | 2007

Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus

Daniel Leithinger; Michael Haller

Many digital tabletop systems have a graphical user interface (GUI) that features context (or pop-up) menus. While linear and pie menus are commonly used for direct pen and touch interaction, their appearance can be problematic on a digital tabletop display, where physical objects might occlude menu items. We propose a user-drawn path menu that appears along a custom path to avoid such occlusions. This paper introduces four different metaphors for user-drawn context menus: the Fan Out Menu, the Card Deck Menu, the Pearl String Menu, and the Trail Menu. It also presents the results of a user study in which participants worked faster with our user-drawn menus on cluttered tabletop setups.
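
The core layout step behind such a menu can be sketched as resampling the drawn stroke at equal arc-length intervals and placing one menu item at each sample, so the menu follows whatever path the user drew around occluding objects. The stroke format, item spacing, and labels below are assumptions, not the paper's implementation of the four menu metaphors.

```python
# Sketch of placing menu items evenly along a user-drawn stroke (polyline).
# Stroke coordinates and item labels are illustrative assumptions.
import math

def resample_path(points, n_items):
    """Return n_items positions spaced evenly along a polyline of (x, y) points."""
    # Cumulative arc length along the stroke.
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1]
    targets = [total * i / (n_items - 1) for i in range(n_items)]
    positions, seg = [], 1
    for t in targets:
        while seg < len(points) - 1 and dists[seg] < t:
            seg += 1
        d0, d1 = dists[seg - 1], dists[seg]
        f = 0.0 if d1 == d0 else (t - d0) / (d1 - d0)
        (x0, y0), (x1, y1) = points[seg - 1], points[seg]
        positions.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
    return positions

stroke = [(0, 0), (40, 10), (80, 40), (90, 90)]          # user-drawn path
items = ["Copy", "Paste", "Delete", "Group", "Rotate"]
for label, pos in zip(items, resample_path(stroke, len(items))):
    print(label, [round(v, 1) for v in pos])
```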


human factors in computing systems | 2013

Sublimate: state-changing virtual and physical rendering to augment interaction with shape displays

Daniel Leithinger; Sean Follmer; Alex Olwal; Samuel Luescher; Akimitsu Hogge; Jinha Lee; Hiroshi Ishii

Recent research in 3D user interfaces pushes towards immersive graphics and actuated shape displays. Our work explores the hybrid of these directions, and we introduce sublimation and deposition, as metaphors for the transitions between physical and virtual states. We discuss how digital models, handles and controls can be interacted with as virtual 3D graphics or dynamic physical shapes, and how user interfaces can rapidly and fluidly switch between those representations. To explore this space, we developed two systems that integrate actuated shape displays and augmented reality (AR) for co-located physical shapes and 3D graphics. Our spatial optical see-through display provides a single user with head-tracked stereoscopic augmentation, whereas our handheld devices enable multi-user interaction through video see-through AR. We describe interaction techniques and applications that explore 3D interaction for these new modalities. We conclude by discussing the results from a user study that show how freehand interaction with physical shape displays and co-located graphics can outperform wand-based interaction with virtual 3D graphics.
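
The sublimation/deposition metaphor amounts to a per-element state switch between shape output and co-located graphics. The sketch below illustrates that switch; the class and method names are hypothetical, not the Sublimate API.

```python
# Illustrative state switch for the sublimation/deposition metaphor:
# an element is rendered either physically (shape display) or virtually (AR).
# Names and transitions are assumptions for illustration.

class ModelElement:
    def __init__(self, name):
        self.name = name
        self.state = "physical"     # start as actuated shape output

    def sublimate(self):
        """Physical -> virtual: lower the pins, keep rendering the element in AR."""
        if self.state == "physical":
            self.state = "virtual"

    def deposit(self):
        """Virtual -> physical: re-render the element as shape output."""
        if self.state == "virtual":
            self.state = "physical"

terrain = ModelElement("terrain patch")
terrain.sublimate()   # e.g., to show detail that exceeds the pin resolution
terrain.deposit()     # back to a touchable physical form
print(terrain.name, terrain.state)
```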


tangible and embedded interaction | 2010

g-stalt: a chirocentric, spatiotemporal, and telekinetic gestural interface

Jamie Zigelbaum; Alan Browning; Daniel Leithinger; Olivier Bau; Hiroshi Ishii

In this paper we present g-stalt, a gestural interface for interacting with video. g-stalt is built upon the g-speak spatial operating environment (SOE) from Oblong Industries. The version of g-stalt presented here is realized as a three-dimensional graphical space filled with over 60 cartoons. These cartoons can be viewed and rearranged along with their metadata using a specialized gesture set. g-stalt is designed to be chirocentric, spatiotemporal, and telekinetic.


international conference on computer graphics and interactive techniques | 2006

The shared design space

Michael Haller; Daniel Leithinger; Jakob Leitner; Thomas Seifried; Peter Brandl; Jürgen Zauner; Mark Billinghurst

The Shared Design Space is a novel interface for enhancing face-to-face collaboration using multiple displays and input surfaces. The system supports natural gestures and paper-pen input and overcomes the limitations of using traditional technology in co-located meetings and brainstorming activities.

Collaboration


Dive into Daniel Leithinger's collaborations.

Top Co-Authors

Hiroshi Ishii, Massachusetts Institute of Technology
Ken Nakagaki, Massachusetts Institute of Technology
Anthony DeVincenzi, Massachusetts Institute of Technology
Dávid Lakatos, Massachusetts Institute of Technology
Matthew Blackshaw, Massachusetts Institute of Technology
Philipp Schoessler, Massachusetts Institute of Technology
Akimitsu Hogge, Massachusetts Institute of Technology