
Publication


Featured research published by Sean Follmer.


user interface software and technology | 2013

inFORM: dynamic physical affordances and constraints through shape and object actuation

Sean Follmer; Daniel Leithinger; Alex Olwal; Akimitsu Hogge; Hiroshi Ishii

Past research on shape displays has primarily focused on rendering content and user interface elements through shape output, with less emphasis on dynamically changing UIs. We propose utilizing shape displays in three different ways to mediate interaction: to facilitate by providing dynamic physical affordances through shape change, to restrict by guiding users with dynamic physical constraints, and to manipulate by actuating physical objects. We outline potential interaction techniques and introduce Dynamic Physical Affordances and Constraints with our inFORM system, built on top of a state-of-the-art shape display, which provides for variable stiffness rendering and real-time user input through direct touch and tangible interaction. A set of motivating examples demonstrates how dynamic affordances, constraints and object actuation can create novel interaction possibilities.


user interface software and technology | 2012

Jamming user interfaces: programmable particle stiffness and sensing for malleable and shape-changing devices

Sean Follmer; Daniel Leithinger; Alex Olwal; Nadia Cheng; Hiroshi Ishii

Malleable and organic user interfaces have the potential to enable radically new forms of interactions and expressiveness through flexible, free-form and computationally controlled shapes and displays. This work specifically focuses on particle jamming as a simple, effective method for flexible, shape-changing user interfaces where programmatic control of material stiffness enables haptic feedback, deformation, tunable affordances and control gain. We introduce a compact, low-power pneumatic jamming system suitable for mobile devices, and a new hydraulic-based technique with fast, silent actuation and optical shape sensing. We enable jamming structures to sense input and function as interaction devices through two contributed methods for high-resolution shape sensing using: 1) index-matched particles and fluids, and 2) capacitive and electric field sensing. We explore the design space of malleable and organic user interfaces enabled by jamming through four motivational prototypes that highlight jamming's potential in HCI, including applications for tabletops, tablets and for portable shape-changing mobile devices.


human factors in computing systems | 2010

Family story play: reading with young children (and Elmo) over a distance

Hayes Solos Raffle; Rafael Ballagas; Glenda Revelle; Hiroshi Horii; Sean Follmer; Janet Go; Emily Reardon; Koichi Mori; Joseph 'Jofish' Kaye; Mirjana Spasojevic

We introduce Family Story Play, a system that supports grandparents to read books together with their grandchildren over the Internet. Family Story Play is designed to improve communication across generations and over a distance, and to support parents and grandparents in fostering the literacy development of young children. The interface encourages active child participation in the book reading experience by combining a paper book, a sensor-enhanced frame, video conferencing technology, and video content of a Sesame Street Muppet (Elmo). Results with users indicate that Family Story Play improves child engagement in long-distance communication and increases the quality of interaction between young children and distant grandparents. Additionally, Family Story Play encourages dialogic reading styles that are linked with literacy development. Ultimately, reading with Family Story Play becomes a creative shared activity that suggests a new kind of collaborative storytelling.


user interface software and technology | 2014

Physical telepresence: shape capture and display for embodied, computer-mediated remote collaboration

Daniel Leithinger; Sean Follmer; Alex Olwal; Hiroshi Ishii

We propose a new approach to Physical Telepresence, based on shared workspaces with the ability to capture and remotely render the shapes of people and objects. In this paper, we describe the concept of shape transmission, and propose interaction techniques to manipulate remote physical objects and physical renderings of shared digital content. We investigate how the representation of users' body parts can be altered to amplify their capabilities for teleoperation. We also describe the details of building and testing prototype Physical Telepresence workspaces based on shape displays. A preliminary evaluation shows how users are able to manipulate remote objects, and we report on our observations of several different manipulation techniques that highlight the expressive nature of our system.


user interface software and technology | 2010

CopyCAD: remixing physical objects with copy and paste from the real world

Sean Follmer; David Carr; Emily Lovell; Hiroshi Ishii

This paper introduces a novel technique for integrating geometry from physical objects into computer aided design (CAD) software. We allow users to copy arbitrary real world object geometry into 2D CAD designs at scale through the use of a camera/projector system. This paper also introduces a system, CopyCAD, that uses this technique, and augments a computer numerically controlled (CNC) milling machine. CopyCAD gathers input from physical objects, sketches and interactions directly on a milling machine, allowing novice users to copy parts of real world objects, modify them and then create a new physical part.


human factors in computing systems | 2012

KidCAD: digitally remixing toys through tangible tools

Sean Follmer; Hiroshi Ishii

Children have great facility in the physical world, and can skillfully model in clay and draw expressive illustrations. Traditional digital modeling tools have focused on mouse, keyboard and stylus input. These tools can be complicated, making it difficult for young users to easily and quickly create exciting designs. We seek to bring physical interaction to digital modeling, to allow users to use existing physical objects as tangible building blocks for new designs. We introduce KidCAD, a digital clay interface for children to remix toys. KidCAD allows children to imprint 2.5D shapes from physical objects into their digital models by deforming a malleable gel input device, deForm. Users can mashup existing objects, edit and sculpt or draw new designs on a 2.5D canvas using physical objects, hands and tools as well as 2D touch gestures. We report on a preliminary user study with 13 children, ages 7 to 10, which provides feedback for our design and helps guide future work in tangible modeling for children.


user interface software and technology | 2016

Zooids: Building Blocks for Swarm User Interfaces

Mathieu Le Goc; Lawrence H. Kim; Ali Parsaei; Jean-Daniel Fekete; Pierre Dragicevic; Sean Follmer

This paper introduces swarm user interfaces, a new class of human-computer interfaces composed of many autonomous robots that handle both display and interaction. We describe the design of Zooids, an open-source open-hardware platform for developing tabletop swarm interfaces. The platform consists of a collection of custom-designed wheeled micro robots each 2.6 cm in diameter, a radio base-station, a high-speed DLP structured light projector for optical tracking, and a software framework for application development and control. We illustrate the potential of tabletop swarm user interfaces through a set of application scenarios developed with Zooids, and discuss general design considerations unique to swarm user interfaces.
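The abstract does not detail the robots' control law, but the general idea of steering an optically tracked wheeled robot toward a commanded position can be sketched with a standard proportional controller for a differential-drive platform. This is an illustrative sketch only; the function name, gains, and model are assumptions, not the Zooids implementation:

```python
import math

def drive_toward(x, y, heading, tx, ty, k_lin=1.0, k_ang=2.0):
    """Proportional controller steering a differential-drive robot at
    (x, y) with the given heading toward target (tx, ty).
    Returns (forward_speed, turn_rate)."""
    dx, dy = tx - x, ty - y
    dist = math.hypot(dx, dy)
    bearing_err = math.atan2(dy, dx) - heading
    # Wrap the bearing error to [-pi, pi] so the robot turns the short way.
    bearing_err = math.atan2(math.sin(bearing_err), math.cos(bearing_err))
    v = k_lin * dist * math.cos(bearing_err)  # slow down when off-axis
    w = k_ang * bearing_err
    return v, w

# Robot at the origin facing +x, target straight ahead:
v, w = drive_toward(0.0, 0.0, 0.0, 1.0, 0.0)
print(v > 0, abs(w) < 1e-9)  # True True
```

In a real system this loop would close over the projector-based tracking updates, with each radio command carrying the computed wheel velocities.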


human factors in computing systems | 2013

Sublimate: state-changing virtual and physical rendering to augment interaction with shape displays

Daniel Leithinger; Sean Follmer; Alex Olwal; Samuel Luescher; Akimitsu Hogge; Jinha Lee; Hiroshi Ishii

Recent research in 3D user interfaces pushes towards immersive graphics and actuated shape displays. Our work explores the hybrid of these directions, and we introduce sublimation and deposition, as metaphors for the transitions between physical and virtual states. We discuss how digital models, handles and controls can be interacted with as virtual 3D graphics or dynamic physical shapes, and how user interfaces can rapidly and fluidly switch between those representations. To explore this space, we developed two systems that integrate actuated shape displays and augmented reality (AR) for co-located physical shapes and 3D graphics. Our spatial optical see-through display provides a single user with head-tracked stereoscopic augmentation, whereas our handheld devices enable multi-user interaction through video see-through AR. We describe interaction techniques and applications that explore 3D interaction for these new modalities. We conclude by discussing the results from a user study that show how freehand interaction with physical shape displays and co-located graphics can outperform wand-based interaction with virtual 3D graphics.


user interface software and technology | 2011

deForm: an interactive malleable surface for capturing 2.5D arbitrary objects, tools and touch

Sean Follmer; Micah K. Johnson; Edward H. Adelson; Hiroshi Ishii

We introduce a novel input device, deForm, that supports 2.5D touch gestures, tangible tools, and arbitrary objects through real-time structured light scanning of a malleable surface of interaction. DeForm captures high-resolution surface deformations and 2D grey-scale textures of a gel surface through a three-phase structured light 3D scanner. This technique can be combined with IR projection to allow for invisible capture, providing the opportunity for co-located visual feedback on the deformable surface. We describe methods for tracking fingers, whole hand gestures, and arbitrary tangible tools. We outline a method for physically encoding fiducial marker information in the height map of tangible tools. In addition, we describe a novel method for distinguishing between human touch and tangible tools, through capacitive sensing on top of the input surface. Finally, we motivate our device through a number of sample applications.
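The "three-phase structured light" scanning mentioned above refers to standard three-step phase-shifting profilometry: three sinusoidal patterns offset by 120 degrees are projected, and the surface phase (which encodes depth) is recovered per pixel from the three intensity samples. The sketch below shows the textbook recovery formula, not deForm's specific pipeline; the variable names and test values are hypothetical:

```python
import math

def wrapped_phase(i1, i2, i3):
    """Recover the wrapped phase from three intensity samples taken
    under sinusoidal patterns shifted by -120, 0, and +120 degrees
    (standard three-step phase-shifting formula)."""
    return math.atan2(math.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthetic check: generate the three samples for a known phase
# and confirm the formula recovers it.
phi = 0.7                # ground-truth phase at one pixel (hypothetical)
amb, amp = 0.5, 0.3      # ambient level and modulation amplitude
i1 = amb + amp * math.cos(phi - 2 * math.pi / 3)
i2 = amb + amp * math.cos(phi)
i3 = amb + amp * math.cos(phi + 2 * math.pi / 3)
print(round(wrapped_phase(i1, i2, i3), 3))  # 0.7
```

The recovered phase is wrapped to (-pi, pi]; a full scanner would then phase-unwrap and convert phase to height via the projector-camera geometry.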


intelligent robots and systems | 2016

Wolverine: A wearable haptic interface for grasping in virtual reality

Inrak Choi; Elliot Wright Hawkes; David L. Christensen; Christopher J. Ploch; Sean Follmer

The Wolverine is a mobile, wearable haptic device designed for simulating the grasping of rigid objects in a virtual reality interface. In contrast to prior work on wearable force feedback gloves, we focus on creating a low cost and lightweight device that renders a force directly between the thumb and three fingers to simulate objects held in pad opposition (precision) type grasps. Leveraging low-power brake-based locking sliders, the system can withstand over 100N of force between each finger and the thumb, and only consumes 0.24 mWh (0.87 joules) for each braking interaction. Integrated sensors are used both for feedback control and user input: time-of-flight sensors provide the position of each finger and an IMU provides overall orientation tracking. This paper describes the mechanical design, control strategy, and performance analysis of the Wolverine system and provides a comparison with several existing wearable haptic devices.
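As a quick sanity check on the quoted energy figure (a sketch, not from the paper): converting 0.24 mWh to joules at 3.6 J per mWh gives about 0.86 J, which matches the stated 0.87 J once rounding of the mWh value is accounted for.

```python
# 1 Wh = 3600 J, so 1 mWh = 3.6 J.
energy_mwh = 0.24
energy_joules = energy_mwh * 3.6
print(round(energy_joules, 2))  # 0.86, i.e. ~0.87 J as quoted
```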

Collaboration


Dive into Sean Follmer's collaborations.

Top Co-Authors

Hiroshi Ishii
Massachusetts Institute of Technology

Daniel Leithinger
Massachusetts Institute of Technology

Artem Dementyev
Massachusetts Institute of Technology

Joseph A. Paradiso
Massachusetts Institute of Technology

Ken Nakagaki
Massachusetts Institute of Technology