Austin S. Lee
Carnegie Mellon University
Publication
Featured research published by Austin S. Lee.
ACM Multimedia | 2015
Henry Chen; Austin S. Lee; Mark Robert Swift; John C. Tang
This paper describes HoloLens, a new Augmented Reality (AR) system developed by Microsoft, and the interaction model for supporting collaboration in this space with other users. Whereas traditional AR collaboration is between two or more head-mounted display (HMD) users, we describe collaboration between a single HMD user and others who join the space by hitching on the view of the HMD user. These companions participate remotely through Skype-enabled devices such as tablets or PCs. The interaction is novel in its use of a 3D space with digital objects, where interaction by remote parties can be achieved asynchronously and reflected back to the primary user. We describe additional collaboration scenarios possible with this arrangement.
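To make the asynchronous model concrete, here is a minimal sketch in Python; it is not from the paper, and all class and method names are made up. It shows how a remote companion's edits could be buffered and then reflected back into the primary HMD user's view on its next update.

# Illustrative sketch only (not the paper's implementation): remote
# companions queue edits to shared 3D objects, and the primary HMD
# user's view applies them asynchronously on its next update.
from dataclasses import dataclass, field
from queue import Queue


@dataclass
class Annotation:
    """An edit placed by a remote companion on a shared 3D object."""
    author: str
    object_id: str
    position: tuple  # (x, y, z) in the HMD user's spatial frame


@dataclass
class SharedScene:
    """State visible to the primary HMD user."""
    annotations: list = field(default_factory=list)
    pending: Queue = field(default_factory=Queue)

    def submit(self, annotation: Annotation) -> None:
        # Called from a remote companion's device (e.g. a tablet client);
        # the edit is buffered rather than applied immediately.
        self.pending.put(annotation)

    def apply_pending(self) -> None:
        # Called on the HMD user's update loop, so remote edits are
        # reflected back asynchronously rather than in lock-step.
        while not self.pending.empty():
            self.annotations.append(self.pending.get())


scene = SharedScene()
scene.submit(Annotation("remote_companion", "table_model", (0.4, 1.1, -2.0)))
scene.apply_pending()  # next HMD frame: the annotation becomes visible
print(scene.annotations)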
Tangible and Embedded Interaction | 2017
Dixon Lo; Jiyoung Ko; Austin S. Lee
In this paper, we describe ShapeShift, our initial exploration into creating a holistic object augmentation by extending the physical properties of objects with augmented reality. As a first step, we choose manipulations of shading and shadow for exploration. Shading and shadow are form-giving properties of an object. These attributes play two main roles in perceiving the object: (1) they delineate physical properties of the object (form and texture) and (2) they contextualise the object's relation to its environment. This project aims to give shadow and shading the ability to change the perception of an object, allowing new affordances and interactions to emerge.
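As an illustration of what manipulating shading can mean, the sketch below (Python, not from the ShapeShift system; the light directions are invented) re-shades a surface normal under a virtual light using simple Lambertian shading.

# Illustrative sketch only: one way to think about re-shading a surface
# under a virtual light, in the spirit of the shading manipulation
# described above. None of this code is from the ShapeShift system.
import math


def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)


def lambert_shade(normal, light_dir):
    """Diffuse shading factor in [0, 1] for a surface normal and a light."""
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))


surface_normal = (0.0, 1.0, 0.0)  # an upward-facing facet of the object
physical_light = (0.0, 1.0, 0.0)  # real light overhead: fully lit
virtual_light = (1.0, 0.2, 0.0)   # AR overlay pretends the light moved

print(lambert_shade(surface_normal, physical_light))  # 1.0
print(lambert_shade(surface_normal, virtual_light))   # ~0.2, object now reads as side-lit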
Tangible and Embedded Interaction | 2017
Marisa Lu; Gautam Bose; Austin S. Lee; Peter Scupelli
When a person gets to a door and wants to get in, what do they do? They knock. In our system, the user's specific knock pattern authenticates their identity and opens the door for them. The system empowers people's intuitive actions and responses to affect the world around them in a new way. We leverage IoT and physical computing to make more technology feel like less. From there, the system of a knock-based entrance creates affordances in social interaction for shared spaces, wherein ownership fluidity and accessibility need to be balanced with security.
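A minimal sketch of the kind of rhythm matching a knock-based entrance implies is shown below; it is not the authors' implementation, and the template, tolerance, and function names are assumptions.

# Minimal sketch (not the authors' code): match a knock attempt against
# an enrolled template by comparing the relative timing of the gaps
# between knocks, within a tolerance. Thresholds are made up.


def intervals(timestamps):
    """Gaps between consecutive knocks, in seconds."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]


def normalize(gaps):
    """Scale gaps so the pattern is tempo-invariant."""
    total = sum(gaps)
    return [g / total for g in gaps] if total else gaps


def matches(template, attempt, tolerance=0.15):
    """True if the attempted knock has the same rhythm as the template."""
    t, a = normalize(intervals(template)), normalize(intervals(attempt))
    if len(t) != len(a):
        return False
    return all(abs(x - y) <= tolerance for x, y in zip(t, a))


# A pattern enrolled by the owner, then attempted slightly faster.
enrolled = [0.0, 0.3, 0.45, 0.6, 0.9]
attempt = [0.0, 0.26, 0.40, 0.55, 0.82]
print(matches(enrolled, attempt))  # True -> open the door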
Tangible and Embedded Interaction | 2017
Denis Vlieghe; Austin S. Lee; Wayne Chung
Our work explores the vision of integrating 3D data collected through photogrammetry back into the physical space it was captured from, leveraging a Mixed Reality (MR) interface platform. As a proof-of-concept prototype design, we show augmentation of the physical surroundings with their associated 3D data using HoloLens and a photogrammetry technique. Our goals are twofold: first, to introduce a novel interaction design technique for duplicating and superimposing the physical world virtually in a Mixed Reality setting; second, to understand how people will interpret and interact in this new hybrid space, where artistic photogrammetric 3D renderings of the space are overlaid onto the real world as a multidimensional Mixed Reality element.
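As a rough illustration only (not code from the paper), superimposing a photogrammetry scan onto the room can be thought of as applying a rigid transform from scan coordinates into the headset's world frame; the anchor pose below is invented.

# Illustrative sketch: map scanned vertices into the MR world frame by
# rotating and translating them to an assumed anchor pose. Not from the
# paper's prototype.
import math


def rotate_y(point, angle_rad):
    """Rotate a 3D point around the vertical (y) axis."""
    x, y, z = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)


def to_world(scan_point, anchor_position, anchor_yaw_rad):
    """Map a point from scan coordinates into the MR world frame."""
    x, y, z = rotate_y(scan_point, anchor_yaw_rad)
    ax, ay, az = anchor_position
    return (x + ax, y + ay, z + az)


# A few vertices of the scanned mesh, placed where the real surfaces are.
scan_vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 2.4, 0.0)]
anchor = (3.2, 0.0, -1.5)   # where the scan's origin sits in the room
yaw = math.radians(90)      # scan was captured facing a different wall

world_vertices = [to_world(v, anchor, yaw) for v in scan_vertices]
print(world_vertices)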
Tangible and Embedded Interaction | 2016
Austin S. Lee; Dhairya Dand
This proposal presents a modular system design for a set of programmable tools with various form factors inspired by the relations between human body parts and industrial elements. By providing functional forms to sensors and actuators, and tangible methods of programming behaviors into the objects, we propose a more customizable experience in the area of the Internet of Things. We explore the design space by studying motions in everyday elements and introduce applications of digitally enabled modules in form factors such as hinges, joints, and zippers. Lastly, we investigate physical ways to program these modules that afford playful interactions in the tangible world. The workshop will first focus on using probes and brainstorming toolkits to generate design ideas related to themes such as super-powers, environments as extensions of the body, and the next generation of ubiquitous computing [5]. Using the resources and toolkits provided by the workshop organizers, participants will generate proof-of-concept prototypes as final deliverables. Participants are required to bring personal laptops.
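The sketch below illustrates, under assumed names, the kind of sensor-to-actuator binding such modules suggest: a hinge module's angle drives a zipper module's motor. It is an illustration, not code from the proposal.

# Illustrative sketch with made-up names: bind one module's sensor
# reading to another module's actuator, as a stand-in for tangible
# programming of module behaviors.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Module:
    """A digitally enabled form factor (hinge, joint, zipper, ...)."""
    name: str
    read: Callable[[], float]      # sensor side
    act: Callable[[float], None]   # actuator side


def bind(source: Module, target: Module, mapping: Callable[[float], float]):
    """Tangible 'program': pipe one module's reading into another's actuator."""
    target.act(mapping(source.read()))


hinge = Module("hinge", read=lambda: 42.0, act=lambda v: None)
zipper = Module("zipper", read=lambda: 0.0,
                act=lambda v: print(f"zipper moves to {v:.0%}"))

# Open the zipper proportionally to how far the hinge is bent (0-90 degrees).
bind(hinge, zipper, mapping=lambda angle: min(angle / 90.0, 1.0))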
Tangible and Embedded Interaction | 2016
Dixon Lo; Austin S. Lee
In this paper we introduce Click, a physical coding platform that utilizes smart devices as component pieces. Click encourages group learning of coding by turning individual smart devices into code blocks. These code blocks can then be connected to form programs. By contributing more personal devices to the code chain, users are able to increase program complexity. Because code blocks in Click have both a physical and a virtual component, we designed virtual interactions that encourage physical manipulation of devices. Finally, we show example programs that can be built on the system. Code chains can use built-in and installed smart device hardware and software as inputs and outputs; because of this, the power of the resulting programs grows in line with advances in smart device technology.
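A minimal sketch of the chaining idea, with made-up device names and not the Click platform's actual code: each smart device contributes one block, and a connected chain runs by threading data from block to block.

# Illustrative sketch only: evaluate a "program" by passing the output
# of each device's code block to the next one in the chain.
from dataclasses import dataclass
from typing import Callable, Any


@dataclass
class CodeBlock:
    """One smart device acting as a block in the chain."""
    device: str
    run: Callable[[Any], Any]


def run_chain(blocks, value=None):
    """Evaluate the chain by threading each block's output into the next."""
    for block in blocks:
        value = block.run(value)
    return value


# Three devices contributed by three users: a sensor input, a transform,
# and an output that would use the last device's hardware (here, print).
chain = [
    CodeBlock("phone A", run=lambda _: 23.5),                  # read a sensor
    CodeBlock("phone B", run=lambda t: f"temperature {t} C"),  # format it
    CodeBlock("tablet C", run=lambda msg: print(msg) or msg),  # display it
]
run_chain(chain)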
Tangible and Embedded Interaction | 2017
Manya Krishnaswamy; Bori Lee; Chirag Murthy; Hannah Rosenfeld; Austin S. Lee
Tangible and Embedded Interaction | 2016
Rachel S. Ng; Raghavendra Kandala; Sarah Marie-Foley; Dixon Lo; Molly Wright Steenson; Austin S. Lee
Tangible and Embedded Interaction | 2017
Meric Dagli; Julia Petrich; Rossa Kim; Nehal Vora; Austin S. Lee
Archive | 2016
Henry Yao-tsu Chen; Brandon V. Taylor; Mark Robert Swift; Austin S. Lee; Ryan S. Menezes