
Publications


Featured research published by Jesse Burstyn.


Tangible and Embedded Interaction | 2013

FlexView: an evaluation of depth navigation on deformable mobile devices

Jesse Burstyn; Amartya Banerjee; Roel Vertegaal

We present FlexView, a set of interaction techniques for Z-axis navigation on touch-enabled flexible mobile devices. FlexView augments touch input with bend input to navigate through depth-arranged content. To investigate Z-axis navigation with FlexView, we measured document paging efficiency using touch against two forms of bend input: bending the side of the display (leafing) and squeezing the display (squeezing). In addition to moving through the Z-axis, a second experiment added X-Y navigation in a pan-and-zoom task. Pinch gestures were compared to squeezing and leafing for zoom operations, while panning was consistently performed using touch. Our experiments demonstrate that bend interaction is comparable to touch input for navigating stacked content. Squeezing to zoom recorded the fastest times in the pan-and-zoom task. Overall, FlexView allows users to easily browse depth-arranged information spaces without sacrificing traditional touch interactions.
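
The abstract does not spell out the exact bend-to-scrolling mapping, so the following is only a minimal sketch of how a leafing or squeezing gesture could drive Z-axis paging; the function names, dead zone, and rate curve are illustrative assumptions rather than FlexView's implementation.

```python
# Hypothetical sketch: mapping a normalized bend value to depth (Z-axis)
# paging, in the spirit of FlexView. Thresholds and curves are assumptions.

def bend_to_depth_rate(bend: float, dead_zone: float = 0.1, max_rate: float = 5.0) -> float:
    """Map a bend value in [-1, 1] to a paging rate (pages per second).

    Positive bend pages forward through the stack, negative bend pages
    backward; bends inside the dead zone are ignored so resting the
    device does not scroll.
    """
    if abs(bend) < dead_zone:
        return 0.0
    # Rescale the remaining range to [0, 1] and square it so light bends
    # give fine control while hard bends page quickly.
    magnitude = (abs(bend) - dead_zone) / (1.0 - dead_zone)
    rate = max_rate * magnitude ** 2
    return rate if bend > 0 else -rate


class DepthNavigator:
    """Tracks the current page index in a depth-arranged document."""

    def __init__(self, page_count: int) -> None:
        self.page_count = page_count
        self.position = 0.0  # fractional page index along the Z-axis

    def update(self, bend: float, dt: float) -> int:
        self.position += bend_to_depth_rate(bend) * dt
        self.position = max(0.0, min(self.page_count - 1, self.position))
        return int(round(self.position))


# Example: a moderate forward bend held for one second advances a couple of pages.
nav = DepthNavigator(page_count=50)
for _ in range(10):
    print(nav.update(bend=0.6, dt=0.1))
```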


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2012

MultiPoint: Comparing laser and manual pointing as remote input in large display interactions

Amartya Banerjee; Jesse Burstyn; Audrey Girouard; Roel Vertegaal

We present MultiPoint, a set of perspective-based remote pointing techniques that allows users to perform bimanual and multi-finger remote manipulation of graphical objects on large displays. We conducted two empirical studies that compared remote pointing techniques performed using fingers and laser pointers, in single-point and multi-finger pointing interactions. We explored three types of manual selection gestures: squeeze, breach, and trigger. The fastest and most preferred technique was the trigger gesture in the single-point experiment and the unimanual breach gesture in the multi-finger pointing study. The laser pointer obtained mixed results: it was fast but inaccurate in the single-point experiment, and it received the lowest ranking and performance in the multipoint experiment. Our results suggest that MultiPoint interaction techniques are superior in performance and accuracy to traditional laser pointers for interacting with graphical objects on a large display from a distance.
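
As a rough illustration of the perspective-based pointing idea that MultiPoint builds on, the sketch below places a cursor where the ray from the user's eye through the fingertip intersects the display plane. The coordinate conventions, display geometry, and example values are assumptions for illustration, not taken from the paper.

```python
# Hypothetical sketch of perspective-based pointing: intersect the
# eye-to-fingertip ray with a display assumed to lie in the z = 0 plane
# (units in metres).

from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]


def perspective_point(eye: Vec3, fingertip: Vec3) -> Optional[Tuple[float, float]]:
    """Return the (x, y) display position hit by the eye-to-fingertip ray,
    or None if the ray is parallel to the display or points away from it."""
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    dz = fz - ez
    if dz == 0:
        return None  # ray is parallel to the display plane
    t = -ez / dz
    if t <= 0:
        return None  # intersection lies behind the user
    return (ex + t * (fx - ex), ey + t * (fy - ey))


# Example: an eye 2 m from the display and a fingertip 1.5 m from it.
print(perspective_point(eye=(0.0, 1.6, 2.0), fingertip=(0.2, 1.4, 1.5)))
```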


Tangible and Embedded Interaction | 2015

DisplaySkin: Exploring Pose-Aware Displays on a Flexible Electrophoretic Wristband

Jesse Burstyn; Paul Strohmeier; Roel Vertegaal

Mobile devices can provide people with contextual information. This information may benefit a primary activity, provided it is easily accessible. In this paper, we present DisplaySkin, a pose-aware device with a flexible display circling the wrist. DisplaySkin creates a kinematic model of a user's arm and uses it to place information in view, independent of body pose. In doing so, DisplaySkin aims to minimize the cost of accessing information without being intrusive. We evaluated our pose-aware display with a rotational pointing task that was interrupted by a notification on DisplaySkin. Results show that a pose-aware display reduces the time required to respond to notifications on the wrist.
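
The abstract only names the kinematic model, so the following is a speculative sketch of the placement idea: content is shifted around the wrist-worn band opposite to forearm rotation so that it keeps facing the viewer. The display resolution, angle conventions, and function names are invented for illustration and are not DisplaySkin's implementation.

```python
# Hypothetical sketch: keeping content world-stable on a display that wraps
# around the wrist, regardless of forearm rotation. All constants are assumed.

DISPLAY_CIRCUMFERENCE_PX = 800  # assumed pixel width of the wrap-around display


def content_offset_px(forearm_rotation_deg: float, gaze_direction_deg: float = 0.0) -> int:
    """Return the horizontal pixel offset that keeps content under the
    viewer's gaze direction as the forearm rotates.

    Both angles are measured around the wrist's long axis; the content is
    shifted opposite to the forearm rotation so it stays in view.
    """
    relative_deg = (gaze_direction_deg - forearm_rotation_deg) % 360.0
    return int(relative_deg / 360.0 * DISPLAY_CIRCUMFERENCE_PX)


# Example: rotating the forearm by 90 degrees shifts content a quarter turn
# around the band so it remains visible to the viewer.
print(content_offset_px(0.0), content_offset_px(90.0))
```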


International Conference on Human-Computer Interaction | 2015

PrintPut: Resistive and Capacitive Input Widgets for Interactive 3D Prints

Jesse Burstyn; Nicholas Fellion; Paul Strohmeier; Roel Vertegaal

We introduce PrintPut, a method for 3D printing that embeds interactivity directly into printed objects. PrintPut uses conductive filament to offer an assortment of sensors that an industrial designer can easily incorporate into their 3D designs, including buttons, pressure sensors, sliders, touchpads, and flex sensors. PrintPut combines physical and interactive sketching into the same process: seamlessly integrating sensors onto the surfaces of 3D objects, without the need for external sensor hardware.
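
Conductive-filament sensors like these are typically read as variable resistances, so one plausible way to sample a printed button or flex sensor is through a voltage divider feeding a microcontroller ADC. The supply voltage, reference resistor, ADC resolution, and press threshold below are assumptions for illustration; the paper itself does not prescribe this circuit.

```python
# Hypothetical sketch: converting an ADC reading from a printed resistive
# sensor (in a voltage divider) into ohms, and detecting a press.

ADC_MAX = 1023        # assumed 10-bit ADC
V_SUPPLY = 3.3        # assumed supply voltage across the divider, in volts
R_REFERENCE = 10_000  # assumed fixed resistor in series with the printed trace, in ohms


def adc_to_resistance(adc_value: int) -> float:
    """Convert an ADC reading to the printed sensor's resistance in ohms.

    Assumes the printed trace sits on the low side of the divider:
    V_node = V_supply * R_sensor / (R_sensor + R_reference).
    """
    v_node = V_SUPPLY * adc_value / ADC_MAX
    if v_node >= V_SUPPLY:
        return float("inf")  # open circuit or saturated reading
    return R_REFERENCE * v_node / (V_SUPPLY - v_node)


def is_pressed(adc_value: int, threshold_ohms: float = 50_000) -> bool:
    """Treat a drop below the threshold as a press, since compressing
    conductive-filament traces generally lowers their resistance."""
    return adc_to_resistance(adc_value) < threshold_ohms


# Example with a fabricated reading; a real deployment would poll an ADC pin.
print(adc_to_resistance(512), is_pressed(512))
```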


Human Factors in Computing Systems | 2011

WaveForm: remote video blending for VJs using in-air multitouch gestures

Amartya Banerjee; Jesse Burstyn; Audrey Girouard; Roel Vertegaal

We present WaveForm, a system that enables a video jockey (VJ) to directly manipulate video content on a large display on a stage, from a distance. WaveForm implements an in-air multitouch gesture set to layer, blend, scale, rotate, and position video content on the large display. We believe this leads to a more immersive experience for the VJ, as well as for the audience witnessing the VJ's performance during a live event.


Human Factors in Computing Systems | 2010

gBook: an e-book reader with physical document navigation techniques

Jesse Burstyn; M. Anson Herriotts

In this paper, we present gBook, a prototype for a new style of e-book reader that uses flexible inputs and page orientation to simulate the properties of reading a bound printed book. The project takes into account some of the known methods people use when reading books, so that page navigation corresponds more closely to that of paper-based books. The underlying assumption is that doing so will improve the learnability of navigation, as well as its usability, by allowing more casual methods of page navigation.


Interactive Tabletops and Surfaces | 2011

Pointable: an in-air pointing technique to manipulate out-of-reach targets on tabletops

Amartya Banerjee; Jesse Burstyn; Audrey Girouard; Roel Vertegaal


Tangible and Embedded Interaction | 2016

ReFlex: A Flexible Smartphone with Active Haptic Feedback for Bend Input

Paul Strohmeier; Jesse Burstyn; Juan Pablo Carrascal; Vincent Levesque; Roel Vertegaal


User Interface Software and Technology | 2013

Flexkit: a rapid prototyping platform for flexible displays

David Holman; Jesse Burstyn; Ryan S. Brotman; Audrey C. Younkin; Roel Vertegaal


Human Factors in Computing Systems | 2016

HoloFlex: A Flexible Holographic Smartphone with Bend Input

Daniel Gotsch; Xujing Zhang; Jesse Burstyn; Roel Vertegaal
