
Publication


Featured research published by John Helmes.


Conference on Computer Supported Cooperative Work | 2010

Telling the whole story: anticipation, inspiration and reputation in a field deployment of TellTable

Xiang Cao; Siân E. Lindley; John Helmes; Abigail Sellen

We present a field study of TellTable, a new storytelling system designed to support creativity and collaboration amongst children. The application was deployed on a multi-touch interactive table in the library of a primary school, where children could use it to create characters and scenery based on elements of the physical world (captured through photography) as well as through drawing. These could then be used to record a story which could be played back. TellTable allowed children to collaborate in devising stories that mixed the physical and the digital in creative ways and that could include themselves as characters. Additionally, the field deployment illustrated how children took inspiration from one another's stories, how they planned elements of their own tales before using the technology, and how the fact that stories could be accessed in the library led some to become well-known and popular within the school community. The real story here, we argue, needs to take into account all that happens within the wider context of use of this system.


User Interface Software and Technology | 2009

Mouse 2.0: multi-touch meets the mouse

Nicolas Villar; Shahram Izadi; Dan Rosenfeld; Hrvoje Benko; John Helmes; Jonathan Westhues; Steve Hodges; Eyal Ofek; Alex Butler; Xiang Cao; Billy Chen

In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementations of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study.


Human-Computer Interaction | 2012

Food for talk: phototalk in the context of sharing a meal

Kenton O'Hara; John Helmes; Abigail Sellen; Richard Harper; Martijn ten Bhömer; Elise van den Hoven

Photographic mementos are important signifiers of our personal memories. Rather than simply passive representations of memories to “preserve” the past, these photos are actively displayed and consumed in the context of everyday behavior and social practices. Within the context of these settings, these mementos are invoked in particular ways to mobilize particular social relations in the present. Taking this perspective, we explore how photo mementos come to be used in the everyday social setting of sharing a meal. Rather than a simple concern with nutritional consumption, the shared meal is a social event and important cultural site in the organization of family and social life with culturally specific rhythms, norms, rights, and responsibilities. We present a system—4 Photos—that situates photo mementos within the social concerns of these settings. The system collates photo mementos from those attending the meal and displays them at the dining table to be interacted with by all. Through a real-world deployment of the system, we explore the social work performed by invoking these personal memory resources in the context of real-world settings of shared eating. We highlight particular features of the system that enable this social work to be achieved.


Human Factors in Computing Systems | 2014

Type-hover-swipe in 96 bytes: a motion sensing mechanical keyboard

Stuart Taylor; Cem Keskin; Otmar Hilliges; Shahram Izadi; John Helmes

We present a new type of augmented mechanical keyboard, capable of sensing rich and expressive motion gestures performed both on and directly above the device. Our hardware comprises a low-resolution matrix of infrared (IR) proximity sensors interspersed between the keys of a regular mechanical keyboard. This results in coarse but high frame-rate motion data. We extend a machine learning algorithm, traditionally used for static classification only, to robustly support dynamic, temporal gestures. We propose the use of motion signatures, a technique that utilizes pairs of motion history images and a random forest based classifier to robustly recognize a large set of motion gestures on and directly above the keyboard. Our technique achieves a mean per-frame classification accuracy of 75.6% in leave-one-subject-out and 89.9% in half-test/half-training cross-validation. We detail our hardware and gesture recognition algorithm, provide performance and accuracy numbers, and demonstrate a large set of gestures designed to be performed with our device. We conclude with qualitative feedback from users, discussion of limitations and areas for future work.


Tangible and Embedded Interaction | 2011

Rudiments 1, 2 & 3: design speculations on autonomy

John Helmes; Alex S. Taylor; Xiang Cao; Kristina Höök; Peter Schmitt; Nicolas Villar

This work describes the design process and installation of three speculative, rudimentary machines, or rudiments. Through careful iterations in their design, the rudiments are intended to provoke curiosity and discussion around the possibility of autonomy in interactive systems. The design of the rudiments is described in detail, alongside the design decisions that were made to suggest a machine autonomy and to provoke discussion. Some preliminary reflections from installing the rudiments in two separate households are also reported. Widely divergent opinions of the rudiments from the two households are used to discuss a number of themes for thinking about autonomy and interactive systems design. Overall, the presented work adopts a perspective strongly oriented towards guiding future research, but, importantly, aims to do so by opening up and exposing the design possibilities rather than constraining them.


Interactive Tabletops and Surfaces | 2009

Developing the story: designing an interactive storytelling application

John Helmes; Xiang Cao; Siân E. Lindley; Abigail Sellen

This paper describes the design of a tabletop storytelling application for children, called TellTable. The goal of the system was to stimulate creativity and collaboration by allowing children to develop their own story characters and scenery through photography and drawing, and record stories through direct manipulation and narration. Here we present the initial interface design and its iteration following the results of a preliminary trial. We also describe key findings from TellTable's deployment in a primary school that relate to its design, before concluding with a discussion of design implications from the process.


Tangible and Embedded Interaction | 2013

Exploring physical prototyping techniques for functional devices using .NET gadgeteer

Steve Hodges; Stuart Taylor; Nicolas Villar; James Scott; John Helmes

In this paper we present a number of different physical construction techniques for prototyping functional electronic devices. Some of these approaches are already well established whilst others are more novel; our aim is to briefly summarize some of the main categories and to illustrate them with real examples. Whilst a number of different tools exist for building working device prototypes, for consistency the examples we present here are all built using the Microsoft .NET Gadgeteer platform. Although this naturally constrains the scope of this study, it also facilitates a basic comparison of the different techniques. Our ultimate aim is to enable others in the field to learn from our experiences and the techniques we present.


Human Factors in Computing Systems | 2013

An interactive belt-worn badge with a retractable string-based input mechanism

Norman Pohl; Steve Hodges; John Helmes; Nicolas Villar; Tim Paek

In this paper we explore a new type of wearable computing device, an interactive identity badge. An embedded LCD presents dynamic information to the wearer, and interaction is facilitated by sensing movement of the retractable string which attaches the unit to the wearer's belt. This form-factor makes it possible to interact using a single hand, providing lightweight and immediate access to a variety of information when it's not convenient to pick up, unlock and interact directly with a device like a smartphone. In this paper we present our prototype interactive badge, demonstrate the underlying technology and describe a number of usage scenarios and interaction techniques.


International Conference on Human-Computer Interaction | 2011

Meerkat and Tuba: design alternatives for randomness, surprise and serendipity in reminiscing

John Helmes; Kenton O'Hara; Nicolas Villar; Alex S. Taylor

People are accumulating large amounts of personal digital content that plays a role in reminiscing practices. But as these collections become larger, and older content is less frequently accessed, much of this content is simply forgotten. In response to this we explore the notions of randomness and serendipity in the presentation of content from people's digital collections. To do this we designed and deployed two devices - Meerkat and Tuba - that enable the serendipitous presentation of digital content from people's personal media collections. Each device emphasises different characteristics of serendipity, with a view to understanding whether people interpret and value these in different ways while reminiscing. In order to explore the use of the devices in context, we deployed them in real homes. We report on findings from the study and discuss their implications for design.


User Interface Software and Technology | 2016

Exploring the Design Space for Energy-Harvesting Situated Displays

Tobias Grosse-Puppendahl; Steve Hodges; Nicholas Chen; John Helmes; Stuart Taylor; James Scott; Josh Fromm; David Sweeney

We explore the design space of energy-neutral situated displays, which give physical presence to digital information. We investigate three central dimensions: energy sources, display technologies, and wireless communications. Based on the power implications from our analysis, we present a thin, wireless, photovoltaic-powered display that is quick and easy to deploy and capable of indefinite operation in indoor lighting conditions. The display uses a low-resolution e-paper architecture, which is 35 times more energy-efficient than smaller-sized high-resolution displays. We present a detailed analysis of power consumption and photovoltaic energy-harvesting performance, and a comparison to other display-driving architectures. Depending on the ambient lighting, the display can trigger an update every 1–25 minutes and communicate with a PC or smartphone via Bluetooth Low Energy.
