Steven Strachan
Hamilton Institute
Publication
Featured research published by Steven Strachan.
ubiquitous computing | 2009
Steven Strachan; Roderick Murray-Smith
We introduce a mobile spatial interactive application that uses a combination of a GPS, inertial sensing, gestural interaction, probabilistic models and Monte Carlo sampling, with vibration and audio feedback. This system allows the probing or querying of targets in a local area, based on a model of the local environment and specific context variables of interest, to enable a rich, embodied and location-aware spatial interaction. An experiment was conducted to investigate how spatial target selection at different distances, target separations and target widths is affected by a system with added ‘typical’ noise characteristics. Results showed that the successful selection of targets in the virtual environment is maximised with a combination of high angular separation and angular width.
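The key mechanism described here is probabilistic probing: uncertainty in the sensed position and bearing is represented by samples, and the fraction of sampled pointing rays that intersect a target drives the vibration and audio feedback. A minimal sketch of that idea is shown below; the noise magnitudes, circular target model and function names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def probe_probability(gps_pos, heading_deg, targets,
                      pos_sigma=5.0, heading_sigma_deg=10.0,
                      target_radius=10.0, n_samples=1000):
    """Estimate the probability that the user is pointing at each target.

    gps_pos:      (x, y) position estimate in metres
    heading_deg:  magnetic bearing estimate in degrees
    targets:      array of target centres, shape (T, 2)
    The noise levels are illustrative 'typical' values, not measured ones.
    """
    rng = np.random.default_rng()
    # Sample plausible positions and bearings from the sensor uncertainty.
    positions = rng.normal(gps_pos, pos_sigma, size=(n_samples, 2))
    headings = np.radians(rng.normal(heading_deg, heading_sigma_deg, n_samples))
    directions = np.stack([np.sin(headings), np.cos(headings)], axis=1)

    probs = []
    for target in np.asarray(targets):
        to_target = target - positions                       # vectors to target
        dist_along = np.sum(to_target * directions, axis=1)  # projection on ray
        closest = positions + dist_along[:, None] * directions
        off_axis = np.linalg.norm(target - closest, axis=1)
        hit = (dist_along > 0) & (off_axis < target_radius)
        probs.append(hit.mean())
    return np.array(probs)

# Feedback intensity could then be scaled by the hit probability, e.g.
# vibration_amplitude = probe_probability(pos, bearing, targets).max()
```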
human factors in computing systems | 2007
Steven Strachan; Roderick Murray-Smith; M. Sile O'Modhrain
We describe the BodySpace system, which uses inertial sensing and pattern recognition to allow the gestural control of a music player by placing the device at different parts of the body. We demonstrate a new approach to the segmentation and recognition of gestures for this kind of application and show how simulated physical model-based techniques can shape gestural interaction.
human factors in computing systems | 2007
Steven Strachan; John Williamson; Roderick Murray-Smith
We demonstrate the use of uncertain prediction in a system for pedestrian navigation via audio with a combination of Global Positioning System data, a music player, inertial sensing, magnetic bearing data and Monte Carlo sampling for a density-following task, where a listener's music is modulated according to the changing predictions of user position with respect to a target density, in this case a trajectory or path. We show that this system enables eyes-free navigation around set trajectories or paths unfamiliar to the user and demonstrate that the system may be used effectively for varying trajectory width and context.
human-computer interaction with mobile devices and services | 2006
John Williamson; Steven Strachan; Roderick Murray-Smith
We present a mobile, GPS-based multimodal navigation system, equipped with inertial control that allows users to explore and navigate through an augmented physical space, incorporating and displaying the uncertainty resulting from inaccurate sensing and unknown user intentions. The system propagates uncertainty appropriately via Monte Carlo sampling and predicts at a user-controllable time horizon. Control of the Monte Carlo exploration is entirely tilt-based. The system output is displayed both visually and in audio. Audio is rendered via granular synthesis to accurately display the probability of the user reaching targets in the space. We also demonstrate the use of uncertain prediction in a trajectory following task, where a section of music is modulated according to the changing predictions of user position with respect to the target trajectory. We show that appropriate display of the full distribution of potential future user positions with respect to sites-of-interest can improve the quality of interaction over a simplistic interpretation of the sensed data.
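One element worth unpacking is the granular-synthesis audio display, where the probability of reaching a target is rendered as the density of short sound grains. The sketch below illustrates that mapping under assumed parameters (grain length, sample rate, simple sine grains); it is not the system's actual audio engine.

```python
import numpy as np

SAMPLE_RATE = 22050

def granular_display(probability, duration=1.0, grain_ms=40,
                     base_freq=440.0, max_grains_per_s=60):
    """Render a mono audio buffer whose grain density reflects a probability.

    probability: value in [0, 1], e.g. P(user reaches target at the horizon)
    Higher probability -> more grains per second -> denser audio texture.
    """
    rng = np.random.default_rng()
    n_samples = int(duration * SAMPLE_RATE)
    out = np.zeros(n_samples)

    # A single grain: a short, Hann-windowed sine burst.
    grain_len = int(grain_ms / 1000 * SAMPLE_RATE)
    t = np.arange(grain_len) / SAMPLE_RATE
    grain = np.hanning(grain_len) * np.sin(2 * np.pi * base_freq * t)

    # Scatter grains at random onsets; their count scales with probability.
    n_grains = int(probability * max_grains_per_s * duration)
    for _ in range(n_grains):
        start = rng.integers(0, n_samples - grain_len)
        out[start:start + grain_len] += grain

    # Normalise to avoid clipping when many grains overlap.
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out
```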
human-computer interaction with mobile devices and services | 2004
Steven Strachan; Roderick Murray-Smith; Ian Oakley; Jussi Ängeslevä
We describe the implementation of an interaction technique which allows users to store and retrieve information and computational functionality on different parts of their body. We present a dynamic systems approach to gestural interaction using Dynamic Movement Primitives, which model a gesture as a second order dynamic system followed by a learned nonlinear transformation. We demonstrate that it is possible to learn models, even from single examples, which can simulate and classify the gestures needed for the BodySpace project, running on a PocketPC with a 3-degree-of-freedom linear accelerometer.
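The Dynamic Movement Primitive formulation combines a stable second-order spring-damper system attracted to a goal with a learned nonlinear forcing term that reshapes the transient, which is what makes learning from a single example possible. The following one-dimensional sketch illustrates that structure; the gains, basis functions and least-squares fit are illustrative choices rather than the paper's exact formulation.

```python
import numpy as np

class DMP1D:
    """One-dimensional Dynamic Movement Primitive (illustrative sketch)."""

    def __init__(self, n_basis=20, k=100.0, d=20.0, alpha_s=4.0):
        self.n_basis, self.k, self.d, self.alpha_s = n_basis, k, d, alpha_s
        # Gaussian basis functions spread over the decaying phase variable s.
        self.centers = np.exp(-alpha_s * np.linspace(0, 1, n_basis))
        self.widths = n_basis / self.centers
        self.weights = np.zeros(n_basis)
        self.x0 = self.goal = 0.0

    def _features(self, s):
        psi = np.exp(-self.widths * (s - self.centers) ** 2)
        return psi * s / psi.sum()

    def fit(self, traj, dt):
        """Learn the forcing term from a single demonstrated trajectory."""
        traj = np.asarray(traj, dtype=float)
        self.x0, self.goal = traj[0], traj[-1]
        vel = np.gradient(traj, dt)
        acc = np.gradient(vel, dt)
        s = np.exp(-self.alpha_s * np.arange(len(traj)) / len(traj))
        # Forcing term needed for the spring-damper to reproduce the demo.
        f_target = acc - self.k * (self.goal - traj) + self.d * vel
        Phi = np.array([self._features(si) for si in s])
        self.weights, *_ = np.linalg.lstsq(Phi, f_target, rcond=None)

    def rollout(self, n_steps, dt):
        """Simulate the learned movement from the start position."""
        x, v, s = self.x0, 0.0, 1.0
        out = []
        for _ in range(n_steps):
            f = self._features(s) @ self.weights
            a = self.k * (self.goal - x) - self.d * v + f
            v += a * dt
            x += v * dt
            s += -self.alpha_s * s * dt
            out.append(x)
        return np.array(out)
```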
nordic conference on human-computer interaction | 2008
Steven Strachan; Roderick Murray-Smith
Rotational dynamic system models can be used to enrich tightly-coupled, bearing-aware embodied control of movement-sensitive mobile devices and support a more bidirectional, negotiated style of interaction. A simulated rotational spring system is used to provide natural eyes-free feedback in both the audio and haptic channels in a geosocial mobile networking context.
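The rotational spring idea can be read as a damped torsional spring whose displacement is driven by the angular difference between the device's bearing and a target bearing, with the simulated spring state mapped to audio and haptic output. A rough sketch under those assumptions follows; the constants and the tension-to-feedback mapping are illustrative, not the system's implementation.

```python
import math

def simulate_rotational_spring(bearings_deg, target_deg,
                               k=8.0, damping=2.0, dt=0.02):
    """Simulate a damped torsional spring between device bearing and a target.

    bearings_deg: sequence of sensed device bearings (degrees)
    target_deg:   bearing of the target of interest (degrees)
    Returns the spring 'tension' at each step, which could be mapped to
    vibration amplitude or audio pitch for eyes-free feedback.
    """
    angle, velocity = 0.0, 0.0   # spring state (radians, rad/s)
    tensions = []
    for bearing in bearings_deg:
        # Signed angular error between device bearing and target, in [-pi, pi].
        error = math.radians(bearing - target_deg)
        error = math.atan2(math.sin(error), math.cos(error))
        # Damped spring update: the spring is pulled toward the current error.
        accel = -k * (angle - error) - damping * velocity
        velocity += accel * dt
        angle += velocity * dt
        tensions.append(abs(angle))
    return tensions
```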
HAID 2013 Revised Selected Papers of the 8th International Workshop on Haptic and Audio Interaction Design - Volume 7989 | 2013
Sabrina A. Panëels; Lucie Brunet; Steven Strachan
Many wearable haptic devices have been developed to provide passive directional cues, in the form of belts or back displays, but these systems have so far failed to make an impact in the public domain. Another potential solution is a light, discreet and aesthetically acceptable vibrotactile bracelet. However, unlike these other systems, the wrist is subject to rotation, making it a contentious locus for vibrotactile feedback in a navigational context. This paper presents a set of experiments aimed both at determining the basic feasibility of using this kind of bracelet and at examining to what extent the orientation of the user's wrist affects their perception of directional cues in both static and mobile conditions. It was found that changes in orientation have little negative effect overall, with distraction being more of a concern.
human-computer interaction with mobile devices and services | 2009
Steven Strachan; Grégoire Lefebvre; Sophie Zijp-Rouzier
An approach is presented to providing tangible feedback to users of a mobile device in both highly visual touchscreen-based and eyes-free interaction scenarios, as well as in the transition between the two. A rotational dynamical systems metaphor for the provision of feedback is proposed, which gives users physically based feedback via the audio, tactile and visual channels. By using a consistent metaphor in this way, it is possible to support seamless movement between highly visual touch-based interaction and eyes-free gestural interaction.
human factors in computing systems | 2008
Roderick Murray-Smith; John Williamson; Stephen A. Hughes; Torben Quaade; Steven Strachan
Stane is a hand-held interaction device controlled by tactile input: scratching or rubbing textured surfaces and tapping. The system has a range of sensors, including contact microphones, capacitive sensing and inertial sensing, and provides audio and vibrotactile feedback. The surface textures vary around the device, providing perceivably different textures to the user. We demonstrate that the vibration signals generated by stroking and scratching these surfaces can be reliably classified, and can serve as a very cheap-to-manufacture means of controlling different aspects of interaction. The system is demonstrated as a control for a music player, and in a mobile spatial interaction scenario.
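Classifying which texture is being scratched relies on each texture producing a distinctive vibration signature at the contact microphones. The sketch below shows one plausible pipeline, using log band-energy features and a nearest-centroid classifier; these choices are assumptions for illustration, not the classifier used in the paper.

```python
import numpy as np

def band_energies(signal, n_bands=16):
    """Summarise a short contact-microphone frame as log band energies."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal)))) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.log1p(np.array([b.sum() for b in bands]))

class TextureClassifier:
    """Nearest-centroid classifier over band-energy features (illustrative)."""

    def __init__(self):
        self.centroids = {}   # texture label -> mean feature vector

    def fit(self, labelled_frames):
        """labelled_frames: dict mapping texture label -> list of raw frames."""
        for label, frames in labelled_frames.items():
            feats = np.array([band_energies(f) for f in frames])
            self.centroids[label] = feats.mean(axis=0)

    def predict(self, frame):
        """Return the label whose centroid is nearest to this frame's features."""
        feats = band_energies(frame)
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(feats - self.centroids[label]))
```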
location and context awareness | 2009
Steven Strachan; Roderick Murray-Smith
With the recent introduction of mass-market mobile phones with location, bearing and acceleration sensing, we are on the cusp of significant progress in location-based interaction and highly interactive mobile social networking. We propose that such systems must work when subject to typical uncertainties in the sensed or inferred context, such as user location, bearing and motion. In order to examine the feasibility of such a system, we describe an experiment with an eyes-free, mobile implementation which allows users to find a target user, engage with them by pointing and tilting actions, then have their attention directed to a specific target. Although weaknesses in the design of the tilt-distance mapping were indicated, users were, encouragingly, able to track the target and engage with the other agent.