Jess McIntosh
University of Bristol
Publications
Featured research published by Jess McIntosh.
Human Factors in Computing Systems | 2016
Jess McIntosh; Charlie McNeill; Mike Fraser; Frederic Kerber; Markus Löchtefeld; Antonio Krüger
Practical wearable gesture tracking requires that sensors align with existing ergonomic device forms. We show that combining EMG and pressure data sensed only at the wrist can support accurate classification of hand gestures. A pilot study with unintended EMG electrode pressure variability led to exploration of the approach in greater depth. The EMPress technique senses both finger movements and rotations around the wrist and forearm, covering a wide range of gestures, with an overall 10-fold cross validation classification accuracy of 96%. We show that EMG is especially suited to sensing finger movements, that pressure is suited to sensing wrist and forearm rotations, and their combination is significantly more accurate for a range of gestures than either technique alone. The technique is well suited to existing wearable device forms such as smart watches that are already mounted on the wrist.Practical wearable gesture tracking requires that sensors align with existing ergonomic device forms. We show that combining EMG and pressure data sensed only at the wrist can support accurate classification of hand gestures. A pilot study with unintended EMG electrode pressure variability led to exploration of the approach in greater depth. The EMPress technique senses both finger movements and rotations around the wrist and forearm, covering a wide range of gestures, with an overall 10-fold cross validation classification accuracy of 96%. We show that EMG is especially suited to sensing finger movements, that pressure is suited to sensing wrist and forearm rotations, and their combination is significantly more accurate for a range of gestures than either technique alone. The technique is well suited to existing wearable device forms such as smart watches that are already mounted on the wrist. Author
Nordic Conference on Human-Computer Interaction | 2016
Frederic Kerber; Markus Löchtefeld; Antonio Krüger; Jess McIntosh; Charlie McNeill; Mike Fraser
We investigate one-handed, same-side gestural interactions with wrist-worn devices. We contribute results of an elicitation study with 26 participants from various backgrounds to learn which gestures people would like to perform when they can interact only with the arm on which they wear the device, e.g. while carrying something in the opposite hand. Based on the analysis of 1,196 video-taped gestures, 145 atomic gestures could be identified, which in turn were used to create a set of 296 unique gesture combinations. From these, we identified a conflict-free set of 43 gestures to trigger 46 common smartwatch tasks. The results show that symbolic gestures such as drawing a question mark for activating a help function are consistently used across participants. We further found symbolic and continuous gestures to be used significantly more often by men. Based on the results, we derived guidelines that should be considered when designing gestures for same-side interaction (SSI).
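Elicitation studies like the one above commonly quantify consistency across participants with an agreement score: proposals for one referent (task) are grouped by identity, and larger groups indicate stronger consensus. A sketch of the widely used score of Wobbrock et al.; whether this exact metric was used in the paper is an assumption:

```python
from collections import Counter

def agreement_score(proposals):
    """Sum over identical-proposal groups of (group size / total)^2."""
    total = len(proposals)
    return sum((n / total) ** 2 for n in Counter(proposals).values())

# Hypothetical example: 6 participants propose a gesture for "help".
# 4 agree on "question mark", 2 are singletons:
# (4/6)^2 + (1/6)^2 + (1/6)^2 = 0.5
print(agreement_score(["question mark"] * 4 + ["shake", "tap"]))
```

A score of 1.0 means every participant proposed the same gesture; scores near 1/n mean no agreement at all.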
Human Factors in Computing Systems | 2017
Jess McIntosh; Asier Marzo; Mike Fraser; Carol Phillips
Recent improvements in ultrasound imaging enable new opportunities for hand pose detection using wearable devices. Ultrasound imaging has remained under-explored in the HCI community despite being non-invasive, harmless and capable of imaging internal body parts, with applications including smart-watch interaction, prosthesis control and instrument tuition. In this paper, we compare the performance of different forearm mounting positions for a wearable ultrasonographic device. Location plays a fundamental role in ergonomics and performance since the anatomical features differ among positions. We also investigate the performance decrease due to cross-session position shifts and develop a technique to compensate for this misalignment. Our gesture recognition algorithm combines image processing and neural networks to classify the flexion and extension of 10 discrete hand gestures with an accuracy above 98%. Furthermore, this approach can continuously track individual digit flexion with less than 5% NRMSE (normalized root-mean-square error), and also differentiate between digit flexion at different joints.
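The continuous-tracking result above is reported as NRMSE. A minimal sketch of the metric, assuming normalization by the range of the ground-truth angles (other normalizations, e.g. by the mean, also exist, and the paper's choice is not stated in the abstract):

```python
import numpy as np

def nrmse(truth, estimate):
    """Root-mean-square error normalized by the range of the ground truth."""
    truth, estimate = np.asarray(truth, float), np.asarray(estimate, float)
    rmse = np.sqrt(np.mean((truth - estimate) ** 2))
    return rmse / (np.max(truth) - np.min(truth))

# Hypothetical example: flexion angles 0-90 deg tracked with ~2 deg noise.
angles = np.linspace(0, 90, 100)
tracked = angles + np.random.default_rng(1).normal(0, 2, 100)
print(f"NRMSE: {nrmse(angles, tracked):.1%}")
```

With a 90-degree range, a 5% NRMSE corresponds to roughly 4.5 degrees of RMS tracking error.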
Interactive Tabletops and Surfaces | 2015
Asier Marzo; Richard McGeehan; Jess McIntosh; Sue Ann Seah; Sriram Subramanian
Digital art technologies take advantage of the input, output and processing capabilities of modern computers. However, fully digital systems lack the tangibility and expressiveness of their traditional counterparts. We present Ghost Touch, a system that remotely actuates the artistic medium with an ultrasound phased array. Ghost Touch transforms a normal surface into an interactive tangible canvas in which the users and the system collaborate in real-time to produce an artistic piece. Ghost Touch is able to detect traces and reproduce them, thereby enabling common digital operations such as copy, paste, save or load whilst maintaining the tangibility of the traditional medium. Ghost Touch has enhanced expressivity since it uses a novel algorithm to generate multiple ultrasound focal points with specific intensity levels. Different artistic effects can be performed on sand, milk and ink, or liquid soap.
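An ultrasound phased array like the one above creates a focal point by delaying each emitter's phase so that all waves arrive in phase at the chosen spot. A single-focus sketch of that phase computation; the 40 kHz frequency and the 8x8, 1 cm-pitch array geometry are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

C = 343.0        # speed of sound in air (m/s)
F = 40_000.0     # assumed transducer frequency (Hz)
wavelength = C / F

xs = (np.arange(8) - 3.5) * 0.01          # 8x8 grid, 1 cm pitch, z = 0 plane
gx, gy = np.meshgrid(xs, xs)
focus = np.array([0.0, 0.0, 0.1])         # focal point 10 cm above the array

# Each emitter's phase delay compensates its path length to the focus.
d = np.sqrt((gx - focus[0])**2 + (gy - focus[1])**2 + focus[2]**2)
phases = (2 * np.pi * d / wavelength) % (2 * np.pi)

# Check: with these phases, all 64 contributions add constructively.
contrib = np.exp(1j * (2 * np.pi * d / wavelength - phases))
print(abs(contrib.sum()))  # ~64, i.e. fully constructive at the focus
```

Generating multiple focal points with controlled intensities, as Ghost Touch does, requires solving for one phase per emitter that trades off all foci at once, which is the harder problem the paper's algorithm addresses.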
User Interface Software and Technology | 2017
Jess McIntosh; Asier Marzo; Mike Fraser
Gestures have become an important tool for natural interaction with computers and thus several wearables have been developed to detect hand gestures. However, many existing solutions are unsuitable for practical use due to low accuracy, high cost or poor ergonomics. We present SensIR, a bracelet that uses near-infrared sensing to infer hand gestures. The bracelet is composed of pairs of infrared emitters and receivers that are used to measure both the transmission and reflection of light through/off the wrist. SensIR improves the accuracy of existing infrared gesture sensing systems through the key idea of taking measurements with all possible combinations of emitters and receivers. Our study shows that SensIR is capable of detecting 12 discrete gestures with 93.3% accuracy. SensIR has several advantages compared to other systems such as high accuracy, low cost, robustness against bad skin coupling and thin form-factor.
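SensIR's key idea above is to take one measurement per emitter-receiver pairing rather than per sensor, yielding a much richer feature vector from the same hardware. A sketch of that enumeration; the sensor counts and the `read_intensity` helper are hypothetical stand-ins for the actual hardware readout:

```python
from itertools import product

N_EMITTERS, N_RECEIVERS = 4, 4  # counts are assumptions, not SensIR's

def read_intensity(emitter: int, receiver: int) -> float:
    # Placeholder for a real IR measurement: light from `emitter`
    # transmitted through, or reflected off, the wrist onto `receiver`.
    return float(emitter * N_RECEIVERS + receiver)

def feature_vector() -> list[float]:
    # One measurement for every (emitter, receiver) combination.
    return [read_intensity(e, r)
            for e, r in product(range(N_EMITTERS), range(N_RECEIVERS))]

vec = feature_vector()
print(len(vec))  # 4 emitters x 4 receivers = 16 measurements per frame
```

Pairing every emitter with every receiver grows the feature count multiplicatively (E x R instead of E + R), which is what lets a small, cheap sensor ring discriminate 12 gestures.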
Human Factors in Computing Systems | 2017
Anne-Claire Bourland; Peter Gorman; Jess McIntosh; Asier Marzo
Speech is our innate way of communicating. However, it has limitations: it is a broadcast process, it has limited reach and it only works in air. Here, we explore the combination of two technologies for realizing natural targeted communication, with potential applications in coordination of tasks or private conversations. For detecting words, we measure the bioelectric signals produced by facial muscles during speech. An electromyographic system composed of 4 surface electrodes had an accuracy of 80% when discriminating between 10 words. More importantly, the system was equally effective in discriminating spoken and silently mouthed words. For transferring the words, we used the sound-from-ultrasound phenomenon to generate audible sound within a narrow beam. We built a phased array of ultrasonic emitters, capable of emitting sound that can be steered electronically without physically moving the array. Two prototypes that combine detection and transfer of words are presented and their limitations analysed.
Human Factors in Computing Systems | 2018
Alex Church; Ethan Kenwrick; Yun Park; Luke Hudlass-Galley; Anmol Krishan Sachdeva; Zhiyu Yang; Jess McIntosh; Peter Bennett
We present CuffLink, a wristband designed to let users transfer files between devices intuitively using grab and release hand gestures. We propose to use ultrasonic transceivers to enable device selection through pointing and employ force-sensitive resistors (FSRs) to detect simple hand gestures. Our prototype demonstration of CuffLink shows that the system can successfully transfer files between two computers using gestures. Preliminary testing with users shows that 83% claim they would use a fully working device over typical sharing methods such as Dropbox and Google Drive. Beyond file sharing, we intend to make CuffLink a re-programmable wearable in the future.
Human Factors in Computing Systems | 2017
Jess McIntosh; Mike Fraser; Paul Worgan; Asier Marzo
Microwaves are a type of electromagnetic radiation that can pass through a variety of commonly found materials but partially reflect off human bodies. Microwaves are non-ionizing and at controlled levels do not pose a danger. A wave capable of passing through materials and imaging humans could have useful applications in human-computer interaction. However, the full potential of microwaves for interactive devices has only recently begun to be explored. Here, we present a scalable, low-cost system using an array of off-the-shelf microwave Doppler sensors and explore its potential for tabletop interactions. The arrays are installed beneath a desk, making it a ubiquitous device that enables a wide range of interactions such as 3D hand tracking, gesture recognition and different forms of tangible interaction. Given the low cost and availability of these sensors, we expect that this work will stimulate future interactive devices that employ microwave sensors.
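A Doppler sensor of the kind described above outputs a low-frequency beat signal whose frequency is proportional to the target's radial velocity, v = f_d * c / (2 * f0). A sketch that recovers hand speed from a simulated sensor output with an FFT; the 10.525 GHz carrier matches common off-the-shelf Doppler modules but is an assumption here, as are the other parameters:

```python
import numpy as np

C = 3.0e8       # speed of light (m/s)
F0 = 10.525e9   # assumed carrier frequency (Hz)
FS = 2000       # sample rate of the baseband Doppler output (Hz)

v_true = 0.5                           # simulated hand moving at 0.5 m/s
f_d = 2 * F0 * v_true / C              # expected Doppler shift (~35 Hz)
t = np.arange(FS) / FS
signal = np.sin(2 * np.pi * f_d * t)   # simulated sensor output, 1 s window

# Dominant frequency of the spectrum -> Doppler shift -> radial speed.
spectrum = np.abs(np.fft.rfft(signal))
f_peak = np.fft.rfftfreq(len(signal), 1 / FS)[spectrum.argmax()]
v_est = f_peak * C / (2 * F0)
print(f"estimated hand speed: {v_est:.2f} m/s")
```

Each sensor only reports radial velocity along its own line of sight; combining several sensors at different positions, as the array does, is what makes 3D tracking and richer gestures possible.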
Interactive Surfaces and Spaces | 2017
Jess McIntosh; Mike Fraser
Wearable devices for activity tracking and gesture recognition have expanded rapidly in recent years. One technique that has shown great potential for this is ultrasonic imaging [10][4]. This technique has been shown to have advantages over other techniques in accuracy, surface area, placement and, importantly, continuous finger angle estimation. However, ultrasonic imaging suffers from two issues: first and foremost, the propagation of ultrasound into flesh degrades greatly without a suitable coupling medium; secondly, the complexity of the driving circuitry for medical-grade imaging currently renders a wearable version infeasible. This paper addresses these two problems by finding a rigid coupling medium that lasts for significantly longer periods of time, and by devising a new sensor configuration that reduces device complexity while retaining the benefits of the technique. Furthermore, a comparison between high- and low-frequency systems reveals that different devices can be created with this technique for better resolution or convenience respectively.
Interactive Tabletops and Surfaces | 2015
Florent Berthaut; Deepak Ranjan Sahoo; Jess McIntosh; Diptesh Das; Sriram Subramanian
Mirror surfaces are part of our everyday life. Among them, curved mirrors are used to enhance our perception of the physical space, e.g., convex mirrors are used to increase our field of view in the street, and concave mirrors are used to zoom in on parts of our face in the bathroom. In this paper, we investigate the opportunities opened up when these mirrors are made dynamic, so that their effects can be modulated to adapt to the environment or to a user's actions. We introduce the concept of dynamic mirror brushes that can be moved around a mirror surface. We describe how these brushes can be used for various optical manipulations of the physical space. We also present an implementation using a flexible mirror sheet and three scenarios that demonstrate some of the interaction opportunities.