
Publications

Featured research published by Euan Freeman.


International Conference on Multimodal Interfaces | 2014

Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions

Euan Freeman; Stephen A. Brewster; Vuokko Lantz

Above-device gesture interfaces let people interact in the space above mobile devices using hand and finger movements. For example, users could gesture over a mobile phone or wearable without having to use the touchscreen. We look at how above-device interfaces can also give feedback in the space over the device. Recent haptic and wearable technologies give new ways to provide tactile feedback while gesturing, letting touchless gesture interfaces give touch feedback. In this paper we take a first detailed look at how tactile feedback can be given during above-device interaction. We compare approaches for giving feedback (ultrasound haptics, wearables and direct feedback) and also look at feedback design. Our findings show that tactile feedback can enhance above-device gesture interfaces.


Human-Computer Interaction with Mobile Devices and Services | 2014

Towards usable and acceptable above-device interactions

Euan Freeman; Stephen A. Brewster; Vuokko Lantz

Gestures above a mobile phone would let users interact with their devices quickly and easily from a distance. While both researchers and smartphone manufacturers develop new gesture sensing technologies, little is known about how best to design these gestures and interaction techniques. Our research looks at creating usable and socially acceptable above-device interaction techniques. We present an initial gesture collection, a preliminary evaluation of these gestures and some design recommendations. Our findings identify interesting areas for future research and will help designers create better gesture interfaces.


Human Factors in Computing Systems | 2013

Messy tabletops: clearing up the occlusion problem

Euan Freeman; Stephen A. Brewster

When introducing interactive tabletops into the home and office, lack of space will often mean that these devices play two roles: interactive display and a place for putting things. Clutter on the table surface may occlude information on the display, preventing the user from noticing it or interacting with it. We present a technique for dealing with clutter on tabletops which finds a suitable unoccluded area of the display in which to show content. We discuss the implementation of this technique and some design issues which arose during implementation.


International Symposium on Pervasive Displays | 2017

Levitate: interaction with floating particle displays

Julie Rico Williamson; Euan Freeman; Stephen A. Brewster

This demonstration showcases the current state of the art for the levitating particle display from the Levitate Project. In this demonstration, we show a new type of display consisting of floating voxels, small levitating particles that can be positioned and moved independently in 3D space. Phased ultrasound arrays are used to acoustically levitate the particles. Users can interact directly with each particle using pointing gestures. This allows users to walk up and interact without any user instrumentation, creating an exciting opportunity to deploy these tangible displays in public spaces in the future. This demonstration explores the design potential of floating voxels and how these may be used to create new types of user interfaces.


Human Factors in Computing Systems | 2017

Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently

Euan Freeman; Graham A. Wilson; Stephen A. Brewster; Gabriel Baud-Bovy; Charlotte Magnusson; Héctor A. Caltenco

Young children with visual impairments tend to engage less with their surroundings, limiting the benefits from activities at school. We investigated novel ways of using sound from a bracelet, such as speech or familiar noises, to tell children about nearby people, places and activities, to encourage them to engage more during play and help them move independently. We present a series of studies, the first two involving visual impairment educators, that give insight into challenges faced by visually impaired children at school and how sound might help them. We then present a focus group with visually impaired children that gives further insight into the effective use of sound. Our findings reveal novel ways of combining sounds from wearables with sounds from the environment, motivating audible beacons, devices for audio output and proximity estimation. We present scenarios, findings and a design space that show the novel ways such devices could be used alongside wearables to help visually impaired children at school.


International Conference on Multimodal Interfaces | 2016

Multimodal affective feedback: combining thermal, vibrotactile, audio and visual signals

Graham A. Wilson; Euan Freeman; Stephen A. Brewster

In this paper we describe a demonstration of our multimodal affective feedback designs, used in research to expand the emotional expressivity of interfaces. The feedback leverages inherent associations and reactions to thermal, vibrotactile, auditory and abstract visual designs to convey a range of affective states without any need for learning feedback encoding. All combinations of the different feedback channels can be utilised, depending on which combination best conveys a given state. All the signals are generated from a mobile phone augmented with thermal and vibrotactile stimulators, which will be available to conference visitors to see, touch, hear and, importantly, feel.


Human Factors in Computing Systems | 2013

Designing a smartpen reminder system for older adults

Julie Rico Williamson; Marilyn Rose McGee-Lennon; Euan Freeman; Stephen A. Brewster

Designing interactive systems for older adults often means designing with older adults from the earliest stages of development. This paper describes the co-design of a smartpen and paper calendar-based reminder system for the home. The design sessions involved older adults and used experience prototypes [1]. We completed these co-design sessions with older adults in order to explore the possibility of exploiting paper-based calendars for multimodal reminder systems using a smartpen. The initial results demonstrate successful interaction techniques that make a strong link between paper interaction and scheduling reminders, such as using smartpen annotations and using the location of written reminders within a paper diary to schedule digital reminders. The results also describe important physical aspects of paper diaries as discussed by older adults, such as daily/weekly layouts and binding.


Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces | 2017

Floating Widgets: Interaction with Acoustically-Levitated Widgets

Euan Freeman; Ross Gilbert Anderson; Carl Andersson; Julie Rico Williamson; Stephen A. Brewster

Acoustic levitation enables new types of human-computer interfaces, where the content that users interact with is made up from small objects held in mid-air. We show that acoustically-levitated objects can form mid-air widgets that respond to interaction. Users can interact with them using in-air hand gestures. Sound and widget movement are used as feedback about the interaction.


International Conference on Human-Computer Interaction | 2015

Interactive Light Feedback: Illuminating Above-Device Gesture Interfaces

Euan Freeman; Stephen A. Brewster; Vuokko Lantz

In-air hand gestures allow users to interact with mobile phones without reaching out and touching them. Users need helpful and meaningful feedback while they gesture, although mobile phones have limited feedback capabilities because of their small screen sizes. Interactive light feedback illuminates the surface surrounding a mobile phone, giving users visual feedback over a larger area and without affecting on-screen content. We explore the design space for interactive light and our demonstration shows how we can use this output modality for gesture feedback.


Human Factors in Computing Systems | 2018

Point-and-Shake: Selecting from Levitating Object Displays

Euan Freeman; Julie Rico Williamson; Sriram Subramanian; Stephen A. Brewster

Acoustic levitation enables a radical new type of human-computer interface composed of small levitating objects. For the first time, we investigate the selection of such objects, an important part of interaction with a levitating object display. We present Point-and-Shake, a mid-air pointing interaction for selecting levitating objects, with feedback given through object movement. We describe the implementation of this technique and present two user studies that evaluate it. The first study found that users could accurately (96%) and quickly (4.1s) select objects by pointing at them. The second study found that users were able to accurately (95%) and quickly (3s) select occluded objects. These results show that Point-and-Shake is an effective way of initiating interaction with levitating object displays.

Collaboration


Dive into Euan Freeman's collaborations.

Top Co-Authors


John Hamer

University of Auckland
