Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Mike Fraser is active.

Publication


Featured research published by Mike Fraser.


conference on computer supported cooperative work | 1998

Fragmented interaction: establishing mutual orientation in virtual environments

Jon Hindmarsh; Mike Fraser; Christian Heath; Steve Benford; Chris Greenhalgh

This paper explores and evaluates the support for object-focused collaboration provided by a desktop Collaborative Virtual Environment. The system was used to support an experimental ‘design’ task. Video recordings of the participants’ activities facilitated an observational analysis of interaction in, and through, the virtual world. Observations include problems due to fragmented views of embodiments in relation to shared objects; participants compensating with spoken accounts of their actions; and difficulties in understanding others’ perspectives. Design implications include: more explicit representations of actions than are provided by pseudo-humanoid embodiments; and navigation techniques that are sensitive to the actions of others.


human factors in computing systems | 2002

The augurscope: a mixed reality interface for outdoors

Holger Schnädelbach; Boriana Koleva; Martin Flintham; Mike Fraser; Shahram Izadi; Paul Chandler; Malcolm Foster; Steve Benford; Chris Greenhalgh; Tom Rodden

The augurscope is a portable mixed reality interface for outdoors. A tripod-mounted display is wheeled to different locations and rotated and tilted to view a virtual environment that is aligned with the physical background. Video from an onboard camera is embedded into this virtual environment. Our design encompasses physical form, interaction and the combination of a GPS receiver, electronic compass, accelerometer and rotary encoder for tracking. An initial application involves the public exploring a medieval castle from the site of its modern replacement. Analysis of use reveals problems with lighting, movement and relating virtual and physical viewpoints, and shows how environmental factors and physical form affect interaction. We suggest that these problems might be accommodated by carefully constructing virtual and physical content.
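
As a rough illustration of the kind of sensor combination the abstract describes, the sketch below maps readings from a GPS receiver, compass, accelerometer and rotary encoder to a virtual camera pose. It is a minimal sketch under assumed conventions (a flat local frame, degree-based headings, hypothetical origin coordinates), not the augurscope's actual implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class SensorReadings:
    lat: float          # GPS latitude (degrees)
    lon: float          # GPS longitude (degrees)
    compass_deg: float  # magnetic heading of the tripod base (degrees)
    tilt_deg: float     # pitch from the accelerometer (degrees)
    encoder_deg: float  # display rotation about the tripod axis (degrees)

def virtual_camera_pose(r: SensorReadings, origin=(52.95, -1.19)):
    """Map sensor readings to a camera pose in a local virtual world.

    Assumes a flat local frame anchored at `origin` (hypothetical
    coordinates); the metres-per-degree factors are approximations
    that are good enough for a small site.
    """
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(origin[0]))
    x = (r.lon - origin[1]) * m_per_deg_lon
    y = (r.lat - origin[0]) * m_per_deg_lat
    # Heading combines the coarse compass reading with the fine-grained
    # rotary encoder, so small rotations stay smooth between compass updates.
    yaw = (r.compass_deg + r.encoder_deg) % 360.0
    pitch = r.tilt_deg
    return (x, y), yaw, pitch
```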


human-robot interaction | 2010

Cooperative gestures: effective signaling for humanoid robots

Laurel D. Riek; Tal-Chen Rabinowitch; Paul Bremner; Anthony G. Pipe; Mike Fraser; Peter Robinson

Cooperative gestures are a key aspect of human-human pro-social interaction. Thus, it is reasonable to expect that endowing humanoid robots with the ability to use such gestures when interacting with humans would be useful. However, while people are used to responding to such gestures expressed by other humans, it is unclear how they might react to a robot making them. To explore this topic, we conducted a within-subjects, video-based laboratory experiment, measuring time to cooperate with a humanoid robot making interactional gestures. We manipulated the gesture type (beckon, give, shake hands), the gesture style (smooth, abrupt), and the gesture orientation (front, side). We also employed two measures of individual differences: negative attitudes toward robots (NARS) and human gesture decoding ability (DANVA2-POS). Our results show that people cooperate more quickly with abrupt gestures than with smooth ones and with front-oriented gestures than with those made to the side; that people's speed at decoding robot gestures correlates with their ability to decode human gestures; and that negative attitudes toward robots correlate strongly with a decreased ability to decode human gestures.
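
For readers unfamiliar with factorial designs, the manipulation described above yields a 3 × 2 × 2 within-subjects design; the snippet below simply enumerates the twelve resulting conditions. The condition labels come from the abstract, but the enumeration itself is illustrative, not the authors' experimental software.

```python
from itertools import product

# Hypothetical enumeration of the 3 x 2 x 2 within-subjects design:
# every participant sees all twelve gesture clips.
gesture_types = ["beckon", "give", "shake hands"]
styles = ["smooth", "abrupt"]
orientations = ["front", "side"]

conditions = list(product(gesture_types, styles, orientations))
assert len(conditions) == 12
```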


user interface software and technology | 2011

Visual separation in mobile multi-display environments

Jessica R. Cauchard; Markus Löchtefeld; Pourang Irani; J. Schoening; Antonio Krüger; Mike Fraser; Sriram Subramanian

Projector phones, handheld game consoles and many other mobile devices increasingly include more than one display, and therefore present a new breed of mobile Multi-Display Environments (MDEs) to users. Existing studies illustrate the effects of visual separation between displays in MDEs and suggest interaction techniques that mitigate these effects. Currently, mobile devices with heterogeneous displays such as projector phones are often designed without reference to visual separation issues; it is therefore critical to establish whether the concerns and opportunities raised in the existing MDE literature apply to the emerging category of Mobile MDEs (MMDEs). This paper investigates the effects of visual separation in the context of MMDEs, contrasts these with results for fixed MDEs, and explores design factors for mobile MDEs. Our study uses a novel eye-tracking methodology for measuring switches in visual context between displays and identifies that MMDEs offer increased design flexibility over traditional MDEs in terms of visual separation. We discuss these results and identify several design implications.
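
One plausible way to operationalise "switches in visual context between displays" is to classify each gaze sample into a display region and count transitions between regions. The sketch below does exactly that; the region names, coordinates and sample format are illustrative assumptions, not the paper's apparatus.

```python
def count_display_switches(samples, regions):
    """Count visual-context switches between displays in a gaze stream.

    `samples` is an iterable of (x, y) gaze points; `regions` maps a
    display name to an (x0, y0, x1, y1) bounding box. Both are
    illustrative assumptions about the recording setup.
    """
    def hit(x, y):
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None  # gaze fell on neither display

    switches, last = 0, None
    for x, y in samples:
        current = hit(x, y)
        if current is not None:
            if last is not None and current != last:
                switches += 1
            last = current
    return switches

# Example: a phone screen and its projection as two areas of interest.
regions = {"device": (0, 0, 480, 800), "projection": (600, 0, 1880, 720)}
```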


human factors in computing systems | 2001

Exploiting interactivity, influence, space and time to explore non-linear drama in virtual worlds

Michael P. Craven; Ian Taylor; Adam Drozd; Jim Purbrick; Chris Greenhalgh; Steve Benford; Mike Fraser; John Bowers; Kai-Mikael Jää-Aro; Bernd Lintermann; Michael Hoch

We present four contrasting interfaces to allow multiple viewers to explore 3D recordings of dramas in on-line virtual worlds. The first is an on-line promenade performance to an audience of avatars. The second is a form of immersive cinema, with multiple simultaneous viewpoints. The third is a tabletop projection surface that allows viewers to select detailed views from a bird's-eye overview. The fourth is a linear television broadcast created by a director or editor. A comparison of these examples shows how a viewing audience can exploit four general resources - interactivity, influence, space, and time - to make sense of complex, non-linear virtual drama. These resources provide interaction designers with a general framework for defining the relationship between the audience and the 3D content.


CVE | 2001

Virtually Missing the Point: Configuring CVEs for Object-Focused Interaction

Jon Hindmarsh; Mike Fraser; Christian Heath; Steve Benford

In this chapter, we focus on collaborative virtual reality systems that use a combination of 3D graphics and audio to enable people to interact within a virtual setting and to discuss, fashion and manipulate their common environment. The emergence of these systems introduces unique opportunities to develop new sites of sociality and to support distributed collaborative work and interaction. Indeed, as 3D visualizations and single-user VR technologies become increasingly adopted within numerous industrial and entertainment domains (Schroeder, 1996; Stanney et al., 1998), there are heightened opportunities for associated collaborative applications to emerge (Biocca and Levy, 1995). Most successful thus far have been the entertainment applications, such as online gaming, inhabited TV, artistic installations and museum exhibits (see Benford et al., 1997a,b, 1999a; Greenhalgh et al., 1999b,c; and Chapter 5 of this volume). However, here we are particularly concerned with the potential for collaborative VR to provide support for remote working - a relatively under-explored area.


collaborative virtual environments | 2002

Staging and evaluating public performances as an approach to CVE research

Steve Benford; Mike Fraser; Gail Reynard; Boriana Koleva; Adam Drozd

Staging public performances can be a fruitful approach to CVE research. We describe four experiences: Out of This World, a gameshow; Avatar Farm, a participatory drama; Desert Rain, a mixed reality performance; and Can You See Me Now?, a game that mixed on-line players with players on the streets. For each, we describe how a combination of ethnography, audience feedback and analysis of system logs led to new design insights, especially in the areas of orchestration and making activity available to viewers. We propose enhancing this approach with new tools for manipulating, analysing and sharing 3D recordings of CVEs.


conference on computer supported cooperative work | 2006

Remote Collaboration Over Video Data: Towards Real-Time e-Social Science

Mike Fraser; Jon Hindmarsh; Katie Best; Christian Heath; Greg Biegel; Chris Greenhalgh; Stuart Reeves

The design of distributed systems to support collaboration among groups of scientists raises new networking challenges that grid middleware developers are addressing. This field of development work, ‘e-Science’, is increasingly recognising the critical need to understand the ordinary day-to-day work of doing research in order to inform design. We have investigated one particular area of collaborative social scientific work – the analysis of video data. Based on interviews and observational studies, we discuss current practices of social scientific work with digital video in three areas: preparation for collaboration; control of data and application; and annotation configurations and techniques. For each, we describe how these requirements feature in our design of a distributed video analysis system as part of the MiMeG project: our security policy and distribution; the design of the control system; and providing freeform annotation over data. Finally, we review our design in light of initial use of the software between project partners, and discuss how we might transform the spatial configuration of the system to support annotation behaviour.
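
To make the idea of "freeform annotation over data" concrete, here is a minimal sketch of a timestamped annotation record that could be serialised and shared with remote partners. The field names and JSON wire format are assumptions for illustration; the abstract does not specify MiMeG's actual data model or protocol.

```python
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class FreeformAnnotation:
    """A timestamped freeform mark over shared video data (hypothetical)."""
    author: str
    video_id: str
    media_time: float                           # position in the video (seconds)
    stroke: list = field(default_factory=list)  # [(x, y), ...] normalised to 0-1
    created: float = field(default_factory=time.time)

def encode(a: FreeformAnnotation) -> bytes:
    # Serialise for distribution to remote analysis sites.
    return json.dumps(asdict(a)).encode("utf-8")

note = FreeformAnnotation("analyst-1", "session-04", 132.5,
                          stroke=[(0.21, 0.40), (0.25, 0.42)])
payload = encode(note)
```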


ubiquitous computing | 2002

Citywide: Supporting Interactive Digital Experiences Across Physical Space

Shahram Izadi; Mike Fraser; Steve Benford; Martin Flintham; Chris Greenhalgh; Tom Rodden; Holger Schnädelbach

The Citywide project is exploring ways in which technology can provide people with rich and engaging digital experiences as they move through physical space, including historical experiences, performances and games. This paper describes some initial results and experiences with this project based upon two prototype demonstrators. In the first, we describe an application in which a search party explores an archaeological site, uncovering enacted scenes within the virtual world that are of a historical relevance to their particular physical location. In the second, we describe a museum experience where participants explore an outdoors location, hunting for buried virtual artifacts that they then bring back to a museum for a more detailed study. Our demonstrators employ a varied set of devices, including mobile wireless interfaces for locating hotspots of virtual activity when outdoors, to give different experiences of the virtual world depending upon location, task, available equipment and accuracy of tracking. We conclude by discussing some of the potential advantages of using an underlying shared virtual world to support interactive experiences across extended physical settings.
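
As a concrete illustration of "locating hotspots of virtual activity" from GPS fixes, the sketch below tests whether a visitor's position falls within a hotspot's radius, widened by the reported GPS accuracy. The hotspot names, radii and accuracy-widening rule are illustrative assumptions, not the Citywide implementation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def active_hotspots(fix, hotspots, accuracy_m):
    """Return the hotspots the visitor is inside, widened by GPS accuracy.

    `hotspots` maps a name to ((lat, lon), radius_m); all values here
    are hypothetical.
    """
    lat, lon = fix
    return [name for name, ((hlat, hlon), radius) in hotspots.items()
            if haversine_m(lat, lon, hlat, hlon) <= radius + accuracy_m]

hotspots = {"excavation-trench": ((52.936, -1.197), 8.0)}
print(active_hotspots((52.9361, -1.1971), hotspots, accuracy_m=5.0))
```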


ubiquitous computing | 2012

Steerable projection: exploring alignment in interactive mobile displays

Jessica R. Cauchard; Mike Fraser; Teng Han; Sriram Subramanian

Emerging smartphones and other handheld devices are now being fitted with a set of new embedded technologies such as pico-projection. They are usually designed with the pico-projector embedded in the top of the device. Despite the potential of personal mobile projection to support new forms of interactivity such as augmented reality techniques, these devices have not yet made a significant impact on the ways in which mobile data is experienced. We suggest that this ‘traditional’ configuration of fixed pico-projectors within the device is unsuited to many projection tasks because it couples the orientation of the device to the management of the projection space, preventing users from easily and simultaneously using the mobile device and looking at the projection. We present a study that demonstrates this problem, establishes the requirement for steerable projection behaviour, and identifies initial user preferences for different projection coupling angles according to context. Our study highlights the importance of flexible interactive projections that can support interaction techniques on the device and on the projection space according to task. This inspires a number of interaction techniques that create different personal and shared interactive display alignments to suit a range of different mobile projection situations.
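
To make the notion of a "projection coupling angle" concrete, the sketch below models a single-axis coupling between device pitch and the pitch of the projected beam: a fixed projector corresponds to a constant coupling, while a steerable one lets the user vary it. The angles and the one-axis model are illustrative assumptions, not the paper's prototype.

```python
def projection_pitch(device_pitch_deg: float, coupling_deg: float) -> float:
    """Pitch of the projected beam given device pitch and a coupling angle.

    With a fixed pico-projector the coupling angle is constant (often 0,
    beam aligned with the device's top edge); a steerable projector lets
    the user vary it, so the device can be held comfortably while the
    projection stays on a wall or table. Hypothetical single-axis model.
    """
    return device_pitch_deg + coupling_deg

# Fixed projector: holding the phone nearly flat (pitch -80 deg) drives the
# beam into the table; a 90 deg coupling keeps it roughly level on a wall.
for coupling in (0.0, 45.0, 90.0):
    print(coupling, projection_pitch(-80.0, coupling))
```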

Collaboration


Dive into Mike Fraser's collaborations.

Top Co-Authors


Steve Benford

University of Nottingham


Boriana Koleva

University of Nottingham


Tom Rodden

University of Nottingham
