Martin Naef
Glasgow School of Art
Publication
Featured research published by Martin Naef.
symposium on 3d user interfaces | 2006
Paul Keir; John Payne; Jocelyn Elgoyhen; Martyn Horner; Martin Naef; Paul Anderson
This paper presents 3motion, a novel 3D gesture interaction system consisting of a low-cost, lightweight hardware component and a general-purpose software development kit. The system provides gesture-based 3D interaction for situations where traditional tracking systems are too expensive or impractical due to the calibration and reference source requirements. The hardware component is built around a 3-axis linear accelerometer chip and transmits a continuous data stream to a host device via a wireless Bluetooth link. The software component receives this data and matches it against a library of 3D gestures to trigger actions. The system has been validated extensively with various example applications, including a “Battle of the Wizards” game, a character manipulation demonstrator, and a golf game implemented on a mobile phone.
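The gesture-triggering pipeline described here can be illustrated with a short sketch. The Python below is not the 3motion SDK; the names (GestureLibrary, the resampling step, the distance threshold) are illustrative assumptions about how a sliding window of 3-axis accelerometer samples might be compared against stored templates to trigger actions. A caller would register each template once and then call feed() for every sample arriving over the Bluetooth link.

```python
# Minimal sketch of the gesture-triggering idea: a sliding window of 3-axis
# accelerometer samples is compared against a small library of stored
# templates, and an action callback fires when a template matches.
# All names and thresholds are illustrative, not the actual 3motion SDK API.
from collections import deque
import math

def resample(samples, n):
    """Linearly resample a list of (x, y, z) tuples to n points."""
    if len(samples) == 1:
        return [samples[0]] * n
    out = []
    for i in range(n):
        t = i * (len(samples) - 1) / (n - 1)
        lo = int(math.floor(t))
        hi = min(lo + 1, len(samples) - 1)
        f = t - lo
        out.append(tuple(a + (b - a) * f for a, b in zip(samples[lo], samples[hi])))
    return out

def mean_distance(a, b):
    """Mean Euclidean distance between two equal-length sample sequences."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

class GestureLibrary:
    def __init__(self, window=64, threshold=0.5):
        self.templates = {}              # gesture name -> resampled template
        self.actions = {}                # gesture name -> callback
        self.window = deque(maxlen=window)
        self.threshold = threshold       # illustrative value, units depend on sensor

    def register(self, name, samples, action):
        self.templates[name] = resample(samples, 32)
        self.actions[name] = action

    def feed(self, sample):
        """Push one (x, y, z) accelerometer sample; fire a callback on a match."""
        self.window.append(sample)
        if len(self.window) < self.window.maxlen:
            return
        current = resample(list(self.window), 32)
        for name, template in self.templates.items():
            if mean_distance(current, template) < self.threshold:
                self.actions[name]()
                self.window.clear()      # avoid re-triggering on the same motion
                break
```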
human factors in computing systems | 2006
John Payne; Paul Keir; Jocelyn Elgoyhen; Mairghread McLundie; Martin Naef; Martyn Horner; Paul Anderson
We describe preliminary tests that form the first phase of research into issues involved with the design of spatial 3D gestures for video games. Early research on 3D gesture and spatial interaction was largely the domain of Virtual Reality (VR) [1]. More recent work looks at 3D gestures for mobile devices [2] and pervasive computing [3]. We are investigating issues affecting usability and fun [4] in the context of 3D gestures and spatial movement in video games where emotion, immediacy and immersion are more important than breadth of functionality and user task efficiency. These tests use our 3motion™ system, a wireless inertial motion tracking device and gesture SDK. This enables a range of gesture types from tight, precise movements to whole arm gestures, and from direct mapping of movement to recognition of 3D symbolic gestures. Four game scenarios using different spatial gesture characteristics were used to identify gameplay issues that have an impact on the design of 3D interaction.
ieee virtual reality conference | 2008
Vassilis Charissis; B. M. Ward; Martin Naef; D. Rowley; L. Brady; P. Anderson
This paper presents an initial study exploring and evaluating a novel, accessible and user-centred interface developed for a VR medical training environment. In particular, the proposed system facilitates a detailed 3D information exchange, with the aim of improving the users' internal 3D understanding and visualisation of complex anatomical inter-relationships. In order to evaluate the effectiveness of the proposed VR teaching method, we developed a female 3D model under the guidance of consultant breast surgeons, with particular emphasis on the axilla region. We then conducted a comparative study, involving twelve participants, between PBL tutorials augmented with VR and conventional teaching techniques. Overall, the paper outlines the development process of the proposed VR medical training environment, discusses the results of the comparative study, and offers suggestions for further research together with a tentative plan for future work.
international conference on human computer interaction | 2007
Vassilis Charissis; Martin Naef; Stylianos Papanastasiou; Marianne Patera
This paper introduces a novel design approach for an automotive direct-manipulation interface. The proposed design, as applied in a full-windshield Head-Up Display (HUD) system, aims to improve the driver's situational awareness by considering information as it becomes available from various sources, such as incoming mobile phone calls, text messages and emails. The vehicle's windshield effectively becomes an interactive display area, which allows the system to increase the quality and throttle the quantity of the information delivered to the driver in typical driving situations by utilising the existing mobile phone network. Opting for a simple approach to interaction, the interface elements are based on minimalist visual representations of real objects. This paper discusses the challenges involved in the HUD design, introduces the visual components of the interface, and presents the outcome of a preliminary evaluation of the system with a group of ten users, performed using a driving simulator.
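The abstract does not specify how the system throttles information, so the following is only a hypothetical sketch of that general idea: incoming items are surfaced on the HUD or deferred according to a crude estimate of driving load. The names (HudNotificationFilter) and the speed/steering thresholds are invented for illustration.

```python
# Hypothetical sketch of information throttling: incoming items (calls,
# texts, emails) are shown or held back depending on an estimate of current
# driving load. The rules and names here are illustrative only; the paper
# does not specify the filtering algorithm.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Notification:
    kind: str       # "call", "text", "email"
    urgency: int    # 0 (low) .. 2 (high)

@dataclass
class HudNotificationFilter:
    queue: List[Notification] = field(default_factory=list)

    def driving_load(self, speed_kmh: float, steering_rate: float) -> int:
        """Crude load estimate: 0 = calm, 1 = moderate, 2 = demanding."""
        if speed_kmh > 90 or steering_rate > 0.5:
            return 2
        if speed_kmh > 50:
            return 1
        return 0

    def offer(self, item: Notification, speed_kmh: float,
              steering_rate: float) -> Optional[Notification]:
        """Return the item for display only if urgency outweighs driving load."""
        if item.urgency >= self.driving_load(speed_kmh, steering_rate):
            return item          # caller renders it on the windshield HUD
        self.queue.append(item)  # defer until the situation calms down
        return None
```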
symposium on 3d user interfaces | 2007
Martin Naef; John Payne
This paper summarizes the experience gained from designing and revising a design review application prototype interface using immersive virtual reality technology, and puts it into context with previous research in the field of 3D human-computer interaction. AutoEval was originally developed in collaboration with a major car manufacturer to enable intuitive analysis and manipulation of 3D models by users without a CAD or computer science background. This paper introduces the system and discusses the 3D interaction design decisions taken, based on observation of and informal feedback from a large number of users.
symposium on 3d user interfaces | 2011
Martin Naef; Ettore Ferranti
This poster presents a multi-touch navigation interface for a building energy management system with a three-dimensional data model. It extends well-established “rubber-band” 2D interaction gestures to work with a 3D world-in-hand paradigm, using a navigation widget to select the active manipulation axis. Because the nested, semi-transparent display of the data hierarchy requires careful selection of the manipulation pivot, a hit-testing scheme is introduced to select the most likely object within the hierarchy.
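The hit-testing scheme itself is not detailed in the abstract; the sketch below shows one plausible approach under that constraint, intersecting a pick ray with nested axis-aligned bounding boxes and preferring the deepest (most specific) node hit. The Node and pick names are hypothetical.

```python
# Illustrative hit-testing in a nested hierarchy: intersect a pick ray with
# axis-aligned bounding boxes and prefer the deepest node among the hits.
# The actual scheme from the poster is not reproduced here.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Node:
    name: str
    box_min: Vec3
    box_max: Vec3
    children: List["Node"] = field(default_factory=list)

def ray_hits_box(origin: Vec3, direction: Vec3, bmin: Vec3, bmax: Vec3) -> bool:
    """Standard slab test for a ray against an axis-aligned box."""
    t_near, t_far = -float("inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, bmin, bmax):
        if abs(d) < 1e-9:
            if o < lo or o > hi:     # ray parallel to this slab and outside it
                return False
            continue
        t0, t1 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t0, t1))
        t_far = min(t_far, max(t0, t1))
    return t_far >= max(t_near, 0.0)

def pick(node: Node, origin: Vec3, direction: Vec3,
         depth: int = 0) -> Optional[Tuple[int, Node]]:
    """Return (depth, node) for the deepest node hit, preferring children."""
    if not ray_hits_box(origin, direction, node.box_min, node.box_max):
        return None
    best = (depth, node)
    for child in node.children:
        hit = pick(child, origin, direction, depth + 1)
        if hit and hit[0] > best[0]:
            best = hit
    return best
```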
acm multimedia | 2008
Martin Naef; Cathie Boyd
The Living Canvas initiative aims to use a performer on stage as a dynamic projection surface. Using machine vision in the near-infrared spectrum enables the system to follow and adapt to the performer, restricting projection to the silhouette. Ultimately, the system aims to create the illusion of a completely dynamic costume. This paper introduces the concept and presents an implementation and analysis of the performance-critical stages of the projection pipeline, proving the feasibility of the idea as well as analysing the limitations introduced by current digital projection technology. Bringing together the research from computer graphics and machine vision with the artistic vision and guidance from Cryptic, the initiative aims to create and explore a new expressive medium by taking projection systems on stage to a highly interactive level and providing a powerful new tool for live video artists.
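The projection-masking stage lends itself to a brief sketch. The code below uses OpenCV as a stand-in to show the general idea of thresholding a near-infrared frame into a performer silhouette and restricting the projected content to that mask; the paper's actual pipeline, calibration and parameter choices are not reproduced here.

```python
# Sketch of silhouette-limited projection: threshold a near-IR camera frame
# into a binary performer mask, then restrict the projected content to it.
# Assumes the IR camera and projector views are already registered, which in
# practice requires a calibration/warping step not shown here.
import cv2
import numpy as np

def silhouette_mask(ir_frame: np.ndarray, threshold: int = 60) -> np.ndarray:
    """Binary mask of the performer from a single-channel near-IR frame."""
    blurred = cv2.GaussianBlur(ir_frame, (5, 5), 0)
    _, mask = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY)
    return mask

def masked_projection(content: np.ndarray, ir_frame: np.ndarray) -> np.ndarray:
    """Restrict the projected content (BGR image) to the performer's silhouette."""
    mask = silhouette_mask(ir_frame)
    return cv2.bitwise_and(content, content, mask=mask)
```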
international conference on computer graphics and interactive techniques | 2007
Martin Naef
The Living Canvas initiative aims to explore the novel artistic possibilities of using the performer's body and clothes as a projection surface in the context of a stage performance. A new projection system will enable a dynamic or even improvised performance by detecting the posture and silhouette of the performer and projecting imagery precisely onto selected parts of the body. This will enable the performer to “wear virtual costumes” that adapt to the body, or even to receive a different face. The dynamic nature of the system gives full control to the performer, who can move freely around the stage with the projection always “following” them. The Living Canvas is a collaborative initiative between the Glasgow School of Art and Theatre Cryptic and has received funding from the UK Arts and Humanities Research Council to implement the vision.
international conference on computer graphics and interactive techniques | 2005
John Payne; Paul Keir; Jocelyn Elgoyhen; Tom Kenny; Martin Naef
The 3motion system enables the recognition of gestures based on 3D trajectories in space. It consists of both a software and a hardware component. The Software Development Kit (SDK) enables programmers to implement and control how they want to recognise movements in space, such as a punch, a golf swing, a baseball pitch or even a dance move. Although the SDK can work with any positioning device, from a 2D mouse to a 6-degree-of-freedom tracker, we have developed our own low-cost hardware to provide wide-range wireless functionality. The main innovation of the 3motion system is the combination of a 3D curve-matching algorithm with acceleration signatures from inexpensive inertial sensors.
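As a rough illustration of matching curves of acceleration signatures, the sketch below compares an observed signature against recorded templates using dynamic time warping, which tolerates variations in gesture speed. DTW is an assumption chosen for illustration; the abstract does not state which curve-matching algorithm 3motion actually uses.

```python
# Illustrative curve matching on acceleration signatures with dynamic time
# warping (DTW). Sequences are lists of (x, y, z) acceleration samples.
# The threshold value is arbitrary and would need tuning per sensor.
import math

def dtw_distance(a, b):
    """DTW distance between two sequences of (x, y, z) acceleration samples."""
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def best_match(observed, templates, threshold=5.0):
    """Return the name of the closest template, or None if nothing is close."""
    scored = [(dtw_distance(observed, t), name) for name, t in templates.items()]
    score, name = min(scored)
    return name if score < threshold else None
```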
Interactive Learning Environments | 2008
Marianne Patera; Steve Draper; Martin Naef