Phil L. Watten
University of Sussex
Publications
Featured research published by Phil L. Watten.
international conference on smart homes and health telematics | 2010
Patrick Holroyd; Phil L. Watten; Paul Newbury
Although the idea of the smart home has been around for over three decades, the smart technology that enables it has yet to reach the mass market. Spending on smart technology is expected to rise, but it is still negligible when compared to overall spending on consumer electronics. This paper examines the benefits of the smart home, people's attitudes towards smart homes and smart technologies, and the possible reasons for the lack of interest in and adoption of such technologies.
international conference on computer graphics and interactive techniques | 1997
Jon P. Ewins; Phil L. Watten; Martin White; M. D. J. McNeill; Paul F. Lister
The design of a hardware architecture for a computer graphics pipeline requires a thorough understanding of the algorithms involved at each stage, and the implications these algorithms have on the organisation of the pipeline architecture. The choice of algorithm, the flow of pixel data through the pipeline, and bit-width precision issues are crucial decisions in the design of new hardware accelerators. Making these decisions correctly requires intensive investigation and experimentation. The use of hardware description languages such as VHDL allows for sound top-down design methodologies, but their effectiveness in such experimental work is limited. This paper discusses the use of software tools as an aid to hardware development and presents applications that demonstrate the possibilities of this approach and the benefits that can be attained from an integrated codesign environment.
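As a rough illustration of the kind of software-side experimentation the abstract refers to (not the authors' actual tooling), the hypothetical Python sketch below models a single colour-interpolation stage with configurable fixed-point precision, so that bit-width trade-offs can be explored in software before committing the design to VHDL.

```python
# Hypothetical sketch: explore bit-width precision for a colour-interpolation
# pipeline stage in software before committing the design to VHDL.
def quantize(value, frac_bits):
    """Round a real value onto a fixed-point grid with `frac_bits` fractional bits."""
    scale = 1 << frac_bits
    return round(value * scale) / scale

def interpolate_scanline(c0, c1, width, frac_bits):
    """Linearly interpolate a colour channel across `width` pixels,
    accumulating a quantized step as a hardware interpolator would."""
    step = quantize((c1 - c0) / (width - 1), frac_bits)
    value, out = c0, []
    for _ in range(width):
        out.append(value)
        value = quantize(value + step, frac_bits)
    return out

# Compare end-of-span error for candidate precisions on a 640-pixel span.
for bits in (8, 12, 16):
    pixels = interpolate_scanline(0.0, 1.0, 640, bits)
    print(bits, "fractional bits -> end-of-span error:", abs(pixels[-1] - 1.0))
```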
Appetite | 2015
Louise Brunger; Adam Smith; Roberta Re; Martin Wickham; Andrew Philippides; Phil L. Watten; Martin R. Yeomans
The study aimed to validate appetite ratings made on a new electronic device, the Apple iPad Mini, against an existing but now obsolete electronic device (Hewlett Packard iPAQ). Healthy volunteers (9 men and 9 women) rated their appetite before and 0, 30, 60, 90 and 120 minutes after consuming both a low energy (LE: 77 kcal) and high energy (HE: 274 kcal) beverage at breakfast on 2 non-consecutive days in counter-balanced order. Rated hunger, desire to eat and how much participants could consume were significantly lower after HE than LE on both devices, although there was better overall differentiation between HE and LE for ratings on the iPad. Rated satiation and fullness, and a composite measure combining all five ratings, were significantly higher after HE than LE on both devices. There was also evidence that differences between conditions were more significant when analysed at each time point than when using an overall area under the curve (AUC) measure. Overall, these data confirm that appetite ratings made using the iPad are at least as sensitive as those on the iPAQ, and offer a new platform for researchers to collect appetite data.
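The area-under-the-curve measure mentioned above is conventionally a trapezoidal integral of the ratings over the sampling time points. The sketch below is illustrative only, with invented numbers, and is not the study's analysis code.

```python
# Hypothetical sketch of an AUC measure for appetite ratings:
# trapezoidal area over the sampling time points (minutes after the drink).
import numpy as np

times = np.array([0, 30, 60, 90, 120])        # minutes post-consumption
hunger_le = np.array([70, 62, 58, 65, 72])    # illustrative ratings, low-energy drink
hunger_he = np.array([55, 40, 38, 47, 60])    # illustrative ratings, high-energy drink

auc_le = np.trapz(hunger_le, times)
auc_he = np.trapz(hunger_he, times)
print("Hunger AUC (LE):", auc_le, "  Hunger AUC (HE):", auc_he)
```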
international conference on computer graphics and interactive techniques | 2006
Maria Sifniotis; Ben Jackson; Martin White; Katerina Mania; Phil L. Watten
For the past decade, 3D archaeological visualisations have mostly represented photo-realistic reconstructions of ancient monuments. While these can be constructive in a museum or tourist context, the archaeological community has long stressed the need for reconstructions showing where the actual remains end and the assumptions begin. Recent attempts to implement the latter approach are either limited to found/not found scenarios or mark uncertain areas without any justification for the choice of colour/hue degradation, etc. As a result, there is no system to represent the uncertainty involved in visualising archaeological data. The archaeologist interprets a site based on a limited amount of material remains and uses comparative evidence from other sites, written references, as well as speculation in order to create a reconstruction. Due to this varied range of data, he/she may be more certain about some areas of the reconstruction than others. If this uncertainty could be observed on the visualisation itself, it would open up a whole new range of uses for archaeological models, such as learning about archaeological hypotheses, comparing uncertainties across different models and highlighting cases where further research may be required.
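One simple way of surfacing such uncertainty on a model (purely a sketch, not the system described in the paper) is to map each region's confidence level to a display colour at render time; the region names and colour choices below are invented.

```python
# Hypothetical sketch: map an archaeologist's confidence in each region of a
# reconstruction to a display colour, so evidenced and speculative areas read differently.
def uncertainty_colour(confidence):
    """confidence in [0, 1]: 1.0 = supported by surviving remains,
    0.0 = pure speculation. Blend from neutral grey (certain) to red (speculative)."""
    c = max(0.0, min(1.0, confidence))
    certain = (0.7, 0.7, 0.7)   # neutral grey for well-evidenced geometry
    unsure = (0.9, 0.2, 0.2)    # red tint for speculative geometry
    return tuple(c * a + (1.0 - c) * b for a, b in zip(certain, unsure))

regions = {"surviving wall": 0.95, "reconstructed roof": 0.4, "conjectured portico": 0.1}
for name, conf in regions.items():
    print(name, "->", uncertainty_colour(conf))
```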
theory and practice of computer graphics | 2004
Paul F. Lister; Phil L. Watten; Martin R. Lewis; Paul Newbury; Martin White; Mike C. Bassett; Ben Jackson; Vincenzo Trignano
This paper examines a virtual prototyping system for electronic devices which incorporates visualisation using a novel integrated development environment that combines user interaction with photorealistic 2D and 3D models. Full system-level hardware simulation is also supported within this framework, which offers electronic simulation in a virtual environment. This helps to link product development specialists with a unified and coherent modelling environment. Virtual prototyping is a novel design methodology that aims to decrease the time-to-market and increase product reliability, quality and fulfilment of user requirements. This paper uses the example of a remotely controlled domestic cooking system to illustrate this process.
eurographics | 2014
Marco Gilardi; Phil L. Watten; Paul Newbury
Surfaces covered with pebbles and small rocks can often be found in nature or in human-shaped environments. Generating an accurate three-dimensional model of those kinds of surfaces from a reference image can be challenging, especially if one wants to be able to animate each pebble individually. Undertaking this kind of job manually is time-consuming and impossible to achieve for dynamic terrain animations. The method described in this paper allows unsupervised automatic generation of three-dimensional textured rocks from a two-dimensional image, aiming to match the original image as closely as possible.
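A plausible first step towards this kind of unsupervised generation is to segment individual pebbles from the reference photograph before fitting a 3D proxy to each one. The sketch below is an assumption-laden illustration using scikit-image's watershed segmentation, not the method of the paper; the input file name is invented and the threshold direction depends on the photograph.

```python
# Hypothetical sketch: segment individual pebbles in a reference photograph so that
# each could later be replaced by its own animatable 3D proxy.
import numpy as np
from scipy import ndimage as ndi
from skimage import io, color, filters, measure, segmentation

image = io.imread("pebbles.jpg")                 # assumed input photograph
grey = color.rgb2gray(image)
mask = grey > filters.threshold_otsu(grey)       # rough pebble/background split

# Watershed on the distance transform separates touching pebbles.
distance = ndi.distance_transform_edt(mask)
markers = measure.label(distance > 0.5 * distance.max())
labels = segmentation.watershed(-distance, markers, mask=mask)

for region in measure.regionprops(labels):
    # Centroid and area are enough to place and scale a simple rock proxy per pebble.
    print(region.label, region.centroid, region.area)
```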
international conference on computer graphics and interactive techniques | 2010
Fiona M. Rivera; Phil L. Watten; Patrick Holroyd; Felix D.C.C. Beacher; Katerina Mania; Hugo D. Critchley
This research concentrates on providing high fidelity animation, only achievable with offline rendering solutions, for interactive fMRI-based experiments. Virtual characters are well established within the film, game and research worlds, yet much remains to be learned about which design, stylistic or behavioural factors combine to make a believable character. The definition of believability depends on context. When designing and implementing characters for entertainment, the concern is making believable characters that the audience will engage with. When using virtual characters in experiments, the aim is to create characters and synthetic spaces that people respond to in a similar manner to their real-world counterparts. Research has shown that users show empathy for virtual characters. However, uncanny valley effects, i.e. dips in user impressions, can arise: behavioural fidelity expectations increase alongside increases in visual fidelity and vice versa. Often, characters used within virtual environments tend to be of fairly low fidelity due to technological constraints, including rendering in real time (Garau et al. 2003). This problem is addressed here by using non-linear playback and compositing of pre-rendered high fidelity sequences.
applied perception in graphics and visualization | 2010
Maria Sifniotis; Ben Jackson; Katerina Mania; N. Vlassis; Phil L. Watten; Martin White
By uncertainty, we mean an archaeological expert's level of confidence in an interpretation deriving from gathered evidence. Archaeologists and computer scientists have urged caution in the use of 3D for archaeological reconstructions because the availability of other possible hypotheses is not always acknowledged. This poster presents a 3D visualization system for archaeological uncertainty.
forum on specification and design languages | 2005
Paul F. Lister; Vincenzo Trignano; Mike C. Bassett; Phil L. Watten
This paper presents the use of UML-Executable Functional Models (UMLEFM) in the context of the ViPERS virtual prototyping methodology [Lister et al., 2004a; Lister et al., 2004b] for System-on-Chip design. The concepts, the implementation and the experiments presented in this paper were developed at the University of Sussex (UoS) in the Centre for VLSI and Computer Graphics as part of an EU project [VIPERS]. The ViPERS methodology and its use of executable functional models have been developed to face the contemporary challenges of System-on-Chip design by integrating key design methodologies with the graphical and interactive features of virtual prototyping. The fast evolution of silicon technology, and its consequences for the market for hand-held electronic products, is making the adoption of new design methodologies mandatory, bringing modern techniques to the design, development and manufacturing of consumer electronics. Executable functional models provide a means to simulate the target device in different phases of the design flow and to analyse its requirements (behaviours, interfaces, etc.), architecture (HW/SW partitioning) and finally its digital implementation. A key contribution is the combination of an interactive 2D photorealistic model with its executable functional model implemented as a UML state machine; the experiment is applied to an RF home-based remote control used to control a cooking stack.
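To picture the executable-functional-model idea, the hypothetical sketch below expresses a tiny remote-control behaviour as an event-driven state machine. The actual ViPERS models are UML state machines coupled to an interactive photorealistic mock-up, so this is illustrative only; all state and event names are invented.

```python
# Hypothetical sketch of an executable functional model for an RF cooking-stack
# remote control, expressed as a minimal event-driven state machine.
class RemoteControlModel:
    # (state, event) -> next state; the real ViPERS models are UML state machines.
    TRANSITIONS = {
        ("off", "power"): "standby",
        ("standby", "power"): "off",
        ("standby", "select_hob"): "hob_selected",
        ("hob_selected", "temp_up"): "heating",
        ("heating", "temp_down"): "hob_selected",
        ("heating", "power"): "off",
    }

    def __init__(self):
        self.state = "off"

    def fire(self, event):
        """Apply an event; unknown events leave the state unchanged."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state

model = RemoteControlModel()
for event in ["power", "select_hob", "temp_up", "power"]:
    print(event, "->", model.fire(event))
```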
human factors in computing systems | 2016
Marco Gilardi; Patrick Holroyd; Carly Brownbridge; Phil L. Watten; Marianna Obrist
The use of films in the early stages of the design of technology is a practice that is becoming increasingly common. However, the focus of these films is usually centered on exploring the technology and its specifications rather than on the experiences that the technology can potentially create for its user. Previous research emphasises the relevance of the experiences created by the technology in its users, arguing that emotions should be taken into account during early design stages and made part of the design itself. In this paper we provide a step-by-step production pipeline on how to make your own design fiction film and how to get the experiences across. For this purpose we focus on the experiences and emotions that a specific interaction medium elicits. We gained inspiration from the increased exploration of olfactory experiences in HCI. We used a classification of smell experiences as a starting point to produce a design fiction film for the automotive context, not limited by technology but inspired by experiences.