
Publication


Featured research published by Duncan Rowland.


Nature | 1997

A specific neural substrate for perceiving facial expressions of disgust

Mary L. Phillips; Andrew W. Young; Carl Senior; Michael Brammer; C. Andrew; Andrew J. Calder; Edward T. Bullmore; David I. Perrett; Duncan Rowland; Steven Williams; Jeffrey A. Gray; Anthony S. David

Recognition of facial expressions is critical to our appreciation of the social and physical environment, with separate emotions having distinct facial expressions. Perception of fearful facial expressions has been extensively studied, appearing to depend upon the amygdala. Disgust — literally ‘bad taste’ — is another important emotion, with a distinct evolutionary history, and is conveyed by a characteristic facial expression. We have used functional magnetic resonance imaging (fMRI) to examine the neural substrate for perceiving disgust expressions. Normal volunteers were presented with faces showing mild or strong disgust or fear. Cerebral activation in response to these stimuli was contrasted with that for neutral faces. Results for fear generally confirmed previous positron emission tomography findings of amygdala involvement. Both strong and mild expressions of disgust activated anterior insular cortex but not the amygdala; strong disgust also activated structures linked to a limbic cortico–striatal–thalamic circuit. The anterior insula is known to be involved in responses to offensive tastes. The neural response to facial expressions of disgust in others is thus closely related to appraisal of distasteful stimuli.


Cognition | 1997

Facial expression megamix: Tests of dimensional and category accounts of emotion recognition

Andrew W. Young; Duncan Rowland; Andrew J. Calder; Nancy L. Etcoff; Anil K. Seth; David I. Perrett

We report four experiments investigating the perception of photographic quality continua of interpolated (morphed) facial expressions derived from prototypes of the 6 emotions in the Ekman and Friesen (1976) series (happiness, surprise, fear, sadness, disgust and anger). In Experiment 1, morphed images made from all possible pairwise combinations of expressions were presented in random order; subjects identified these as belonging to distinct expression categories corresponding to the prototypes at each end of the relevant continuum. This result was replicated in Experiment 2, which also included morphs made from a prototype with a neutral expression, and allowed neutral as a response category. These findings are inconsistent with the view that facial expressions are recognised by locating them along two underlying dimensions, since such a view predicts that at least some transitions between categories should involve neutral regions or identification as a different emotion. Instead, they suggest that facial expressions of basic emotions are recognised by their fit to discrete categories. Experiment 3 used continua involving 6 emotions to demonstrate best discrimination of pairs of stimuli falling across category boundaries; this provides further evidence of categorical perception of facial expressions of emotion. However, in both Experiment 1 and Experiment 2, reaction time data showed that increasing distance from the prototype had a definite cost on ability to identify emotion in the resulting morphed face. Moreover, Experiment 4 showed that subjects had some insight into which emotions were blended to create specific morphed images. Hence, categorical perception effects were found even though subjects were sensitive to physical properties of these morphed facial expressions. We suggest that rapid classification of prototypes and better across boundary discriminability reflect the underlying organisation of human categorisation abilities.
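The morph continua described above can be sketched as a linear interpolation of corresponding landmark positions between two expression prototypes. This is a minimal illustration, not the authors' actual pipeline: the landmark values are hypothetical, and the pixel-blending stage of real morphing is omitted.

```python
import numpy as np

def morph_landmarks(proto_a, proto_b, alpha):
    """Interpolate landmark positions between two expression prototypes;
    alpha=0 returns prototype A, alpha=1 returns prototype B."""
    return (1.0 - alpha) * proto_a + alpha * proto_b

def expression_continuum(proto_a, proto_b, n_steps=5):
    """Evenly spaced morphs along the A -> B continuum."""
    return [morph_landmarks(proto_a, proto_b, a)
            for a in np.linspace(0.0, 1.0, n_steps)]

# Hypothetical 2-D landmark sets for two expression prototypes.
happy = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
sad = np.array([[0.0, 0.2], [1.0, 0.2], [0.5, 0.6]])

steps = expression_continuum(happy, sad, n_steps=5)
```

Presenting such steps in random order, as in Experiment 1, lets one test whether identification jumps abruptly at a category boundary rather than drifting through intermediate percepts.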


Evolution and Human Behavior | 1999

Symmetry and human facial attractiveness

David I. Perrett; D. Michael Burt; Ian S. Penton-Voak; Kieran J Lee; Duncan Rowland; Rachel Edwards

Symmetry may act as a marker of phenotypic and genetic quality and is preferred during mate selection in a variety of species. Measures of human body symmetry correlate with attractiveness, but studies manipulating human face images report a preference for asymmetry. These results may reflect unnatural feature shapes and changes in skin textures introduced by image processing. When the shape of facial features is varied (with skin textures held constant), increasing symmetry of face shape increases ratings of attractiveness for both male and female faces. These findings imply facial symmetry may have a positive impact on mate selection in humans.
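One simple way to increase the symmetry of a face shape while leaving skin textures untouched is to average each landmark with the midline reflection of its mirror counterpart. The sketch below is an assumption-laden illustration of that idea; the coordinates and the `mirror_index` pairing are hypothetical, not the paper's procedure.

```python
import numpy as np

def symmetrize(landmarks, mirror_index, axis_x=0.5):
    """Average each landmark with the reflection (about the vertical
    midline x = axis_x) of its mirror-image counterpart."""
    mirrored = landmarks[mirror_index].copy()
    mirrored[:, 0] = 2.0 * axis_x - mirrored[:, 0]
    return 0.5 * (landmarks + mirrored)

# Hypothetical normalized eye-corner landmarks; index 0 mirrors index 1.
corners = np.array([[0.2, 0.5], [0.9, 0.5]])
sym = symmetrize(corners, mirror_index=[1, 0])
```

Applying the function a second time leaves an already-symmetric shape unchanged, which is one sanity check that the operation moves shapes toward, and never past, perfect symmetry.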


Visual Cognition | 1996

Categorical Perception of Morphed Facial Expressions

Andrew J. Calder; Andrew W. Young; David I. Perrett; Nancy L. Etcoff; Duncan Rowland

Using computer-generated line-drawings, Etcoff and Magee (1992) found evidence of categorical perception of facial expressions. We report four experiments that replicated and extended Etcoff and Magee's findings with photographic-quality stimuli. Experiments 1 and 2 measured identification of the individual stimuli falling along particular expression continua (e.g. from happiness to sadness) and discrimination of these stimuli with an ABX task in which stimuli A, B, and X were presented sequentially; subjects had to decide whether X was the same as A or B. Our identification data showed that each expression continuum was perceived as two distinct sections separated by a category boundary. From these identification data we were able to predict subjects' performance in the ABX discrimination task and to demonstrate better discrimination of cross-boundary than within-category pairs; that is, two faces identified as different expressions (e.g. happy and sad) were easier to discriminate than two faces of equal...
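The logic of an ABX trial can be sketched as a tiny test harness: A and B are shown, then X (a copy of one of them), and the response is scored against which stimulus X actually copied. This is a hypothetical illustration, not the authors' experimental software; the stimulus labels and responder are invented.

```python
import random

def abx_trial(stim_a, stim_b, respond, rng=random):
    """One ABX trial: X is a copy of either A or B; `respond` must
    return 'A' or 'B'. Returns True if the response was correct."""
    x_is_a = rng.random() < 0.5
    x = stim_a if x_is_a else stim_b
    return (respond(stim_a, stim_b, x) == 'A') == x_is_a

# A responder with perfect memory for the stimuli always scores 100%.
perfect = lambda a, b, x: 'A' if x == a else 'B'
results = [abx_trial('happy_70', 'sad_30', perfect) for _ in range(100)]
```

Real subjects fall short of this ceiling; categorical perception predicts their accuracy is higher when A and B straddle a category boundary than when both fall within one category.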


IEEE Computer Graphics and Applications | 1995

Manipulating facial appearance through shape and color

Duncan Rowland; David I. Perrett

A technique for defining facial prototypes is described which supports transformations along quantifiable dimensions in face space. Examples illustrate the use of shape and color information to perform predictive gender and age transformations. The processes we describe begin with the creation of a facial prototype. Generally, a prototype can be defined as a representation containing the consistent attributes across a class of objects. Once we obtain a class prototype, we can take an exemplar that has some information missing and augment it with the prototypical information. In effect, this adds in the average values for the missing information. We use this notion to transform gray-scale images into full color by including the color information from a relevant prototype. It is also possible to deduce the difference between two groups within a class. Separate prototypes can be formed for each group. These can be used subsequently to define a transformation that will map instances from one group onto the domain of the other. This paper details the procedure we use to transform facial images and shows how it can be used to alter perceived facial attributes. Human faces are perceived to differ along many dimensions. Some dimensions can be defined by objectively different categories such as old/young or male/female but others reflect more subjective cat...
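The group-to-group mapping described above amounts to shifting an exemplar by the vector difference between two group prototypes. A minimal sketch, assuming hypothetical feature vectors (real prototypes would be landmark and color arrays, and the function name is invented):

```python
import numpy as np

def transform_between_groups(exemplar, proto_source, proto_target, strength=1.0):
    """Shift an exemplar by the difference between two group prototypes,
    e.g. mapping a face from the 'young' domain toward the 'old' domain."""
    return exemplar + strength * (proto_target - proto_source)

# Hypothetical low-dimensional feature vectors for illustration.
young_proto = np.array([10.0, 4.0])
old_proto = np.array([14.0, 2.0])
face = np.array([11.0, 5.0])
aged = transform_between_groups(face, young_proto, old_proto)
```

Setting `strength` between 0 and 1 gives partial transformations, and values above 1 exaggerate the group difference, which is the same arithmetic the caricature work below exploits.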


Cognitive Neuropsychology | 1997

Recognition of Facial Expressions: Selective Impairment of Specific Emotions in Huntington's Disease

Reiner Sprengelmeyer; Andrew W. Young; Anke Sprengelmeyer; Andrew J. Calder; Duncan Rowland; David I. Perrett; Volker Hömberg; Herwig W. Lange

Recognition of facial expressions of basic emotions was investigated in HL and UJ, two people with Huntington's disease who showed little evidence of general cognitive deterioration. No impairments were found in tests of the perception of age, sex, familiar face identity, unfamiliar face identity, and gaze direction, indicating adequate processing of the face as a physical stimulus. Computer-interpolated (morphed) images of facial expressions of basic emotions were used to demonstrate deficits in the recognition of disgust and fear for UJ. HL also showed a deficit in the recognition of disgust, and was not very adept (but not significantly impaired) at recognising fear. Other basic emotions (happiness, surprise, sadness, anger) were recognised at normal levels of performance by HL and UJ. These results show that impairments of emotion recognition can be circumscribed, affecting some emotions more than others, and occurring in people who do not show pronounced perceptual or intellectual deterioration. Que...


Proceedings of the Royal Society of London B: Biological Sciences | 1997

Computer-enhanced emotion in facial expressions

Andrew J. Calder; Andrew W. Young; Duncan Rowland; David I. Perrett

Benson and Perrett's (1991b) computer-based caricature procedure was used to alter the positions of anatomical landmarks in photographs of emotional facial expressions with respect to their locations in a reference norm face (e.g. a neutral expression). Exaggerating the differences between an expression and its norm produces caricatured images, whereas reducing the differences produces 'anti-caricatures'. Experiment 1 showed that caricatured (+50% different from neutral) expressions were recognized significantly faster than the veridical (0%, undistorted) expressions. This held for all six basic emotions from the Ekman and Friesen (1976) series, and the effect generalized across different posers. For experiment 2, caricatured (+50%) and anti-caricatured (−50%) images were prepared using two types of reference norm: a neutral-expression norm, which would be optimal if facial expression recognition involves monitoring changes in the positioning of underlying facial muscles, and an average-expression norm (the average of the six basic expressions, excluding neutral, in the Ekman & Friesen (1976) series). The results showed that the caricatured images were identified significantly faster, and the anti-caricatured images significantly slower, than the veridical expressions. Furthermore, the neutral-expression and average-expression norm caricatures produced the same pattern of results.
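The caricature operation on landmark positions reduces to scaling the difference vector between an expression and its norm. A minimal sketch under assumed toy landmarks (the coordinates are hypothetical, and real caricaturing also rerenders the image, which is omitted here):

```python
import numpy as np

def caricature(expr, norm, percent):
    """percent > 0 exaggerates the expression relative to the norm
    (caricature); percent < 0 reduces it (anti-caricature);
    percent = 0 reproduces the veridical expression."""
    k = percent / 100.0
    return norm + (1.0 + k) * (expr - norm)

norm = np.array([[0.5, 0.5]])   # hypothetical neutral landmark
fear = np.array([[0.5, 0.7]])   # hypothetical fear landmark
plus50 = caricature(fear, norm, 50.0)    # +50% caricature
minus50 = caricature(fear, norm, -50.0)  # -50% anti-caricature
```

Note that −100% collapses the expression onto the norm itself, which is why anti-caricatures move stimuli toward neutrality.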


Cognition | 2000

Caricaturing facial expressions.

Andrew J. Calder; Duncan Rowland; Andrew W. Young; Ian Nimmo-Smith; Jill Keane; David I. Perrett

The physical differences between facial expressions (e.g. fear) and a reference norm (e.g. a neutral expression) were altered to produce photographic-quality caricatures. In Experiment 1, participants rated caricatures of fear, happiness and sadness for the intensity of each of these three emotions; a second group of participants rated how face-like the caricatures appeared. With increasing levels of exaggeration the caricatures were rated as more emotionally intense, but less face-like. Experiment 2 demonstrated a similar relationship between emotional intensity and level of caricature for six different facial expressions. Experiments 3 and 4 compared intensity ratings of facial expression caricatures prepared relative to a selection of reference norms: a neutral expression, an average expression, or a different facial expression (e.g. anger caricatured relative to fear). Each norm produced a linear relationship between caricature and rated intensity of emotion; this finding is inconsistent with two-dimensional models of the perceptual representation of facial expression. An exemplar-based multidimensional model is proposed as an alternative account.


International Journal of Cosmetic Science | 1999

Prototypes of facial attributes developed through image averaging techniques.

S.S. Hawkins; David I. Perrett; D. M. Burt; Duncan Rowland; R.I. Murahata

Image capture and quantification have proven useful in a variety of scientific applications, for example, biology, medicine, geology, meteorology and forensics. The objective of this research was to utilize this technology to quantify clinical- and consumer-perceivable changes in facial attributes. A panel of expert assessors was trained, and, in a large consumer study, consumer facial attributes were identified and grading scales for each attribute were established. These experts then rated over 240 subjects on a total of 19 different facial attributes. Based on methodology developed by Perrett et al., facial averages or prototypes were computed from panelists rated high or low for each attribute. Prototypes were developed in a three-step process: 1) selection of 224 predefined feature points; 2) calculation of average face shape; and 3) 'morphing' individual faces into that shape and blending the images together. Naive assessors could readily appreciate the differences in facial appearance of the prototypes. In addition, expert graders were able to identify the general class of attribute affected. This method provides a powerful tool for assessing the effects of skin care technologies.
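Steps 2 and 3 of the prototype procedure can be sketched as simple averaging, assuming toy data: tiny hypothetical shapes stand in for the 224 feature points, and the warp of step 3 is assumed to have already been applied so that blending is a pixelwise mean.

```python
import numpy as np

def average_shape(shapes):
    """Step 2: mean position of each predefined feature point
    across the faces in a group."""
    return np.mean(np.stack(shapes), axis=0)

def blend_images(warped_images):
    """Step 3 (simplified): average pixel intensities of images that
    have already been warped into the common average shape."""
    return np.mean(np.stack(warped_images), axis=0)

# Two hypothetical 2-point shapes and 2x2 grayscale images.
shapes = [np.array([[0.0, 0.0], [1.0, 1.0]]),
          np.array([[0.2, 0.0], [0.8, 1.0]])]
imgs = [np.zeros((2, 2)), np.ones((2, 2))]
proto_shape = average_shape(shapes)
proto_image = blend_images(imgs)
```

Because every face is warped into the same shape before blending, the averaged image stays sharp instead of ghosting, which is what makes the resulting prototypes readable to assessors.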


Brain | 1996

Loss of disgust. Perception of faces and emotions in Huntington's disease.

Reiner Sprengelmeyer; Andrew W. Young; Andrew J. Calder; Anke Karnat; Herwig W. Lange; Volker Hömberg; David I. Perrett; Duncan Rowland

Collaboration


Dive into Duncan Rowland's collaborations.

Top Co-Authors

Andrew J. Calder (Cognition and Brain Sciences Unit)

Herwig W. Lange (University of Düsseldorf)

Volker Hömberg (University of Düsseldorf)