
Publications


Featured research published by Christopher P. Benton.


Philosophical Transactions of the Royal Society B | 2009

Camouflage and visual perception

Tom Troscianko; Christopher P. Benton; P. George Lovell; David J. Tolhurst; Zygmunt Pizlo

How does an animal conceal itself from visual detection by other animals? This review paper seeks to identify general principles that may apply in this broad area. It considers mechanisms of visual encoding, of grouping and object encoding, and of search. In most cases, the evidence base comes from studies of humans or species whose vision approximates to that of humans. The effort is hampered by a relatively sparse literature on visual function in natural environments and with complex foraging tasks. However, some general constraints emerge as being potentially powerful principles in understanding concealment (a ‘constraint’ here meaning a set of simplifying assumptions). Strategies that disrupt the unambiguous encoding of discontinuities of intensity (edges), and of other key visual attributes, such as motion, are key here. Similar strategies may also defeat grouping and object-encoding mechanisms. Finally, the paper considers how we may understand the processes of search for complex targets in complex scenes. The aim is to provide a number of pointers towards issues that may assist in understanding camouflage and concealment, particularly with reference to how visual systems can detect the shape of complex, concealed objects.


Proceedings of the Royal Society of London B: Biological Sciences | 1999

Robust velocity computation from a biologically motivated model of motion perception

Alan Johnston; Peter W. McOwan; Christopher P. Benton

Current computational models of motion processing in the primate motion pathway do not cope well with image sequences in which a moving pattern is superimposed upon a static texture. The use of non-linear operations and the need for contrast normalization in motion models mean that the separation of the influences of moving and static patterns on the motion computation is not trivial. Therefore, the response to the superposition of static and moving patterns provides an important means of testing various computational strategies. Here we describe a computational model of motion processing in the visual cortex, one of the advantages of which is that it is highly resistant to interference from static patterns.


Psychological Science | 2010

Anti-Expression Aftereffects Reveal Prototype-Referenced Coding of Facial Expressions

Andrew L. Skinner; Christopher P. Benton

Adaptation is a powerful experimental technique that has recently provided insights into how people encode representations of facial identity. Here, we used this approach to explore the visual representation of facial expressions of emotion. Participants were adapted to anti-expressions of six facial expressions. The participants were then shown an average face and asked to classify the face’s expression using one of six basic emotion descriptors. Participants chose the emotion matching the anti-expression they were adapted to significantly more often than they chose any other emotion (e.g., if they were adapted to antifear, they classified the emotion on the average face as fear). The strength of this aftereffect of adaptation decreased as the strength of the anti-expression adapter decreased. These findings provide evidence that visual representations of facial expressions of emotion are coded with reference to a prototype within a multidimensional framework.
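
The anti-expression construction can be sketched as follows, under the assumption that faces are represented as points in some feature space (e.g. landmark coordinates); the names and values are illustrative, not the authors' actual stimuli. An anti-expression is the reflection of an expression through the average (prototype) face.

```python
import numpy as np

def anti_expression(expression, average, strength=1.0):
    """Reflect `expression` through `average`; `strength` scales the adapter."""
    return average - strength * (expression - average)

average = np.zeros(3)                       # prototype face features (illustrative)
fear = np.array([0.4, -0.2, 0.1])           # features of a fearful face (illustrative)
anti_fear = anti_expression(fear, average)  # -> array([-0.4,  0.2, -0.1])
```

Under prototype-referenced coding, adapting to anti-fear shifts the perceived expression of the average face toward fear, consistent with the aftereffect reported above.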


Journal of Vision | 2008

The direction of measured face aftereffects

Christopher P. Benton; Emma C. Burgess

Prolonged viewing of a face can result in a change of our perception of subsequent faces. This process of adaptation is believed to be functional and to reflect optimization-driven changes in the neural encoding. Because it is believed to target the neural systems underlying face processing, the measurement of face aftereffects is seen as a powerful behavioral technique that can provide deep insights into our facial encoding. Face identity aftereffects have typically been measured by assessing the way in which adaptation changes the perception of images from a test sequence, the latter commonly derived from morphing between two base images. The current study asks to what extent such face aftereffects are driven by the test sequence used to measure them. Using subjects trained to respond either to identity or to expression, we examined the effects of identity and expression adaptation on test stimuli that varied in both identity and expression. We found that face adaptation produced measured aftereffects that were congruent with the adaptation stimulus; the composition of the test sequences did not affect the measured direction of the face aftereffects. Our results support the view that face adaptation studies can meaningfully tap into the intrinsically multidimensional nature of our representation of facial identity.


Vision Research | 2006

Viewpoint dependence in adaptation to facial identity

Christopher P. Benton; Sarah J. Jennings; David J. Chatting

We produced morph sequences between identities at a variety of viewpoints, ranging from the three quarter leftward facing view, to the three quarter rightward facing view. We measured the strength of identity adaptation as a function of changing test viewpoint whilst keeping the adaptation viewpoint constant, and as a function of adaptation viewpoint whilst keeping test viewpoint constant. Our results show a substantial decrease in adaptation as the angle between adaptation and test viewpoint increases. These findings persisted when we introduced controls for low-level retinotopic adaptation, leading us to conclude that our results show strong evidence for viewpoint dependence in the high-level encoding of facial identity. Our findings support models in which identity is encoded, to a large degree, by viewpoint dependent non-retinotopic neural mechanisms. Functional imaging studies suggest the fusiform gyrus as the most likely location for this mechanism.
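
The morph sequences used in this line of work can be sketched as weighted blends between two images. Real face morphs also warp landmark geometry before blending; this pixelwise blend is a deliberate simplification, not the authors' pipeline.

```python
import numpy as np

def morph_sequence(face_a, face_b, n_steps=11):
    """Return images blending from face_a (weight 1) to face_b (weight 1)."""
    weights = np.linspace(0.0, 1.0, n_steps)
    return [(1.0 - w) * face_a + w * face_b for w in weights]

a = np.zeros((4, 4))   # stand-in for an image of identity A
b = np.ones((4, 4))    # stand-in for an image of identity B
seq = morph_sequence(a, b, n_steps=5)
# seq[2] is the 50% morph: every pixel is 0.5
```

Adaptation strength is then measured as a shift in where, along such a sequence, the test face is judged to change identity.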


Journal of Psychopharmacology | 2009

Effects of acute alcohol consumption on processing of perceptual cues of emotional expression

Angela S. Attwood; C Ohlson; Christopher P. Benton; Ian S. Penton-Voak; Marcus R. Munafò

Alcohol consumption has been associated with increases in aggressive behaviour. However, experimental evidence of a direct association is equivocal, and mechanisms that may underlie this relationship are poorly understood. One mechanism by which alcohol consumption may increase aggressive behaviour is via alterations in processing of emotional facial cues. We investigated the effects of acute alcohol consumption on sensitivity to facial expressions of emotion. Participants attended three experimental sessions where they consumed an alcoholic drink (0.0, 0.2 or 0.4 g/kg), and completed a psychophysical task to distinguish expressive from neutral faces. The level of emotion in the expressive face varied across trials, and we measured the threshold at which the expressive face was reliably identified. We observed a significant three-way interaction involving emotion, participant sex and alcohol dose. Male participants showed significantly higher perceptual thresholds for sad facial expressions compared with female participants following consumption of the highest dose of alcohol. Our data indicate sex differences in the processing of facial cues of emotional expression following alcohol consumption. There was no evidence that alcohol altered the processing of angry facial expressions. Future studies should examine effects of alcohol expectancy and investigate the effects of alcohol on the miscategorisation of emotional expressions.


Proceedings of the Royal Society of London B: Biological Sciences | 2007

Turning the other cheek: the viewpoint dependence of facial expression after-effects

Christopher P. Benton; Peter J. Etchells; Gillian Porter; Andrew P. Clark; Ian S. Penton-Voak; Stavri G. Nikolov

How do we visually encode facial expressions? Is this done by viewpoint-dependent mechanisms representing facial expressions as two-dimensional templates or do we build more complex viewpoint independent three-dimensional representations? Recent facial adaptation techniques offer a powerful way to address these questions. Prolonged viewing of a stimulus (adaptation) changes the perception of subsequently viewed stimuli (an after-effect). Adaptation to a particular attribute is believed to target those neural mechanisms encoding that attribute. We gathered images of facial expressions taken simultaneously from five different viewpoints evenly spread from the three-quarter leftward to the three-quarter rightward facing view. We measured the strength of expression after-effects as a function of the difference between adaptation and test viewpoints. Our data show that, although there is a decrease in after-effect over test viewpoint, there remains a substantial after-effect when adapt and test are at differing three-quarter views. We take these results to indicate that neural systems encoding facial expressions contain a mixture of viewpoint-dependent and viewpoint-independent elements. This accords with evidence from single cell recording studies in macaque and is consonant with a view in which viewpoint-independent expression encoding arises from a combination of view-dependent expression-sensitive responses.


Proceedings of the Royal Society of London B: Biological Sciences | 2001

A new approach to analysing texture-defined motion

Christopher P. Benton; Alan Johnston

It has been widely accepted that standard low-level computational approaches to motion processing cannot extract texture-defined motion without applying some pre-processing nonlinearity. This has motivated accounts of motion perception in which luminance- and texture-defined motion are processed by separate mechanisms. Here, we introduce a novel method of image description where motion sequences may be described in terms of their local spatial and temporal gradients. This allows us to assess the local velocity information available to standard low-level motion mechanisms. Our analysis of several texture-motion stimuli shows that the information indicating correct texture-motion velocity and/or direction is present in the raw luminance measures. This raises the possibility that luminance-motion and texture-motion may be processed by the same cortical mechanisms. Our analysis offers a way of looking at texture-motion processing that is, to our knowledge, new.
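
The gradient idea can be sketched in one dimension (an illustration, not the authors' exact model): for a pattern translating at speed v, brightness constancy gives I_x · v + I_t = 0, so v can be recovered from the raw luminance gradients.

```python
import numpy as np

def gradient_velocity(frame0, frame1, dx=1.0, dt=1.0):
    """Least-squares velocity estimate pooled over a 1-D patch."""
    Ix = np.gradient((frame0 + frame1) / 2.0, dx)  # spatial derivative
    It = (frame1 - frame0) / dt                    # temporal derivative
    return -np.sum(Ix * It) / np.sum(Ix ** 2)

x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
f0 = np.sin(x)
f1 = np.sin(x - 0.5)  # same pattern shifted by 0.5 between frames
v_est = gradient_velocity(f0, f1, dx=x[1] - x[0])
print(v_est)  # ≈ 0.51 for a true shift of 0.5 (small finite-difference bias)
```

Pooling the estimate by least squares over the patch, rather than dividing pointwise, avoids instability where the spatial gradient is near zero.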


Proceedings of the Royal Society of London B: Biological Sciences | 1999

Induced motion at texture-defined motion boundaries

Alan Johnston; Christopher P. Benton; Peter W. McOwan

When a static textured background is covered and uncovered by a moving bar of the same mean luminance we can clearly see the motion of the bar. Texture‐defined motion provides an example of a naturally occurring second‐order motion. Second‐order motion sequences defeat standard spatio‐temporal energy models of motion perception. It has been proposed that second‐order stimuli are analysed by separate systems, operating in parallel with luminance‐defined motion processing, which incorporate identifiable pre‐processing stages that make second‐order patterns visible to standard techniques. However, the proposal of multiple paths to motion analysis remains controversial. Here we describe the behaviour of a model that recovers both luminance‐defined and an important class of texture‐defined motion. The model also accounts for the induced motion that is seen in some texture‐defined motion sequences. We measured the perceived direction and speed of both the contrast envelope and induced motion in the case of a contrast modulation of static noise textures. Significantly, the model predicts the perceived speed of the induced motion seen at second‐order texture boundaries. The induced motion investigated here appears distinct from classical induced effects resulting from motion contrast or the movement of a reference frame.
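
A contrast modulation of a static noise texture can be constructed as follows (an illustrative sketch, not the authors' exact stimulus): a drifting contrast envelope multiplies a fixed zero-mean noise carrier, so the expected luminance is constant everywhere and the motion is invisible to a purely luminance-based (first-order) analysis.

```python
import numpy as np

def contrast_modulated_noise(n_x=256, n_frames=8, envelope_freq=2.0,
                             envelope_speed=4.0, depth=0.9, seed=0):
    rng = np.random.default_rng(seed)
    carrier = rng.uniform(-1.0, 1.0, n_x)  # static noise, fixed across frames
    x = np.arange(n_x)
    frames = []
    for t in range(n_frames):
        phase = 2.0 * np.pi * envelope_freq * (x - envelope_speed * t) / n_x
        envelope = 1.0 + depth * np.sin(phase)  # drifting contrast modulation
        frames.append(carrier * envelope)       # zero expected luminance
    return np.array(frames)

movie = contrast_modulated_noise()
# Only the contrast envelope drifts; the noise carrier never moves.
```

Parameter names here (`envelope_freq`, `envelope_speed`, `depth`) are illustrative choices, not values from the study.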


Psychopharmacology | 2009

Effects of alcohol consumption and alcohol expectancy on the categorisation of perceptual cues of emotional expression

Angela S. Attwood; Alia F. Ataya; Christopher P. Benton; Ian S. Penton-Voak; Marcus R. Munafò

Rationale: Evidence that alcohol leads to increased aggressive behaviour is equivocal and confounded by evidence that such effects may operate indirectly via expectancy. One mechanism by which alcohol consumption may increase aggressive behaviour is via alterations in the processing of emotional facial cues.

Objectives: We investigated whether acute alcohol consumption or the expectancy of consuming alcohol (or both) induces differences in the categorisation of ambiguous emotional expressions. We also explored differences between male and female participants, using male and female facial cues of emotional expression.

Methods: Following consumption of a drink, participants completed a categorisation task in which they had to identify the emotional expression of a facial stimulus. Stimuli were morphed facial images ranging between unambiguously angry and happy expressions (condition 1) or between unambiguously angry and disgusted expressions (condition 2). Participants (N = 96) were randomised to receive an alcoholic or non-alcoholic drink and to be told that they would receive an alcoholic or non-alcoholic drink.

Results: Significant effects of alcohol were obtained in the angry–disgusted task condition, but only when the target facial stimulus was male. Participants tended to categorise male disgusted faces as angry after alcohol, but not after placebo.

Conclusions: Our data indicate that alcohol consumption may increase the likelihood of an ambiguous but negative facial expression being judged as angry. However, these effects were only observed for male faces and therefore may have been influenced by the greater expectation of aggression in males compared to females. Implications for alcohol-associated aggressive behaviour are discussed.

Collaboration


Dive into Christopher P. Benton's collaborations.

Top co-authors:

William Curran, Queen's University Belfast
Alan Johnston, University of Nottingham
Peter W. McOwan, Queen Mary University of London
Colin W. G. Clifford, University of New South Wales