Publication


Featured research published by Phil McAleer.


Vision Research | 2009

Vision in autism spectrum disorders

David R. Simmons; Ashley E. Robertson; Lawrie S. McKay; Erin Toal; Phil McAleer; Frank E. Pollick

Autism spectrum disorders (ASDs) are developmental disorders which are thought primarily to affect social functioning. However, there is now a growing body of evidence that unusual sensory processing is at least a concomitant and possibly the cause of many of the behavioural signs and symptoms of ASD. A comprehensive and critical review of the phenomenological, empirical, neuroscientific and theoretical literature pertaining to visual processing in ASD is presented, along with a brief justification of a new theory which may help to explain some of the data, and link it with other current hypotheses about the genetic and neural aetiologies of this enigmatic condition.


PLOS ONE | 2014

How Do You Say ‘Hello’? Personality Impressions from Brief Novel Voices

Phil McAleer; Alexander Todorov; Pascal Belin

On hearing a novel voice, listeners readily form personality impressions of that speaker. Accurate or not, these impressions are known to affect subsequent interactions; yet the underlying psychological and acoustical bases remain poorly understood. Furthermore, studies to date have focussed on extended speech as opposed to analysing the instantaneous impressions we obtain from first experience. In this paper, through a mass online rating experiment, 320 participants rated 64 sub-second vocal utterances of the word ‘hello’ on one of 10 personality traits. We show that: (1) personality judgements of brief utterances from unfamiliar speakers are consistent across listeners; (2) a two-dimensional ‘social voice space’ with axes mapping Valence (Trust, Likeability) and Dominance, each driven by differing combinations of vocal acoustics, adequately summarises ratings in both male and female voices; and (3) a positive combination of Valence and Dominance results in increased perceived male vocal Attractiveness, whereas perceived female vocal Attractiveness is largely controlled by increasing Valence. Results are discussed in relation to the rapid evaluation of personality and, in turn, the intent of others, as being driven by survival mechanisms via approach or avoidance behaviours. These findings provide empirical bases for predicting personality impressions from acoustical analyses of short utterances and for generating desired personality impressions in artificial voices.
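
The sketch below illustrates the general idea of a low-dimensional "social voice space" recovered from mean trait ratings. The synthetic data, the trait labels, and the use of principal component analysis are illustrative assumptions rather than the paper's exact analysis pipeline.

```python
# Illustrative sketch (not the paper's exact pipeline): recover a low-dimensional
# "social voice space" from mean personality ratings of brief voice clips.
# Assumes a ratings matrix of shape (n_voices, n_traits); in the study,
# 64 'hello' utterances were each rated on one of 10 personality traits.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_voices, n_traits = 64, 10

# Mean rating per voice per trait (synthetic stand-in for listener ratings).
ratings = rng.normal(size=(n_voices, n_traits))

# Standardise each trait, then project onto the first two principal components,
# which in the paper's framing would map onto Valence and Dominance axes.
z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)
pca = PCA(n_components=2)
voice_space = pca.fit_transform(z)          # (64, 2): each voice's coordinates
print("variance explained:", pca.explained_variance_ratio_)
print("trait loadings on the two axes:\n", pca.components_.T)
```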


NeuroImage | 2012

Do distinct atypical cortical networks process biological motion information in adults with autism spectrum disorders?

Lawrie S. McKay; David R. Simmons; Phil McAleer; Dominic Marjoram; Judith Piggot; Frank E. Pollick

Whether people with Autism Spectrum Disorders (ASDs) have a specific deficit when processing biological motion has been a topic of much debate. We used psychophysical methods to determine individual behavioural thresholds in a point-light direction discrimination paradigm for small but carefully matched groups of adults (N=10 per group) with and without ASDs. These thresholds were used to derive individual stimulus levels in an identical fMRI task, with the purpose of equalising task performance across all participants whilst inside the scanner. The results of this investigation show that despite comparable behavioural performance both inside and outside the scanner, the group with ASDs shows a different pattern of BOLD activation from the typically developing (TD) group in response to the same stimulus levels. Furthermore, connectivity analysis suggests that the main differences between the groups are that the TD group utilise a unitary network with information passing from temporal to parietal regions, whilst the ASD group utilise two distinct networks: one utilising motion-sensitive areas and another utilising form-selective areas. Moreover, a temporal-parietal link that is present in the TD group is missing in the ASD group. We tentatively propose that these differences may occur due to early dysfunctional connectivity in the brains of people with ASDs, which to some extent is compensated for by rewiring in high-functioning adults.


Proceedings of the National Academy of Sciences of the United States of America | 2013

Distinct patterns of functional brain connectivity correlate with objective performance and subjective beliefs

Pablo Barttfeld; Bruno Wicker; Phil McAleer; Pascal Belin; Yann Corentin Cojan; Martín Graziano; Ramón Carlos Leiguarda; Mariano Sigman

The degree of correspondence between objective performance and subjective beliefs varies widely across individuals. Here we demonstrate that functional brain network connectivity measured before exposure to a perceptual decision task covaries with individual objective (type-I performance) and subjective (type-II performance) accuracy. Increases in connectivity with type-II performance were observed in networks measured while participants directed attention inward (focus on respiration), but not in networks measured during states of neutral (resting state) or exogenous attention. Measures of type-I performance were less sensitive to the subjects’ specific attentional states from which the networks were derived. These results suggest the existence of functional brain networks indexing objective performance and accuracy of subjective beliefs distinctively expressed in a set of stable mental states.


Behavior Research Methods | 2008

Understanding intention from minimal displays of human activity

Phil McAleer; Frank E. Pollick

The impression of animacy from the motion of simple shapes typically relies on synthetically defined motion patterns resulting in pseudorepresentations of human movement. Thus, it is unclear how these synthetic motions relate to actual biological agents. To clarify this relationship, we introduce a novel approach that uses video processing to reduce full-video displays of human interactions to animacy displays, thus creating animate shapes whose motions are directly derived from human actions. Furthermore, this technique facilitates the comparison of interactions in animacy displays from different viewpoints—an area that has yet to be researched. We introduce two experiments in which animacy displays were created showing six dyadic interactions from two viewpoints, incorporating cues altering the quantity of the visual information available. With a six-alternative forced choice task, results indicate that animacy displays can be created via this naturalistic technique and reveal a previously unreported advantage for viewing intentional motion from an overhead viewpoint.
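
As a rough illustration of the kind of video processing described above, the following sketch reduces footage of two interacting people to an animacy display by segmenting moving bodies and redrawing each agent as a plain disc. The pipeline (OpenCV background subtraction, blob tracking, the file name, and all parameters) is a plausible assumption, not the authors' actual implementation.

```python
# Hypothetical sketch: reduce a video of two interacting people to an "animacy
# display" by segmenting moving bodies, locating their centroids, and redrawing
# each agent as a simple disc whose motion is derived directly from the footage.
# Assumes OpenCV >= 4 and an input file named "interaction.mp4".
import cv2
import numpy as np

cap = cv2.VideoCapture("interaction.mp4")   # hypothetical input file
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Keep the two largest blobs (the two actors) and draw them as plain discs
    # on a blank canvas, discarding all form information except position.
    canvas = np.zeros_like(frame)
    for blob in sorted(contours, key=cv2.contourArea, reverse=True)[:2]:
        m = cv2.moments(blob)
        if m["m00"] > 0:
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            cv2.circle(canvas, (cx, cy), 15, (255, 255, 255), -1)
    cv2.imshow("animacy display", canvas)
    if cv2.waitKey(30) & 0xFF == 27:        # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```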


Journal of Autism and Developmental Disorders | 2011

Intention Perception in High Functioning People with Autism Spectrum Disorders Using Animacy Displays Derived from Human Actions

Phil McAleer; Jim Kay; Frank E. Pollick; M. D. Rutherford

The perception of intent in Autism Spectrum Disorders (ASD) often relies on synthetic animacy displays. This study tests intention perception in ASD via animacy stimuli derived from human motion. Using a forced-choice task, 28 participants (14 with ASD; 14 age- and verbal-IQ-matched controls) categorized displays of Chasing, Fighting, Flirting, Following, Guarding and Playing, from two viewpoints (side, overhead) in both animacy and full-video displays. Detailed analysis revealed no differences between populations in accuracy or response patterns. Collapsing across groups revealed Following and Video displays to be the most accurately perceived. The stimuli and intentions used are compared to those of previous studies, and the implications of our results for the understanding of Theory of Mind in ASD are discussed.


Cognitive, Affective, & Behavioral Neuroscience | 2014

The role of kinematics in cortical regions for continuous human motion perception

Phil McAleer; Frank E. Pollick; Scott A. Love; Frances Crabbe; Jeffrey M. Zacks

It has been proposed that we make sense of the movements of others by observing fluctuations in the kinematic properties of their actions. At the neural level, activity in the human motion complex (hMT+) and posterior superior temporal sulcus (pSTS) has been implicated in this relationship. However, previous neuroimaging studies have largely utilized brief, diminished stimuli, and the role of relevant kinematic parameters for the processing of human action remains unclear. We addressed this issue by showing extended-duration natural displays of an actor engaged in two common activities to 12 participants in an fMRI study under passive viewing conditions. Our region-of-interest analysis focused on three neural areas (hMT+, pSTS, and fusiform face area) and was accompanied by a whole-brain analysis. The kinematic properties of the actor, particularly the speed of body part motion and the distance between body parts, were related to activity in hMT+ and pSTS. Whole-brain exploratory analyses revealed additional areas in posterior cortex, frontal cortex, and the cerebellum whose activity was related to these features. These results indicate that the kinematic properties of people's movements are continually monitored during everyday activity as a step to determining actions and intent.
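
The two kinematic properties named above can be computed from tracked body-part coordinates. The sketch below is a minimal illustration under the assumption that 2D positions of a set of body parts are available per frame; the synthetic data, frame rate, and number of parts are placeholders, not the paper's tracking setup.

```python
# Minimal sketch, assuming 2D tracked body-part coordinates are available:
# compute two kinematic time series of the kind related to hMT+/pSTS activity,
# namely the speed of body-part motion and the distance between body parts.
# The data here are synthetic stand-ins for real motion tracking.
import numpy as np

fps = 25.0
n_frames, n_parts = 500, 13                      # hypothetical tracking setup
rng = np.random.default_rng(1)
# positions[t, p] = (x, y) of body part p at frame t
positions = np.cumsum(rng.normal(scale=0.5, size=(n_frames, n_parts, 2)), axis=0)

# Speed: frame-to-frame displacement of each part, averaged over parts.
displacement = np.linalg.norm(np.diff(positions, axis=0), axis=2)   # (T-1, parts)
speed = displacement.mean(axis=1) * fps                             # units per second

# Spread: mean pairwise distance between body parts at each frame.
diffs = positions[:, :, None, :] - positions[:, None, :, :]         # (T, p, p, 2)
pairwise = np.linalg.norm(diffs, axis=-1)
iu = np.triu_indices(n_parts, k=1)
spread = pairwise[:, iu[0], iu[1]].mean(axis=1)                     # (T,)

print(speed[:5], spread[:5])
```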


Vision Research | 2009

Contribution of configural information in a direction discrimination task: Evidence using a novel masking paradigm

Lawrie S. McKay; David R. Simmons; Phil McAleer; Frank E. Pollick

Understanding how structure and motion information contribute to the perception of biological motion is often studied with masking techniques. Current techniques in masking point-light walkers typically rely on adding surrounding masking dots or altering phase relations between joints. Here, we demonstrate the use of novel stimuli that make it possible to determine the noise level at which the local motion cues mask the opposing configural cues without changing the number of overall points in the display. Results show improved direction discrimination when configural cues are present compared to when the identical local motion signals are present but lack configural information.


Art & Perception | 2014

Event segmentation and biological motion perception in watching dance

Katie Noble; Donald Glowinski; Helen Murphy; Corinne Jola; Phil McAleer; Nikhil Darshane; Kedzie Penfield; Sandhiya Kalyanasundaram; Antonio Camurri; Frank E. Pollick

We used a combination of behavioral, computational vision and fMRI methods to examine human brain activity while viewing a 386 s video of a solo Bharatanatyam dance. A computational analysis provided us with a Motion Index (MI) quantifying the silhouette motion of the dancer throughout the dance. A behavioral analysis using 30 naive observers provided us with the time points where observers were most likely to report event boundaries where one movement segment ended and another began. These behavioral and computational data were used to interpret the brain activity of a different set of 11 naive observers who viewed the dance video while brain activity was measured using fMRI. Results showed that the Motion Index was related to brain activity in a single cluster in the right Inferior Temporal Gyrus (ITG) in the vicinity of the Extrastriate Body Area (EBA). Perception of event boundaries in the video was related to the BA44 region of the right Inferior Frontal Gyrus, as well as extensive clusters of bilateral activity in the Inferior Occipital Gyrus which extended in the right hemisphere towards the posterior Superior Temporal Sulcus (pSTS).
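
For intuition, a silhouette-based motion index can be approximated by thresholding each frame into a dancer silhouette and measuring how much that silhouette changes from frame to frame. The sketch below shows one such approximation; the segmentation scheme, threshold, and normalisation are assumptions, not the paper's exact Motion Index definition.

```python
# Rough sketch of a silhouette-based motion index: segment the dancer from the
# background in each frame and score how much the silhouette changes over time.
# The background-subtraction approach and normalisation are illustrative only.
import cv2
import numpy as np

def motion_index(video_path: str, thresh: int = 30) -> np.ndarray:
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2()
    prev_silhouette = None
    index = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        silhouette = subtractor.apply(frame) > thresh          # boolean mask
        if prev_silhouette is not None:
            changed = np.logical_xor(silhouette, prev_silhouette).sum()
            area = max(int(silhouette.sum()), 1)               # avoid divide-by-zero
            index.append(changed / area)                       # motion per unit silhouette
        prev_silhouette = silhouette
    cap.release()
    return np.array(index)

# mi = motion_index("dance.mp4")   # hypothetical file; yields one value per frame
```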


Journal of Vision | 2004

Perceiving Animacy and Arousal in Transformed Displays of Human Interaction

Phil McAleer; Barbara Mazzarino; Gualtiero Volpe; Antonio Camurri; Helena Patterson; Frank E. Pollick

When viewing a moving abstract stimulus, people tend to attribute social meaning and purpose to the movement. The classic work of Heider and Simmel [1] investigated how observers would describe the movement of simple geometric shapes (circle, triangles, and a square) around a screen. A high proportion of participants reported seeing some form of purposeful interaction between the three abstract objects and defined this interaction as a social encounter. Various papers have subsequently found similar results [2,3] and gone on to show that, as Heider and Simmel suggested, the phenomenon was due more to the relationship in space and time of the objects than to any particular object characteristic. The research of Tremoulet and Feldman [4] has shown that the percept of animacy may be elicited with a solitary moving object. They asked observers to rate the movement of a single dot or rectangle for whether it was under the influence of an external force, or whether it was in control of its own motion. At mid-trajectory the shape would change speed or direction, or both. They found that shapes that either changed direction by more than 25 degrees from the original trajectory, or changed speed, were judged to be “more alive” than others. Further discussion and evidence of animacy with one or two small dots can be found in Gelman, Durgin and Kaufman [5]. Our aim was to further study this phenomenon by using a different method of stimulus production. Previous methods for producing displays of animate objects have relied either on handcrafted stimuli or on parametric variations of simple motion patterns. It is our aim to work towards a new automatic approach by taking actual human movements, transforming them into basic shapes, and exploring what motion properties need to be preserved to obtain animacy. Though the phenomenon of animacy has been shown for many years, using various displays, very few specific criteria have been set on the essential characteristics of the displays. Part of this research is to try to establish what movements result in percepts of animacy and, in turn, to give further understanding of the essential characteristics of human movement and social interaction. In this paper we discuss two experiments in which we examine how different transformations of an original video of a dance influence the perception of animacy. We also examine reports of arousal in Experiment 1 and of emotional engagement in Experiment 2.

Collaboration


Dive into Phil McAleer's collaborations.

Top Co-Authors

Karin Petrini (University College London)
Pascal Belin (Université de Montréal)
Carl Haakon Waadeland (Norwegian University of Science and Technology)