Michael Ingleby
University of Huddersfield
Publications
Featured research published by Michael Ingleby.
Tests and Proofs | 2009
Masood Mehmood Khan; Robert D. Ward; Michael Ingleby
Earlier researchers were able to extract transient facial thermal features from thermal infrared images (TIRIs) to make binary distinctions between the expressions of affective states. However, effective human-computer interaction would require machines to distinguish between the subtle facial expressions of affective states. This work, for the first time, attempts to use transient facial thermal features for recognizing a much wider range of facial expressions. A database of 324 time-sequential, visible-spectrum, and thermal facial images was developed, representing different facial expressions from 23 participants in different situations. A novel facial thermal feature extraction, selection, and classification approach was developed and invoked on various Gaussian mixture models constructed using: neutral and pretended happy and sad faces; faces with multiple positive and negative facial expressions; faces with neutral and six (pretended) basic facial expressions; and faces with evoked happiness, sadness, disgust, and anger. This work demonstrates that (1) infrared imaging can be used to observe affective-state-specific facial thermal variations, (2) pixel grey-level analysis of TIRIs can help localise significant facial thermal feature points along the major facial muscles, and (3) cluster-analytic classification of transient thermal features can help distinguish between the facial expressions of affective states in an optimized eigenspace of input thermal feature vectors. The observed classification results showed the influence of Gaussian mixture model structure on classifier performance. The work also identified some pertinent directions for future research on the use of facial thermal features in automated facial expression classification and affect recognition.
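A minimal sketch, not from the paper, of the general eigenspace-plus-Gaussian-mixture classification idea described above; it assumes scikit-learn and hypothetical arrays of thermal feature vectors (X_train, y_train, x_test):

# Sketch only: project thermal feature vectors into a reduced eigenspace with PCA,
# fit one Gaussian mixture per expression class, and label a test vector with the
# class whose mixture gives the highest log-likelihood.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def fit_thermal_classifier(X_train, y_train, n_pcs=10, n_mixtures=2):
    pca = PCA(n_components=n_pcs).fit(X_train)        # optimized eigenspace
    Z = pca.transform(X_train)
    models = {}
    for label in np.unique(y_train):                  # one mixture model per expression class
        models[label] = GaussianMixture(n_components=n_mixtures,
                                        covariance_type="full",
                                        random_state=0).fit(Z[y_train == label])
    return pca, models

def classify(pca, models, x_test):
    z = pca.transform(x_test.reshape(1, -1))
    # Assign the class whose Gaussian mixture scores the test vector highest.
    return max(models, key=lambda label: models[label].score(z))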
Robotics, Automation and Mechatronics | 2006
Masood Mehmood Khan; Robert D. Ward; Michael Ingleby
Useful information about affect and affective states can be extracted from physiological signals, even under difficult lighting and pose conditions. However, little work has been done on using physiological signals in automated affect recognition systems. We employed measurements of facial skin temperature variation to develop a non-intrusive automated facial expression classification system. Variances in thermal intensity values recorded at thermally significant locations on human faces were used to discern between normal, pretended happy, and pretended sad facial expressions of affective states. A three-step algorithmic approach was used to construct the classifier: derivation of principal components, stepwise selection of the optimal and most discriminating features, and linear discriminant analysis within the reduced optimal eigenspace. The resulting classifier performed at an impressively low error rate of 16.2%.
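As a rough illustration of the three-step approach summarised above (principal components, stepwise feature selection, linear discriminant analysis), the following sketch assumes scikit-learn and hypothetical training arrays; it is not the authors' implementation:

# Sketch only: PCA -> forward stepwise selection of discriminating components -> LDA.
from sklearn.pipeline import Pipeline
from sklearn.decomposition import PCA
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def build_expression_classifier(n_pcs=20, n_selected=5):
    return Pipeline([
        ("pca", PCA(n_components=n_pcs)),                       # step 1: principal components
        ("select", SequentialFeatureSelector(                   # step 2: stepwise selection
            LinearDiscriminantAnalysis(),
            n_features_to_select=n_selected,
            direction="forward")),
        ("lda", LinearDiscriminantAnalysis()),                  # step 3: LDA in reduced eigenspace
    ])

# Hypothetical usage:
# clf = build_expression_classifier()
# clf.fit(X_train, y_train)
# error_rate = 1 - clf.score(X_test, y_test)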
IEEE Toronto International Conference Science and Technology for Humanity | 2009
Farbod Hosseyndoost Foomany; Alex Hirschfield; Michael Ingleby
Research results have shown that spoof attacks pose severe security threats to biometric verification systems. Nevertheless, the literature lacks a comprehensive, flexible, and dynamic framework for the security analysis of biometric, and especially voice-based, verification systems when spoof attacks are taken into account. This paper aims at highlighting the vulnerabilities, classifying the threats, and clarifying the requirements for such a dynamic framework, which translates vulnerability evaluation results into legacy knowledge in the security domain. The proposed design and architecture facilitate comparison of two systems as well as definition of threats in an independent and isolated way.
IEEE Transactions on Affective Computing | 2017
Masood Mehmood Khan; Robert D. Ward; Michael Ingleby
Automated assessment of affect and arousal level can help psychologists and psychiatrists in clinical diagnoses, and may enable affect-aware robot-human interaction. This work identifies major difficulties in automating affect and arousal assessment and attempts to overcome some of them. We first analyze thermal infrared images and examine how changes in affect and/or arousal level cause hæmodynamic variations concentrated along certain facial muscles. These concentrations are used to measure affect- and arousal-induced facial thermal variations. In step 1 of a two-step pattern recognition schema, ‘between-affect’ and ‘between-arousal-level’ variations are used to derive facial thermal features as principal components (PCs) of the facial thermal measurements. The most influential of these PCs are used to cluster the feature space for different affects and subsequently assign a set of thermal features to an affect cluster. In step 2, affect clusters are partitioned into high, medium, and mild arousal levels. The distance between a test face vector and the centroids of the sub-clusters at the three arousal levels belonging to the affective state identified in step 1 is used to determine the arousal level of that affective state.
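The two-step schema can be pictured with a small sketch. The function and variable names below (fit_centroids, assess, per-sample affect and arousal label arrays) are illustrative assumptions rather than the published code; distances are taken to sub-cluster centroids in a PCA eigenspace, as described above:

# Sketch only: step 1 assigns a test vector to an affect cluster; step 2 picks the
# arousal level whose sub-cluster centroid, within that affect, lies nearest.
import numpy as np
from sklearn.decomposition import PCA

def fit_centroids(X, affects, arousals, n_pcs=8):
    # X: (n_samples, n_features) thermal measurements; affects/arousals: NumPy label arrays.
    pca = PCA(n_components=n_pcs).fit(X)
    Z = pca.transform(X)
    centroids = {}                                   # {(affect, arousal_level): centroid}
    for a in np.unique(affects):
        for r in np.unique(arousals):
            mask = (affects == a) & (arousals == r)
            if mask.any():
                centroids[(a, r)] = Z[mask].mean(axis=0)
    return pca, centroids

def assess(pca, centroids, x_test):
    z = pca.transform(x_test.reshape(1, -1))[0]
    # Step 1: affect of the overall nearest sub-cluster centroid.
    affect = min(centroids, key=lambda k: np.linalg.norm(z - centroids[k]))[0]
    # Step 2: nearest arousal-level centroid among that affect's sub-clusters.
    _, arousal = min((k for k in centroids if k[0] == affect),
                     key=lambda k: np.linalg.norm(z - centroids[k]))
    return affect, arousal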
Computer Analysis of Images and Patterns | 2009
Masood Mehmood Khan; Robert D. Ward; Michael Ingleby
The ability to distinguish feigned from involuntary expressions of emotions could help in the investigation and treatment of neuropsychiatric and affective disorders and in the detection of malingering. This work investigates differences in emotion-specific patterns of thermal variations along the major facial muscles. Using experimental data extracted from 156 images, we attempted to classify patterns of emotion-specific thermal variations into neutral, and voluntary and involuntary expressions of positive and negative emotive states. Initial results suggest (i) each facial muscle exhibits a unique thermal response to various emotive states; (ii) the pattern of thermal variances along the facial muscles may assist in classifying voluntary and involuntary facial expressions; and (iii) facial skin temperature measurements along the major facial muscles may be used in automated emotion assessment.
Meeting of the Association for Computational Linguistics | 1998
Michael Ingleby; Wiebke Brockhaus
We demonstrate the feasibility of using unary primes in speech-driven language processing. Proponents of Government Phonology (one of several phonological frameworks in which speech segments are represented as combinations of relatively few subsegmental primes) claim that primes are acoustically realisable. This claim is examined critically by searching out signatures for primes in multispeaker speech signal data. In response to a wide variation in the ease of detection of primes, it is proposed that the computational approach to phonology-based, speech-driven software should be organised in stages. After each stage, computational processes like segmentation and lexical access can be launched to run concurrently with later stages of prime detection.
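A schematic sketch of the staged organisation proposed above, using Python's standard concurrency tools; the prime groupings and the detect_primes, segment, and lexical_access functions are placeholders assumed for illustration, not part of the paper:

# Sketch only: easier-to-detect primes are handled in earlier stages, and after each
# stage segmentation and lexical access are launched to run concurrently with the
# remaining detection stages.
from concurrent.futures import ThreadPoolExecutor

PRIME_STAGES = [["A", "I", "U"],     # stage 1: primes assumed easier to detect (illustrative grouping)
                ["H", "N", "?"]]     # stage 2: primes assumed harder to detect

def detect_primes(signal, primes):
    # Placeholder: return detections of the given primes in the speech signal.
    return {p: [] for p in primes}

def segment(detections):             # placeholder downstream process
    return detections

def lexical_access(detections):      # placeholder downstream process
    return detections

def staged_processing(signal):
    detections = {}
    with ThreadPoolExecutor() as pool:
        pending = []
        for stage in PRIME_STAGES:
            detections.update(detect_primes(signal, stage))
            # Launch segmentation and lexical access on results so far; they run
            # concurrently with the next stage of prime detection.
            pending.append(pool.submit(segment, dict(detections)))
            pending.append(pool.submit(lexical_access, dict(detections)))
        return [f.result() for f in pending]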
ACM Transactions on Autonomous and Adaptive Systems | 2006
Masood Mehmood Khan; Michael Ingleby; Robert D. Ward
Measurement | 2006
Hartmut Kieckhoefer; Michael Ingleby; Gary Lucas
Proceedings of the Annual Meeting of the Cognitive Science Society | 2005
Azra N. Ali; Michael Ingleby
AVSP | 2005
Azra N. Ali; Ashraf Hassan-Haj; Michael Ingleby; Ali Idrissi