Publication


Featured research published by Syed Zulqarnain Gilani.


Proceedings of the Royal Society B: Biological Sciences | 2015

Prenatal testosterone exposure is related to sexually dimorphic facial morphology in adulthood

Andrew J. O. Whitehouse; Syed Zulqarnain Gilani; Faisal Shafait; Ajmal S. Mian; Diana Weiting Tan; Murray T. Maybery; Jeffrey A. Keelan; Roger Hart; David J. Handelsman; Mithran Goonewardene; Peter R. Eastwood

Prenatal testosterone may have a powerful masculinizing effect on postnatal physical characteristics. However, no study has directly tested this hypothesis. Here, we report a 20-year follow-up study that measured testosterone concentrations from the umbilical cord blood of 97 male and 86 female newborns, and obtained three-dimensional facial images of these participants in adulthood (range: 21–24 years). Twenty-three Euclidean and geodesic distances were measured from the facial images and an algorithm identified a set of six distances that most effectively distinguished adult males from females. From these distances, a ‘gender score’ was calculated for each face, indicating the degree of masculinity or femininity. Higher cord testosterone levels were associated with masculinized facial features when males and females were analysed together (n = 183; r = −0.59), as well as when males (n = 86; r = −0.55) and females (n = 97; r = −0.48) were examined separately (p-values < 0.001). The relationships remained significant and substantial after adjusting for potentially confounding variables. Adult circulating testosterone concentrations were available for males but showed no statistically significant relationship with gendered facial morphology (n = 85, r = 0.01, p = 0.93). This study provides the first direct evidence of a link between prenatal testosterone exposure and human facial structure.
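The gender scoring step can be pictured with a short sketch. This is not the authors' code: it assumes the six selected distances are already available as a per-subject matrix and uses a linear discriminant projection to produce one masculinity/femininity score per face; all names are placeholders.

```python
# Illustrative sketch only (not the authors' code): turn a small set of
# sexually dimorphic facial distances into a single continuous "gender score"
# by projecting onto a linear discriminant axis.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def gender_scores(distances, sex_labels):
    """distances: (n_subjects, n_features) facial distances (e.g. mm);
    sex_labels: 0 = female, 1 = male. Returns one score per subject,
    oriented so that larger values indicate a more masculine face."""
    lda = LinearDiscriminantAnalysis(n_components=1)
    scores = lda.fit_transform(distances, sex_labels).ravel()
    if scores[sex_labels == 1].mean() < scores[sex_labels == 0].mean():
        scores = -scores   # flip so the male mean lies on the positive side
    return scores

# Usage with random stand-in data (183 subjects, 6 selected distances):
rng = np.random.default_rng(0)
X = rng.normal(size=(183, 6))
y = rng.integers(0, 2, size=183)
print(gender_scores(X, y)[:5])
```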


Computer Vision and Pattern Recognition | 2015

Shape-based automatic detection of a large number of 3D facial landmarks

Syed Zulqarnain Gilani; Faisal Shafait; Ajmal S. Mian

We present an algorithm for automatic detection of a large number of anthropometric landmarks on 3D faces. Our approach does not use texture and is completely shape based in order to detect landmarks that are morphologically significant. The proposed algorithm evolves level set curves with adaptive geometric speed functions to automatically extract effective seed points for dense correspondence. Correspondences are established by minimizing the bending energy between patches around seed points of given faces to those of a reference face. Given its hierarchical structure, our algorithm is capable of establishing thousands of correspondences between a large number of faces. Finally, a morphable model based on the dense corresponding points is fitted to an unseen query face for transfer of correspondences and hence automatic detection of landmarks. The proposed algorithm can detect any number of pre-defined landmarks including subtle landmarks that are even difficult to detect manually. Extensive experimental comparison on two benchmark databases containing 6,507 scans shows that our algorithm outperforms six state-of-the-art algorithms.
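The bending-energy criterion used to match patches around seed points can be illustrated with a minimal thin-plate-spline sketch in 2D; the paper matches 3D surface patches, so treat this only as an illustration of the quantity being minimised, and the function below is our own, not the paper's implementation.

```python
# Minimal 2D sketch of thin-plate-spline bending energy between two point
# patches; an illustration of the matching criterion, not the paper's code.
import numpy as np

def tps_bending_energy(src, dst):
    """src, dst: (n, 2) corresponding points; returns a scalar energy."""
    n = src.shape[0]
    d = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(d > 0, d ** 2 * np.log(d), 0.0)   # TPS kernel U(r) = r^2 log r
    P = np.hstack([np.ones((n, 1)), src])              # affine terms
    L = np.zeros((n + 3, n + 3))
    L[:n, :n], L[:n, n:], L[n:, :n] = K, P, P.T
    rhs = np.zeros((n + 3, 2))
    rhs[:n] = dst
    W = np.linalg.solve(L, rhs)[:n]                    # non-affine warp weights
    return float(np.trace(W.T @ K @ W))                # bending energy of the warp
```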


Digital Image Computing: Techniques and Applications | 2013

Biologically Significant Facial Landmarks: How Significant Are They for Gender Classification?

Syed Zulqarnain Gilani; Faisal Shafait; Ajmal S. Mian

Automatic gender classification has many applications in human computer interaction. However, determining the gender of an unseen face is challenging because of the diversity and variation among human faces. In this paper, we explore the importance of biologically significant facial landmarks for gender classification and propose a fully automatic gender classification algorithm. We extract 3D Euclidean and geodesic distances between these landmarks and use feature selection to determine the relative importance of the biological landmarks for classifying gender. Unlike existing techniques, our algorithm is fully automatic since all landmarks are automatically detected. Experiments on FRGC v2, one of the largest 3D face databases, show that our algorithm outperforms all existing techniques by a significant margin.
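As a rough illustration of the feature pipeline, the sketch below builds inter-landmark Euclidean distance features and ranks them with an ANOVA F-score; the paper also uses geodesic distances (which require the surface mesh) and its own feature-selection method, so this is a stand-in rather than the published algorithm.

```python
# Rough stand-in for the feature pipeline (not the published algorithm):
# build inter-landmark Euclidean distance features and rank them by an
# ANOVA F-score; geodesic distances would additionally need the face mesh.
import numpy as np
from itertools import combinations
from sklearn.feature_selection import f_classif

def landmark_distance_features(landmarks):
    """landmarks: (n_subjects, n_landmarks, 3) 3D landmark coordinates."""
    pairs = list(combinations(range(landmarks.shape[1]), 2))
    feats = np.stack([np.linalg.norm(landmarks[:, i] - landmarks[:, j], axis=1)
                      for i, j in pairs], axis=1)
    return feats, pairs

def rank_features(feats, gender):
    """Return feature indices ordered from most to least discriminative."""
    f_scores, _ = f_classif(feats, gender)
    return np.argsort(f_scores)[::-1]
```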


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2018

Dense 3D Face Correspondence

Syed Zulqarnain Gilani; Ajmal S. Mian; Faisal Shafait; Ian D. Reid

We present an algorithm that automatically establishes dense correspondences between a large number of 3D faces. Starting from automatically detected sparse correspondences on the outer boundary of 3D faces, the algorithm triangulates existing correspondences and expands them iteratively by matching points of distinctive surface curvature along the triangle edges. After exhausting keypoint matches, further correspondences are established by generating evenly distributed points within triangles by evolving level set geodesic curves from the centroids of large triangles. A deformable model (K3DM) is constructed from the densely corresponded faces and an algorithm is proposed for morphing the K3DM to fit unseen faces. This algorithm alternates between rigid alignment of the unseen face and regularized morphing of the deformable model. We have extensively evaluated the proposed algorithms on synthetic data and real 3D faces from the FRGCv2, Bosphorus, BU3DFE and UND Ear databases using quantitative and qualitative benchmarks. Our algorithm achieved dense correspondences with a mean localisation error of 1.28 mm on synthetic faces and detected 14 anthropometric landmarks on unseen real faces from the FRGCv2 database with 3 mm precision. Furthermore, our deformable model fitting algorithm achieved 98.5 percent face recognition accuracy on the FRGCv2 database and 98.6 percent on the Bosphorus database. Our dense model is also able to generalize to unseen datasets.
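The fitting loop described above can be sketched as follows. It is a minimal sketch under stated assumptions: the scan is taken to be already in vertex-wise correspondence with the model, and the names, shapes and regularisation weight are illustrative rather than taken from the paper.

```python
# Sketch of the fitting loop (assumed names and shapes): alternate rigid
# alignment of the scan with a regularised least-squares update of the
# deformable model's shape coefficients.
import numpy as np

def fit_deformable_model(scan, mean, basis, n_iters=10, lam=0.1):
    """scan, mean: (N, 3) points in vertex-wise correspondence; basis: (3N, k)."""
    alpha = np.zeros(basis.shape[1])
    for _ in range(n_iters):
        model = (mean.reshape(-1) + basis @ alpha).reshape(-1, 3)
        # Rigid (Procrustes) alignment of the scan to the current model estimate
        # (reflection check omitted for brevity).
        mu_s, mu_m = scan.mean(axis=0), model.mean(axis=0)
        U, _, Vt = np.linalg.svd((scan - mu_s).T @ (model - mu_m))
        R = Vt.T @ U.T
        aligned = (scan - mu_s) @ R.T + mu_m
        # Regularised morphing: ridge solution for the shape coefficients.
        residual = aligned.reshape(-1) - mean.reshape(-1)
        A = basis.T @ basis + lam * np.eye(basis.shape[1])
        alpha = np.linalg.solve(A, basis.T @ residual)
    return alpha
```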


Pattern Recognition | 2017

Deep, Dense and Accurate 3D Face Correspondence for Generating Population Specific Deformable Models

Syed Zulqarnain Gilani; Ajmal S. Mian; Peter R. Eastwood

We present a multilinear algorithm to automatically establish dense point-to-point correspondence over an arbitrarily large number of population specific 3D faces across identities, facial expressions and poses. The algorithm is initialized with a subset of anthropometric landmarks detected by our proposed Deep Landmark Identification Network which is trained on synthetic images. The landmarks are used to segment the 3D face into Voronoi regions by evolving geodesic level set curves. Exploiting the intrinsic features of these regions, we extract discriminative keypoints on the facial manifold to elastically match the regions across faces for establishing dense correspondence. Finally, we generate a Region based 3D Deformable Model which is fitted to unseen faces to transfer the correspondences. We evaluate our algorithm on the tasks of facial landmark detection and recognition using two benchmark datasets. Comparison with thirteen state-of-the-art techniques shows the efficacy of our algorithm.
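A simplified way to picture the region step is to assign every mesh vertex to its geodesically nearest landmark, which yields the Voronoi-like regions the abstract refers to. The sketch below approximates geodesic distance with shortest paths along mesh edges rather than the paper's level-set curve evolution, so it only illustrates the idea.

```python
# Simplified stand-in for the region step: label every vertex by its nearest
# landmark, with geodesic distance approximated by shortest paths along mesh
# edges (the paper instead evolves geodesic level-set curves).
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def voronoi_regions(vertices, faces, landmark_idx):
    """vertices: (N, 3); faces: (F, 3) vertex indices; landmark_idx: (L,)."""
    edges = np.vstack([faces[:, [0, 1]], faces[:, [1, 2]], faces[:, [2, 0]]])
    edges = np.unique(np.sort(edges, axis=1), axis=0)      # drop duplicate edges
    w = np.linalg.norm(vertices[edges[:, 0]] - vertices[edges[:, 1]], axis=1)
    graph = csr_matrix((np.r_[w, w],
                        (np.r_[edges[:, 0], edges[:, 1]],
                         np.r_[edges[:, 1], edges[:, 0]])),
                       shape=(len(vertices), len(vertices)))
    dist = dijkstra(graph, indices=landmark_idx)            # (L, N) path lengths
    return np.argmin(dist, axis=0)                          # region label per vertex
```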


International Conference on Pattern Recognition | 2014

Perceptual Differences between Men and Women: A 3D Facial Morphometric Perspective

Syed Zulqarnain Gilani; Ajmal S. Mian

Understanding the features employed by the human visual system in gender classification is considered a critical step towards improving machine-based gender classification systems. We propose the use of 3D Euclidean and geodesic distances between biologically significant facial landmarks to classify gender. We perform five different experiments on the BU-3DFE face database to look for more representative features that can replicate our visual system. Based on our experiments, we suggest that the human visual system looks at the ratio of 3D Euclidean to geodesic distances, as these features can classify facial gender with an accuracy of 99.32%. The features selected by our proposed gender classification experiment are robust to ethnicity and moderate changes in expression. They also replicate the perceptual gender bias towards certain features and hence are good candidates for a more representative feature set.
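For completeness, the ratio feature mentioned above is trivial to form once both distance types have been measured; the sketch below assumes they are given as per-subject matrices (geodesic distances require the surface mesh and are not computed here).

```python
# Trivial sketch of the ratio feature, assuming both distance types have
# already been measured per subject (geodesic distances need the mesh).
import numpy as np

def ratio_features(euclidean_d, geodesic_d, eps=1e-9):
    """Inputs: (n_subjects, n_pairs) distance matrices. The ratio is at most 1
    and approaches 1 where the facial surface between two landmarks is flat."""
    return euclidean_d / np.maximum(geodesic_d, eps)
```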


Journal of Neurodevelopmental Disorders | 2015

Sexually dimorphic facial features vary according to level of autistic-like traits in the general population.

Syed Zulqarnain Gilani; Diana Weiting Tan; Suzanna N. Russell-Smith; Murray T. Maybery; Ajmal S. Mian; Peter R. Eastwood; Faisal Shafait; Mithran Goonewardene; Andrew J. O. Whitehouse

Background: In a recent study, Bejerot et al. observed that several physical features (including faces) of individuals with an autism spectrum disorder (ASD) were more androgynous than those of their typically developed counterparts, suggesting that ASD may be understood as a ‘gender defiant’ disorder. These findings are difficult to reconcile with the hypermasculinisation account, which proposes that ASD may be an exaggerated form of cognitive and biological masculinity. The current study extended these data by first identifying six facial features that best distinguished males and females from the general population and then examining these features in typically developing groups selected for high and low levels of autistic-like traits.

Methods: In study 1, three-dimensional (3D) facial images were collected from 208 young adult males and females recruited from the general population. Twenty-three facial distances were measured from these images and a gender classification and scoring algorithm was employed to identify a set of six facial features that most effectively distinguished male from female faces. In study 2, measurements of these six features were compared for groups of young adults selected for high (n = 46) or low (n = 66) levels of autistic-like traits.

Results: For each sex, four of the six sexually dimorphic facial distances significantly differentiated participants with high levels of autistic-like traits from those with low trait levels. All four features were less masculinised for high-trait males compared to low-trait males. Three of four features were less feminised for high-trait females compared to low-trait females. One feature was, however, not consistent with the general pattern of findings and was more feminised among females who reported more autistic-like traits. Based on the four significantly different facial distances for each sex, discriminant function analysis correctly classified 89.7% of the males and 88.9% of the females into their respective high- and low-trait groups.

Conclusions: The current data provide support for Bejerot et al.’s androgyny account since males and females with high levels of autistic-like traits generally showed less sex-typical facial features than individuals with low levels of autistic-like traits.


Scientific Reports | 2017

Hypermasculinised facial morphology in boys and girls with Autism Spectrum Disorder and its association with symptomatology

Diana Weiting Tan; Syed Zulqarnain Gilani; Murray T. Maybery; Ajmal S. Mian; Anna Hunt; Mark Walters; Andrew J. O. Whitehouse

Elevated prenatal testosterone exposure has been associated with Autism Spectrum Disorder (ASD) and facial masculinity. By employing three-dimensional (3D) photogrammetry, the current study investigated whether prepubescent boys and girls with ASD present increased facial masculinity compared to typically-developing controls. There were two phases to this research. 3D facial images were obtained from a normative sample of 48 boys and 53 girls (3.01–12.44 years old) to determine typical facial masculinity/femininity. The sexually dimorphic features were used to create a continuous ‘gender score’, indexing degree of facial masculinity. Gender scores based on 3D facial images were then compared for 54 autistic and 54 control boys (3.01–12.52 years old), and also for 20 autistic and 60 control girls (4.24–11.78 years old). For each sex, increased facial masculinity was observed in the ASD group relative to the control group. Further analyses revealed that increased facial masculinity in the ASD group correlated with more social-communication difficulties based on the Social Affect score derived from the Autism Diagnostic Observation Schedule-Generic (ADOS-G). There was no association between facial masculinity and the derived Restricted and Repetitive Behaviours score. This is the first study demonstrating facial hypermasculinisation in ASD and its relationship to social-communication difficulties in prepubescent children.


Digital Image Computing: Techniques and Applications | 2016

Towards Large-Scale 3D Face Recognition

Syed Zulqarnain Gilani; Ajmal S. Mian

3D face recognition holds great promise in achieving robustness to pose, expressions and occlusions. However, 3D face recognition algorithms are still far behind their 2D counterparts due to the lack of large-scale datasets. We present a model based algorithm for 3D face recognition and test its performance by combining two large public datasets of 3D faces. We propose a Fully Convolutional Deep Network (FCDN) to initialize our algorithm. Reliable seed points are then extracted from each 3D face by evolving level set curves with a single curvature dependent adaptive speed function. We then establish dense correspondence between the faces in the training set by matching the surface around the seed points on a template face to the ones on the target faces. A morphable model is then fitted to probe faces and face recognition is performed by matching the parameters of the probe and gallery faces. Our algorithm achieves state-of-the-art landmark localization results. Face recognition results on the combined FRGCv2 and Bosphorus datasets show that our method is effective in recognizing query faces with real world variations in pose and expression, and with occlusion and missing data, despite a large gallery. Comparing results on the individual and combined datasets shows that the recognition accuracy drops as the size of the gallery increases.
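The final matching step, comparing the fitted model parameters of probe and gallery faces, could look like the sketch below. The abstract does not specify the similarity measure, so cosine similarity is an assumption here, and all names are illustrative.

```python
# Assumed matching step (the paper does not commit to a specific metric):
# identify a probe by cosine similarity between fitted model coefficients.
import numpy as np

def identify(probe_alpha, gallery_alphas, gallery_ids):
    """probe_alpha: (k,); gallery_alphas: (G, k); gallery_ids: (G,) labels."""
    a = probe_alpha / np.linalg.norm(probe_alpha)
    B = gallery_alphas / np.linalg.norm(gallery_alphas, axis=1, keepdims=True)
    return gallery_ids[int(np.argmax(B @ a))]
```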


Workshop on Applications of Computer Vision | 2014

Gradient based efficient feature selection

Syed Zulqarnain Gilani; Faisal Shafait; Ajmal S. Mian

Selecting a reduced set of relevant and non-redundant features for supervised classification problems is a challenging task. We propose a gradient based feature selection method which can search the feature space efficiently and select a reduced set of representative features. We test our proposed algorithm on five small and medium-sized pattern classification datasets as well as two large 3D face datasets for computer vision applications. Comparison with state-of-the-art wrapper and filter methods shows that our proposed technique yields better classification results with fewer evaluations of the target classifier. The feature subset selected by our algorithm is representative of the classes in the data and has the least variation in classification accuracy.
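The abstract does not spell out the gradient computation, so the sketch below is only one plausible reading rather than the paper's procedure: score each feature by the gradient of a logistic-regression loss with respect to a per-feature gate, then keep the highest-scoring subset. It illustrates gradient-guided selection as opposed to exhaustively evaluating subsets with a wrapper.

```python
# One plausible reading, not the paper's procedure: score each feature by the
# gradient of a logistic-regression loss with respect to a per-feature gate
# (evaluated at gate = 1), then keep the top-scoring subset.
import numpy as np
from sklearn.linear_model import LogisticRegression

def gradient_feature_scores(X, y):
    """X: (n, d) standardised features; y: (n,) labels in {0, 1}."""
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    p = clf.predict_proba(X)[:, 1]
    # d(loss)/d(gate_j) = w_j * sum_i (p_i - y_i) * x_ij at gate = 1.
    grad = clf.coef_.ravel() * ((p - y) @ X)
    return np.abs(grad)

def select_top_k(X, y, k):
    """Indices of the k features with the largest gradient magnitude."""
    return np.argsort(gradient_feature_scores(X, y))[::-1][:k]
```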

Collaboration


Dive into Syed Zulqarnain Gilani's collaborations.

Top Co-Authors

Ajmal S. Mian (University of Western Australia)
Faisal Shafait (National University of Sciences and Technology)
Andrew J. O. Whitehouse (University of Western Australia)
Diana Weiting Tan (University of Western Australia)
Murray T. Maybery (University of Western Australia)
Peter R. Eastwood (University of Western Australia)
Naveed Iqbal Rao (National University of Sciences and Technology)
Anna Hunt (University of Western Australia)