Alwyn Goh
Universiti Sains Malaysia
Publications
Featured research published by Alwyn Goh.
IEEE Transactions on Pattern Analysis and Machine Intelligence | 2006
Andrew Beng Jin Teoh; Alwyn Goh; David Chek Ling Ngo
Biometric analysis for identity verification is becoming a widespread reality. Such implementations necessitate large-scale capture and storage of biometric data, which raises serious issues in terms of data privacy and (if such data is compromised) identity theft. These problems stem from the essential permanence of biometric data, which (unlike secret passwords or physical tokens) cannot be refreshed or reissued if compromised. Our previously presented biometric-hash framework prescribes the integration of external (password or token-derived) randomness with user-specific biometrics, resulting in bitstring outputs with security characteristics (i.e., noninvertibility) comparable to cryptographic ciphers or hashes. The resultant BioHashes are hence cancellable, i.e., straightforwardly revoked and reissued (via refreshed password or reissued token) if compromised. BioHashing furthermore enhances recognition effectiveness, which is explained in this paper as arising from the random multispace quantization (RMQ) of biometric and external random inputs.
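The RMQ idea described above can be sketched as follows: project a biometric feature vector onto token-seeded pseudorandom directions and threshold each projection to one bit, so a refreshed token yields a fresh, uncorrelated code. This is a minimal illustrative sketch, not the paper's exact construction; the function name `biohash`, the seed handling, and the parameter choices are assumptions.

```python
import numpy as np

def biohash(feature, token_seed, n_bits=80):
    """Illustrative random multispace quantization (RMQ) sketch:
    inner products against token-seeded pseudorandom directions,
    each thresholded to a single output bit."""
    rng = np.random.default_rng(token_seed)   # externally supplied token/password seed
    # Pseudorandom projection directions, orthonormalised via QR
    basis, _ = np.linalg.qr(rng.standard_normal((len(feature), n_bits)))
    projections = feature @ basis             # one inner product per output bit
    return (projections > 0).astype(np.uint8) # threshold at zero -> bitstring

# Cancellability: a new token seed produces a completely different BioHash
feature = np.random.default_rng(1).standard_normal(128)
code_a = biohash(feature, token_seed=42)
code_b = biohash(feature, token_seed=43)
```

Because the projection depends on both the biometric and the external randomness, revocation amounts to reissuing the token seed.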
international conference on communications | 2003
Alwyn Goh; David Chek Ling Ngo
We outline cryptographic key-computation from biometric data based on error-tolerant transformation of continuous-valued face eigenprojections to zero-error bitstrings suitable for cryptographic use. Bio-hashing is based on iterated inner-products between pseudorandom and user-specific eigenprojections, each of which extracts a single bit from the face data. This discretisation is highly tolerant of data-capture offsets, with same-user face data resulting in highly correlated bitstrings. The resultant user identification in terms of a small bitstring-set is then securely reduced to a single cryptographic key via Shamir secret-sharing. Generation of the pseudorandom eigenprojection sequence can be securely parameterised via incorporation of physical tokens. Tokenised bio-hashing is rigorously protective of the face data, with security comparable to cryptographic hashing of token and knowledge key-factors. Our methodology has several major advantages over conventional biometric analysis, i.e., elimination of false accepts (FA) without unacceptable compromise in terms of more probable false rejects (FR), straightforward key-management, and cryptographically rigorous commitment of biometric data in conjunction with verification thereof.
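The Shamir secret-sharing reduction mentioned above can be sketched in a few lines: a secret is embedded as the constant term of a random degree-(k−1) polynomial over a prime field, and any k shares recover it by Lagrange interpolation at zero. This is a generic textbook sketch, assuming an illustrative Mersenne-prime field; the paper's exact parameterisation (how bitstring fragments map to shares) is not reproduced here.

```python
import random

P = 2**127 - 1  # illustrative prime field (a Mersenne prime)

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789
```

The error tolerance comes from the threshold: a quorum of correctly recovered shares suffices even if some extracted bits are wrong.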
Computers & Security | 2004
Andrew Beng Jin Teoh; David Chek Ling Ngo; Alwyn Goh
Among the various computer security techniques practiced today, cryptography has been identified as one of the most important components of an integrated digital security system. Cryptographic techniques such as encryption can provide very long keys that need not be remembered, but these are in turn protected by simple passwords, defeating their purpose. In this paper, we propose a novel two-stage technique to generate personalized cryptographic keys from the face biometric, which offers an inextricable link to its owner. In the first stage, an integral transform of the biometric input is discretised against a set of tokenised pseudorandom numbers to produce a bit representation, coined the FaceHash. In the second stage, the FaceHash is securely reduced to a single cryptographic key via Shamir secret-sharing. Tokenised FaceHashing is rigorously protective of the face data, with security comparable to cryptographic hashing of token and knowledge key-factors. The key is constructed to resist cryptanalysis even against an adversary who captures the user device or the feature descriptor.
IEEE Transactions on Circuits and Systems for Video Technology | 2006
David Chek Ling Ngo; Andrew Beng Jin Teoh; Alwyn Goh
In this paper, we describe a biometric hash algorithm for robust extraction of bits from face images. While a face-recognition system has high acceptability, its accuracy is low. The problem arises because of insufficient capability of representing features and variations in data. Thus, we use dimensionality reduction to improve the capability to represent features, error correction to improve robustness with respect to within-class variations, and random projection and orthogonalization to improve discrimination among classes. Specifically, we describe several dimensionality-reduction techniques with biometric hashing enhancement for various numbers of bits extracted. The theoretical results are evaluated on the FERET face database showing that the enhanced methods significantly outperform the corresponding raw methods when the number of extracted bits reaches 100. The improvements of the postprocessing stage for principal component analysis (PCA), Wavelet Transform with PCA, Fisher linear discriminant, Wavelet Transform, and Wavelet Transform with Fourier-Mellin Transform are 98.02%, 95.83%, 99.46%, 99.16%, and 100%, respectively. The proposed technique is quite general, and can be applied to other biometric templates. We anticipate that this algorithm will find applications in cryptographically secure biometric authentication schemes.
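The dimensionality-reduction front-end described above can be illustrated with a toy PCA via SVD; the random-data stand-in and the helper name `pca_reduce` are assumptions, and the paper pairs this stage with wavelet transforms, error correction, and random projection that are not shown here.

```python
import numpy as np

def pca_reduce(X, d):
    """Toy PCA: center the data and project onto the top-d
    principal directions obtained from the SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T, Vt[:d]

rng = np.random.default_rng(0)
faces = rng.standard_normal((50, 64))   # stand-in for vectorised face images
reduced, components = pca_reduce(faces, d=10)
```

The retained components form an orthonormal basis, which is what makes the subsequent bit extraction stable across within-class variation.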
intelligent data analysis | 2001
Syed Sibte Raza Abidi; Kok Meng Hoe; Alwyn Goh
We present a strategy, together with its computational implementation, to intelligently analyze the internal structure of inductively-derived data clusters in terms of symbolic cluster-defining rules. We present a symbolic rule extraction workbench that leverages rough sets theory to inductively extract CNF-form symbolic rules from unannotated continuous-valued data-vectors. Our workbench implements a hybrid rule-extraction methodology, incorporating a sequence of methods to achieve data clustering, data discretization and, eventually, symbolic rule discovery via rough sets approximation. The featured symbolic rule extraction workbench will be tested and analyzed using biomedical datasets.
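The rough-set approximation underlying the rule-discovery step can be sketched as follows: objects with identical discretised attribute tuples are indiscernible, the lower approximation collects indiscernibility classes wholly inside a target concept (yielding certain rules), and the upper approximation collects classes that merely overlap it (possible rules). The tiny attribute table is a made-up example, not from the paper.

```python
from collections import defaultdict

def approximations(objects, attrs, target):
    """Rough-set lower/upper approximation of a target set.
    `attrs` maps each object to its (discretised) attribute tuple."""
    classes = defaultdict(set)
    for obj in objects:
        classes[attrs[obj]].add(obj)   # indiscernibility classes
    lower, upper = set(), set()
    for eq in classes.values():
        if eq <= target:
            lower |= eq                # certainly in the concept
        if eq & target:
            upper |= eq                # possibly in the concept
    return lower, upper

attrs = {1: ('a',), 2: ('a',), 3: ('b',), 4: ('b',), 5: ('c',)}
lower, upper = approximations({1, 2, 3, 4, 5}, attrs, target={1, 2, 3})
# lower = {1, 2}; upper = {1, 2, 3, 4}
```

Rules extracted from the lower approximation are exact; the boundary region (upper minus lower) marks where the attributes cannot discriminate the concept.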
Lecture Notes in Computer Science | 2004
David Chek Ling Ngo; Andrew Beng Jin Teoh; Alwyn Goh
We present a novel approach to generating cryptographic keys from biometrics. In our approach, the PCA coefficients of a face image are discretised using a bit-extraction method to n bits. We compare performance results obtained with and without the discretisation procedure applied to several PCA-based methods (including PCA, PCA with weighting coefficients, PCA on Wavelet Subband, and LDA) on a combined face image database. Results show that the discretisation step consistently increases the performance.
Lecture Notes in Computer Science | 2004
Andrew Beng Jin Teoh; David Chek Ling Ngo; Alwyn Goh
This paper proposes a novel integrated dual-factor authenticator based on iterated inner products between a tokenised pseudorandom number and a user-specific facial feature generated by a well-known subspace feature-extraction technique, Fisher Discriminant Analysis, producing a set of user-specific compact codes coined BioCodes. The BioCode is highly tolerant of data-capture offsets, with same-user facial data resulting in highly correlated bitstrings. Moreover, there is no deterministic way to obtain the user-specific code without both the tokenised random data and the user's facial feature. This protects against, for instance, biometric fabrication: changing the user-specific credential is as simple as changing the token containing the random data. This approach has significant functional advantages over biometrics alone, i.e., a zero-EER point and clean separation of the genuine and imposter populations, thereby allowing elimination of FARs without an increased occurrence of FRRs.
pacific rim international conference on artificial intelligence | 1998
Syed Sibte Raza Abidi; Alwyn Goh
Predictive modelling, in a knowledge discovery context, is regarded as the problem of deriving predictive knowledge from historical/temporal data. Here we argue that neural networks, an established computational technology, can efficaciously be used to perform predictive modelling, i.e. to explore the intrinsic dynamics of temporal data. Infectious-disease epidemic risk management is a candidate area for exploiting the potential of neural network based predictive modelling: the idea is to model time series derived from bacteria-antibiotic sensitivity and resistivity patterns, as it is believed that bacterial sensitivity and resistivity to any antibiotic tends to undergo temporal fluctuations. The objective of epidemic risk management is to obtain forecasted values for the bacteria-antibiotic sensitivity and resistivity profiles, which could then be used to guide physicians with regards to the choice of the most effective antibiotic to treat a particular bacterial infection. In this regard, we present a Web-based Infectious Disease Cycle Forecaster (IDCF), comprising a number of distinct neural networks that have been trained on data obtained from long-term clinical observation of 89 types of bacterial infections, being treated using 36 different antibiotics. Preliminary results indicate that IDCF is capable of generating highly accurate forecasts given sufficient past data on bacteria-antibiotic interaction. IDCF features a client-server based WWW interface that allows for remote projections to be requested and displayed over the Internet.
international conference on information security | 2002
Wai Han Soo; Azman Samsudin; Alwyn Goh
The presumption of player distrust and untrustworthiness in mental card gaming results in the formulation of complex and compute-intensive protocols, particularly for shuffling. We present a robust, verifiable and efficient card shuffling protocol based on an optimisation of the Chang-Melham arbitrary-sized (AS) Benes permutation network (PN), which can flexibly accommodate variable pack sizes, achieving optimal shuffling performance. We also outline the use of these PNs in a distributed multi-player construction, which combines the best attributes of the Abe and Jakobsson-Juels mix-net formalisms. Card shuffling can therefore be executed on a structurally simple mix-net, with only t + 1 PNs required for operational robustness against collusion by t cheating players, and efficient zero-knowledge proofs (ZKP) to verify correct shuffling by each player. Shuffling efficiency is also enhanced by our limited application of verifiable secret sharing (VSS) on the ElGamal keys. The resultant protocol achieves an asymptotic complexity of O(tN lg N) for N inputs, which is comparable or superior to previous schemes.
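The robustness idea behind the t + 1 networks can be sketched without the cryptographic machinery: each player applies its own secret permutation in sequence, and the composed shuffle remains uniformly random as long as at least one player's permutation stays secret and uniform. This sketch substitutes a seeded Fisher-Yates shuffle for the Benes permutation network and omits the encryption and zero-knowledge proofs entirely; the function names are assumptions.

```python
import random

def player_permutation(n, seed):
    """One player's secret shuffle (a seeded Fisher-Yates here;
    the protocol realises this with a Benes permutation network)."""
    perm = list(range(n))
    random.Random(seed).shuffle(perm)
    return perm

def apply_perm(cards, perm):
    """Reorder the (in the protocol, encrypted) cards by perm."""
    return [cards[i] for i in perm]

# t + 1 = 3 players shuffle in sequence; colluding players who know
# two of the permutations still cannot predict the composed order.
deck = list(range(52))
for seed in (11, 22, 33):
    deck = apply_perm(deck, player_permutation(52, seed))
assert sorted(deck) == list(range(52))
```

In the actual protocol each stage is additionally verified by a zero-knowledge proof, so a cheating player cannot substitute a non-permutation.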
international conference on advanced learning technologies | 2001
Syed Sibte Raza Abidi; Alwyn Goh
We present a technology-enriched, Web-enabled, value-added distance exam preparation and evaluation service that provides: (a) offline execution of fully featured preparatory exercises and evaluation tests in a real-life simulated examination environment; (b) content personalization to address scholastic weakness; and (c) the use of data mining techniques to ensure content effectiveness and the pro-active identification of the academic needs of various student segments. The solution is designed as a client-server architecture featuring Java technology and XML-mediated information exchange over the Internet.