Publication


Featured research published by B. V. K. Vijaya Kumar.


Applied Optics | 1987

Minimum average correlation energy filters

Abhijit Mahalanobis; B. V. K. Vijaya Kumar; David Casasent

The synthesis of a new category of spatial filters that produces sharp output correlation peaks with controlled peak values is considered. The sharp nature of the correlation peak is the major feature emphasized, since it facilitates target detection. Since these filters minimize the average correlation plane energy as the first step in filter synthesis, we refer to them as minimum average correlation energy filters. Experimental laboratory results from optical implementation of the filters are also presented and discussed.
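For illustration, the minimum average correlation energy (MACE) synthesis described above admits a closed-form frequency-domain solution, often written as h = D^{-1} X (X^+ D^{-1} X)^{-1} c, where the columns of X are the Fourier transforms of the training images, D is their diagonal average power spectrum, and c holds the prescribed correlation peak values. Below is a minimal numpy sketch of that formula; the function and variable names and the unit peak constraints are illustrative assumptions, not taken from the paper.

import numpy as np

def mace_filter(train_images, peaks=None):
    # Hedged sketch of a MACE-style filter: minimize average correlation plane
    # energy subject to fixed correlation peak values for each training image.
    X = np.stack([np.fft.fft2(img).ravel() for img in train_images], axis=1)  # (d, n)
    n = X.shape[1]
    if peaks is None:
        peaks = np.ones(n, dtype=complex)         # unit peak for every training image
    D = np.mean(np.abs(X) ** 2, axis=1)           # diagonal of the average power spectrum
    Dinv_X = X / D[:, None]                       # D^{-1} X without forming D explicitly
    A = X.conj().T @ Dinv_X                       # X^+ D^{-1} X, a small (n, n) matrix
    h = Dinv_X @ np.linalg.solve(A, peaks)        # h = D^{-1} X (X^+ D^{-1} X)^{-1} c
    return h.reshape(train_images[0].shape)       # frequency-domain filter array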


Applied Optics | 1990

Performance measures for correlation filters

B. V. K. Vijaya Kumar; Laurence G. Hassebrook

Several performance criteria are described to enable a fair comparison among the various correlation filter designs: signal-to-noise ratio, peak sharpness, peak location, light efficiency, discriminability, and distortion invariance. The trade-offs resulting between some of these criteria are illustrated with the help of a new family of filters called fractional power filters (FPFs). The classical matched filter, phase-only filter (POF), and inverse filter are special cases of FPFs. Using examples, we show that the POF appears to provide a good compromise between noise tolerance and peak sharpness.
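To make the FPF family concrete, here is a minimal numpy sketch of one common parameterization, H(u) = |X(u)|^p exp(-j phase(X(u))), in which p = 1 recovers the classical matched filter, p = 0 the phase-only filter, and p = -1 the inverse filter. The exponent convention and function names are assumptions for illustration and may differ from the paper's exact notation.

import numpy as np

def fractional_power_filter(reference, p):
    # Hedged sketch of a fractional power filter built from a reference image:
    # p = 1 matched filter, p = 0 phase-only filter, p = -1 inverse filter.
    X = np.fft.fft2(reference)
    magnitude = np.abs(X)
    magnitude[magnitude == 0] = 1e-12             # guard against division by zero for p < 0
    return magnitude ** p * np.exp(-1j * np.angle(X))

def correlation_plane(scene, H):
    # Apply the frequency-domain filter to a scene and return the correlation magnitude.
    return np.abs(np.fft.ifft2(np.fft.fft2(scene) * H))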


Applied Optics | 1992

Tutorial survey of composite filter designs for optical correlators

B. V. K. Vijaya Kumar

A tutorial survey is presented of the many composite filter designs proposed for distortion-invariant optical pattern recognition. Remarks are made throughout regarding areas for further investigation.


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1986

Minimum-variance synthetic discriminant functions

B. V. K. Vijaya Kumar

The conventional synthetic discriminant functions (SDFs) determine a filter matched to a linear combination of the available training images such that the resulting cross-correlation output is constant for all training images. We remove the constraint that the filter must be matched to a linear combination of training images and consider a general solution. This general solution is, however, still a linear combination of modified training images. We investigate the effects of noise in input training images and prove that the conventional SDFs provide minimum output variance when the input noise is white. We provide the design equations for minimum-variance synthetic discriminant functions (MVSDFs) when the input noise is colored. General expressions are also provided to characterize the loss of optimality when conventional SDFs are used instead of optimal MVSDFs.
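For illustration, the design equation for the colored-noise case described above is commonly written as h = C^{-1} X (X^+ C^{-1} X)^{-1} c, where C is the input-noise covariance; setting C to a multiple of the identity (white noise) recovers the conventional SDF. The sketch below is a minimal numpy rendering of that formula; the names and the unit output constraints are illustrative assumptions.

import numpy as np

def mvsdf(train_images, noise_cov, peaks=None):
    # Hedged sketch of a minimum-variance SDF: h = C^{-1} X (X^T C^{-1} X)^{-1} c,
    # where the columns of X are the vectorized training images.
    X = np.stack([img.ravel() for img in train_images], axis=1)   # (d, n)
    n = X.shape[1]
    if peaks is None:
        peaks = np.ones(n)                     # equal correlation output for every training image
    Cinv_X = np.linalg.solve(noise_cov, X)     # C^{-1} X
    A = X.T @ Cinv_X                           # X^T C^{-1} X, a small (n, n) matrix
    return Cinv_X @ np.linalg.solve(A, peaks)  # with C = sigma^2 I this reduces to the conventional SDF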


Applied Optics | 1994

Unconstrained correlation filters

Abhijit Mahalanobis; B. V. K. Vijaya Kumar; Sewoong Song; S. R. F. Sims; Jim F. Epperson

A mathematical analysis of the distortion tolerance in correlation filters is presented. A good measure for distortion performance is shown to be a generalization of the minimum average correlation energy criterion. To optimize the filter's performance, we remove the usual hard constraints on the outputs in the synthetic discriminant function formulation. The resulting filters exhibit superior distortion tolerance while retaining the attractive features of their predecessors, such as the minimum average correlation energy filter and the minimum-variance synthetic discriminant function filter. The proposed theory also unifies several existing approaches and examines the relationship between different formulations. The proposed filter design algorithm requires only simple statistical parameters and the inversion of diagonal matrices, which makes it attractive from a computational standpoint. Several properties of these filters are discussed with illustrative examples.
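One widely cited member of this unconstrained family (often referred to as the MACH filter) maximizes the average correlation peak relative to the average correlation energy, which leads to a solution requiring only element-wise division in the frequency domain. The sketch below shows that simplified form; the omission of noise and image-similarity terms, and the small regularization constant, are simplifying assumptions rather than the paper's full formulation.

import numpy as np

def unconstrained_filter(train_images):
    # Hedged sketch in the spirit of the MACH filter: h = D^{-1} m, where m is the
    # mean training spectrum and D the diagonal average power spectrum, so only a
    # diagonal matrix is (implicitly) inverted.
    spectra = np.stack([np.fft.fft2(img) for img in train_images])   # (n, H, W)
    m = spectra.mean(axis=0)                                         # average spectrum
    D = (np.abs(spectra) ** 2).mean(axis=0) + 1e-12                  # average power spectrum
    return m / D                                                     # element-wise division plays the role of D^{-1} m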


IEEE Transactions on Pattern Analysis and Machine Intelligence | 1982

Efficient Calculation of Primary Images from a Set of Images

Hiroyasu Murakami; B. V. K. Vijaya Kumar

A set of images is modeled as a stochastic process, and the Karhunen-Loeve expansion is applied to extract the feature images. Although the size of the correlation matrix for such a stochastic process is very large, we show how to calculate the eigenvectors when the rank of the correlation matrix is not large. We also propose an iterative algorithm to calculate the eigenvectors that reduces computation time and computer storage requirements. This iterative algorithm gains its efficiency from the fact that only a significant set of eigenvectors is retained at any stage of the iteration. Simulation results are also presented to verify these methods.
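The computational trick described above, recovering eigenvectors of a huge d x d correlation matrix from the much smaller N x N Gram matrix of N images, can be sketched as follows (non-iterative version only). The function name and the normalization step are illustrative assumptions.

import numpy as np

def primary_images(images, k):
    # Hedged sketch: eigenvectors of the d x d matrix X X^T are obtained by
    # diagonalizing the small N x N matrix X^T X and lifting back, valid when
    # the number of images N is much smaller than the number of pixels d.
    X = np.stack([img.ravel() for img in images], axis=1)   # (d, N), d >> N
    G = X.T @ X                                             # small N x N Gram matrix
    eigvals, V = np.linalg.eigh(G)                          # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:k]                   # keep the k largest
    U = X @ V[:, order]                                     # lift back to image space
    U /= np.linalg.norm(U, axis=0) + 1e-12                  # normalize each primary image
    return U                                                # column i, reshaped, is primary image i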


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2007

A Bayesian Approach to Deformed Pattern Matching of Iris Images

Jason Thornton; Marios Savvides; B. V. K. Vijaya Kumar

We describe a general probabilistic framework for matching patterns that experience in-plane nonlinear deformations, such as iris patterns. Given a pair of images, we derive a maximum a posteriori probability (MAP) estimate of the parameters of the relative deformation between them. Our estimation process accomplishes two things simultaneously: it normalizes for pattern warping and it returns a distortion-tolerant similarity metric which can be used for matching two nonlinearly deformed image patterns. The prior probability of the deformation parameters is specific to the pattern type and, therefore, should result in more accurate matching than an arbitrary general distribution. We show that the proposed method is very well suited for handling iris biometrics, applying it to two databases of iris images which contain real instances of warped patterns. We demonstrate a significant improvement in matching accuracy using the proposed deformed Bayesian matching methodology. We also show that the additional computation required to estimate the deformation is relatively inexpensive, making it suitable for real-time applications.
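As a rough caricature of the MAP step described above, the sketch below searches a discretized set of candidate deformations, scoring each by a Gaussian log-prior plus a normalized-correlation log-likelihood surrogate. Restricting the deformation to a 2-D translation, the Gaussian prior, and the correlation-based likelihood are all toy assumptions made for illustration; they are not the paper's iris deformation model.

import numpy as np
from scipy.ndimage import shift

def map_deformation(probe, gallery, candidate_shifts, prior_sigma=2.0):
    # Hedged toy MAP search: warp the probe by each candidate shift, score the
    # match, and return the highest-scoring deformation and its score.
    g = (gallery - gallery.mean()) / (gallery.std() + 1e-12)
    best, best_score = None, -np.inf
    for dy, dx in candidate_shifts:
        warped = shift(probe, (dy, dx), order=1, mode='nearest')
        w = (warped - warped.mean()) / (warped.std() + 1e-12)
        log_likelihood = np.mean(w * g)                            # normalized correlation as a surrogate
        log_prior = -(dy ** 2 + dx ** 2) / (2 * prior_sigma ** 2)  # Gaussian prior on the deformation
        score = log_likelihood + log_prior                         # log-posterior up to a constant
        if score > best_score:
            best, best_score = (dy, dx), score
    return best, best_score   # MAP deformation estimate and distortion-tolerant similarity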


IEEE Transactions on Biomedical Engineering | 2012

Heartbeat Classification Using Morphological and Dynamic Features of ECG Signals

Can Ye; B. V. K. Vijaya Kumar; Miguel Tavares Coimbra

In this paper, we propose a new approach for heartbeat classification based on a combination of morphological and dynamic features. Wavelet transform and independent component analysis (ICA) are applied separately to each heartbeat to extract morphological features. In addition, RR interval information is computed to provide dynamic features. These two different types of features are concatenated and a support vector machine classifier is utilized for the classification of heartbeats into one of 16 classes. The procedure is independently applied to the data from two ECG leads and the two decisions are fused for the final classification decision. The proposed method is validated on the baseline MIT-BIH arrhythmia database and it yields an overall accuracy (i.e., the percentage of heartbeats correctly classified) of 99.3% (99.7% with 2.4% rejection) in the “class-oriented” evaluation and an accuracy of 86.4% in the “subject-oriented” evaluation, comparable to the state-of-the-art results for automatic heartbeat classification.
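A minimal sketch of the single-lead feature pipeline described above follows: wavelet coefficients and ICA components supply the morphological features, precomputed RR-interval features supply the dynamic ones, and an SVM performs the classification. The wavelet family, decomposition level, number of ICA components, and SVM settings are illustrative assumptions, and the two-lead decision fusion is not shown.

import numpy as np
import pywt
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

def heartbeat_features(beats, rr_features, n_ica=10):
    # beats: (n_beats, n_samples) segmented heartbeats; rr_features: (n_beats, n_rr).
    # Morphological features: wavelet decomposition plus ICA projections per beat.
    wavelet = np.stack([np.concatenate(pywt.wavedec(b, 'db4', level=4)) for b in beats])
    ica = FastICA(n_components=n_ica, random_state=0).fit_transform(beats)
    return np.hstack([wavelet, ica, rr_features])   # concatenated morphological + dynamic features

def train_classifier(features, labels):
    # RBF-kernel support vector machine over the combined feature vectors.
    return SVC(kernel='rbf', C=1.0, gamma='scale').fit(features, labels)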


Lecture Notes in Computer Science | 2003

Illumination normalization using logarithm transforms for face authentication

Marios Savvides; B. V. K. Vijaya Kumar

In this paper, we propose an algorithm that can easily be implemented on small-form-factor devices to perform illumination normalization in face images captured under various lighting conditions for face verification. We show that logarithm transformations of images suffering from significant illumination variation produce face images that are substantially better suited for face authentication. We present illumination-normalized images from the CMU PIE database to demonstrate the improvement achieved by this nonlinear preprocessing approach. We show that this scheme improves face verification performance when training on frontally illuminated face images and testing on images captured under variable illumination.
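The nonlinear preprocessing itself is essentially a one-line transform; a minimal numpy sketch is given below, where the rescaling back to an 8-bit range is an illustrative choice rather than the paper's exact normalization.

import numpy as np

def log_illumination_normalize(image):
    # Hedged sketch: compress illumination variation with a logarithm transform,
    # then rescale to [0, 255]. Expects a grayscale array of non-negative values.
    log_img = np.log1p(image.astype(np.float64))               # log(1 + I) avoids log(0)
    log_img -= log_img.min()
    return (255.0 * log_img / (log_img.max() + 1e-12)).astype(np.uint8)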


Pattern Recognition | 2003

Face authentication for multiple subjects using eigenflow

Xiaoming Liu; Tsuhan Chen; B. V. K. Vijaya Kumar

In this paper, we present a novel scheme for face authentication. To deal with variations, such as facial expressions and registration errors, with which traditional intensity-based methods do not perform well, we propose the eigenflow approach. In this approach, the optical flow and the optical flow residue between a test image and an image in the training set are first computed. The optical flow is then fitted to a model that is pre-trained by applying principal component analysis to optical flows resulting from facial expressions and registration errors for the subject. The eigenflow residue, optimally combined with the optical flow residue using linear discriminant analysis, determines the authenticity of the test image. An individual modeling method and a common modeling method are described. We also present a method to optimally choose the threshold for each subject for a multiple-subject authentication system. Experimental results show that the proposed scheme outperforms the traditional methods in the presence of facial expression variations and registration errors.
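The training stage described above, principal component analysis on optical-flow fields induced by expression and registration variation, can be sketched as follows. The choice of the Farneback flow estimator, the flattening of the flow field, and the component count are illustrative assumptions; the paper's flow computation and the LDA-based combination of residues are not reproduced here.

import numpy as np
import cv2
from sklearn.decomposition import PCA

def train_eigenflow(image_pairs, n_components=10):
    # Hedged sketch: estimate optical flow for each (reference, variation) pair of
    # grayscale images, flatten the flow fields, and fit a PCA "eigenflow" model.
    flows = []
    for ref, img in image_pairs:
        flow = cv2.calcOpticalFlowFarneback(ref, img, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        flows.append(flow.ravel())
    return PCA(n_components=n_components).fit(np.stack(flows))

def eigenflow_residue(pca_model, ref, img):
    # Reconstruction error of a new flow field under the eigenflow model; a small
    # residue suggests an expression- or registration-like deformation of the same subject.
    flow = cv2.calcOpticalFlowFarneback(ref, img, None, 0.5, 3, 15, 3, 5, 1.2, 0).ravel()[None, :]
    recon = pca_model.inverse_transform(pca_model.transform(flow))
    return float(np.linalg.norm(flow - recon))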

Collaboration


Dive into B. V. K. Vijaya Kumar's collaborations.

Top Co-Authors

Marios Savvides, Carnegie Mellon University
David Casasent, Carnegie Mellon University
Andres Rodriguez, Carnegie Mellon University
Zouhir Bahri, Carnegie Mellon University
Jason Thornton, Massachusetts Institute of Technology
Chunyan Xie, Carnegie Mellon University
Seungjune Jeon, Carnegie Mellon University
Yongjune Kim, Carnegie Mellon University
Arun Ross, Michigan State University