
Publication


Featured research published by Jane C. Asmuth.


Machine Vision and Applications | 1996

A machine-vision system for iris recognition

Richard P. Wildes; Jane C. Asmuth; Gilbert L. Green; Steven C. Hsu; Raymond J. Kolczynski; James R. Matey; Sterling E. McBride

This paper describes a prototype system for personnel verification based on automated iris recognition. The motivation for this endeavour stems from the observation that the human iris provides a particularly interesting structure on which to base a technology for noninvasive biometric measurement. In particular, it is known in the biomedical community that irises are as distinct as fingerprints or patterns of retinal blood vessels. Further, since the iris is an overt body, its appearance is amenable to remote examination with the aid of a machine-vision system. The body of this paper details the design and operation of such a system. Also presented are the results of an empirical study in which the system exhibits flawless performance in the evaluation of 520 iris images.
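Systems of this kind typically score a probe iris against an enrolled iris with normalized correlation over isolated image bands. A minimal sketch of that scoring idea, assuming the irises have already been localized and sampled into equal-size bands (the band decomposition, threshold value, and median combination here are illustrative, not the published system):

```python
import numpy as np

def normalized_correlation(a, b):
    """Zero-mean normalized cross-correlation between two equal-size
    image bands; returns a value in [-1, 1]."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def verify(probe_bands, gallery_bands, threshold=0.5):
    """Declare a match if the median band-wise correlation clears a
    threshold (threshold is an illustrative choice)."""
    scores = [normalized_correlation(p, g)
              for p, g in zip(probe_bands, gallery_bands)]
    return np.median(scores) >= threshold, scores
```

Combining per-band scores with a median makes the decision robust to a single corrupted band (e.g. an eyelid occlusion).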


Workshop on Applications of Computer Vision | 1994

A system for automated iris recognition

Richard P. Wildes; Jane C. Asmuth; Gilbert L. Green; Stephen Charles Hsu; Raymond J. Kolczynski; James R. Matey; Sterling E. McBride

This paper describes a prototype system for personnel verification based on automated iris recognition. The motivation for this endeavour stems from the observation that the human iris provides a particularly interesting structure on which to base a technology for noninvasive biometric measurement. In particular, it is known in the biomedical community that irises are as distinct as fingerprints or patterns of retinal blood vessels. Further, since the iris is an overt body, its appearance is amenable to remote examination with the aid of a computer vision system. The body of this paper details the design and operation of such a system. Also presented are the results of an empirical study where the system exhibits flawless performance in the evaluation of 520 iris images.


IEEE Transactions on Medical Imaging | 1999

Validation of an optical flow method for tag displacement estimation

Lawrence Dougherty; Jane C. Asmuth; Aaron S. Blom; Leon Axel; Rakesh Kumar

Presents a validation study of an optical-flow method for the rapid estimation of myocardial displacement in magnetic resonance tagged cardiac images. This registration and change visualization (RCV) software uses a hierarchical estimation technique to compute the flow field that describes the warping of an image of one cardiac phase into alignment with the next. This method overcomes the requirement of constant pixel intensity in standard optical-flow methods by preprocessing the input images to reduce any intensity bias which results from the reduction in stripe contrast throughout the cardiac cycle. To validate the method, SPAMM-tagged images were acquired of a silicone gel phantom with simulated rotational motion. The pixel displacement was estimated with the RCV method, and the error in pixel tracking was <4% 1000 ms after application of the tags and after 30° of rotation. An additional study was performed using a SPAMM-tagged multiphase slice of a canine left ventricle. The true displacement was determined using a previously validated active contour model (snakes). The error between methods was 6.7% at end systole. The RCV method has the advantage of tracking all pixels in the image in a substantially shorter period than the snakes method.
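The phantom validation above scores an estimated flow field against the known displacement of a rigid rotation. A small sketch of that ground-truth field and a relative tracking-error measure (the error metric here is an illustrative choice, not necessarily the paper's exact statistic):

```python
import numpy as np

def rotation_displacement(shape, angle_deg, center=None):
    """True in-plane displacement field for a rigid rotation about
    `center`, as used to score an estimated flow field against a
    known phantom motion."""
    h, w = shape
    if center is None:
        center = ((w - 1) / 2.0, (h - 1) / 2.0)
    y, x = np.mgrid[0:h, 0:w].astype(float)
    xc, yc = x - center[0], y - center[1]
    t = np.deg2rad(angle_deg)
    dx = xc * np.cos(t) - yc * np.sin(t) - xc
    dy = xc * np.sin(t) + yc * np.cos(t) - yc
    return dx, dy

def tracking_error(est_dx, est_dy, true_dx, true_dy):
    """Mean endpoint error as a fraction of mean true displacement."""
    err = np.hypot(est_dx - true_dx, est_dy - true_dy)
    mag = np.hypot(true_dx, true_dy)
    return err.mean() / max(mag.mean(), 1e-12)
```

A perfect flow estimate gives a relative error of zero; the study's <4% figure corresponds to a small mean endpoint error relative to the imposed rotation.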


International Conference on Pattern Recognition | 1998

Registration of video to geo-referenced imagery

Rakesh Kumar; Harpreet S. Sawhney; Jane C. Asmuth; Art Pope; Steven C. Hsu

The ability to locate scenes and objects visible in aerial video imagery with their corresponding locations and coordinates in a reference coordinate system is important in visually-guided navigation, surveillance and monitoring systems. However, a key technical problem of locating objects and scenes in a video with their geo-coordinates needs to be solved in order to ascertain the geo-location of objects seen from the camera platform's current location. We present the key algorithms for the problem of accurate mapping between camera coordinates and geo-coordinates, called geo-spatial registration. Current systems for geo-location use the position and attitude information of the moving platform in some fixed world coordinates to locate the video frames in the reference database. However, the accuracy achieved is only of the order of hundreds of pixels. Our approach utilizes the imagery and terrain information contained in the geo-spatial database to precisely align dynamic videos with the reference imagery and thus achieves a much higher accuracy. Applications of geo-spatial registration include aerial mapping, target location and tracking, and enhanced visualization such as the overlay of textual/graphical annotations of objects of interest in the current video using the stored annotations in the reference database.
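When the viewed scene is well approximated by a plane, the mapping between video frame coordinates and reference (geo) coordinates can be modeled as a 3x3 homography. A minimal sketch of fitting and applying such a mapping from point correspondences via the direct linear transform (the paper's alignment pipeline is considerably richer, exploiting reference imagery and terrain data):

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 image points through a 3x3 homography to reference
    (geo) coordinates."""
    pts = np.asarray(pts, float)
    ones = np.ones((len(pts), 1))
    hom = np.hstack([pts, ones]) @ H.T
    return hom[:, :2] / hom[:, 2:3]

def fit_homography(src, dst):
    """Direct linear transform from >= 4 point correspondences;
    returns H up to scale (the projective division cancels the scale)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)
```

In practice a robust estimator (e.g. RANSAC over many correspondences) would replace this minimal exact fit.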


Journal of Computer Assisted Tomography | 2003

Improved radiologic staging of lung cancer with 2-[18F]-fluoro-2-deoxy-D-glucose-positron emission tomography and computed tomography registration

Suzanne L. Aquino; Jane C. Asmuth; Nathaniel M. Alpert; Elkan F. Halpern; Alan J. Fischman

Purpose: To determine if volumetric nonlinear registration of thoracic computed tomography (CT) and 2-[18F]-fluoro-2-deoxy-D-glucose–positron emission tomography (FDG–PET) datasets changes the detection of mediastinal and hilar nodal disease in patients undergoing staging for lung cancer, and if it has any impact on radiologic lung cancer staging.

Method: Computer-based image registration was performed on 45 clinical thoracic helical CT and FDG–PET scans of patients with lung cancer who were staged by mediastinoscopy and/or thoracotomy. Thoracic CT, FDG–PET, and registration datasets were each interpreted by 2 readers for the presence of metastatic nodal disease and were staged independently of each other. Results were compared with surgical pathologic findings.

Results: One hundred and thirty lymph node stations in the mediastinum and hila were evaluated each on CT, PET, and registration datasets. Sensitivity, specificity, positive predictive value, and negative predictive value, respectively, for detecting metastatic nodal disease for CT were 74%, 78%, 55%, 88%; for PET with CT side by side, 59% to 76%, 77% to 89%, 48% to 68%, and 84% to 91%; and for CT–PET registration, 71% to 76%, 89% to 96%, 70% to 86%, and 90% to 91%. Registration images were significantly more sensitive in detecting nodal disease over PET for 1 reader (P = 0.0156) and were more specific than PET (P = 0.0107 and 0.0017) in identifying the absence of mediastinal disease for both readers. Registration was significantly more accurate for staging when compared with PET for both readers (P = 0.002 and 0.035).

Conclusion: Registration of CT and FDG–PET datasets significantly improved the specificity of detecting metastatic disease. In addition, registration improved the radiologic staging of lung cancer patients when compared with CT or FDG–PET alone.
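The four figures quoted for each reading method are the standard measures derived from a 2x2 contingency table of reader calls against surgical pathology. A small helper, with illustrative counts (not the study's data):

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table of
    true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),   # fraction of diseased nodes called positive
        "specificity": tn / (tn + fp),   # fraction of benign nodes called negative
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }
```

PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the evaluated node stations, which is why the study reports all four.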


Academic Radiology | 2003

Alignment of CT lung volumes with an optical flow method

Lawrence Dougherty; Jane C. Asmuth; Warren B. Gefter

Rationale and Objectives: This study was performed to evaluate an optical flow method for registering serial computed tomographic (CT) images of lung volumes to assist physicians in visualizing and assessing changes between CT scans.

Materials and Methods: The optical flow method is a coarse-to-fine model-based motion estimation technique for estimating first a global parametric transformation and then local deformations. Five serial pairs of CT images of lung volumes that were misaligned because of patient positioning, respiration, and/or different fields of view were used to test the method.

Results: Lung volumes depicted on the serial paired images initially were correlated at only 28%-68% because of misalignment. With use of the optical flow method, the serial images were aligned to at least 95% correlation.

Conclusion: The optical flow method enables a direct comparison of serial CT images of lung volumes for the assessment of nodules or functional changes in the lung.
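The abstract quotes alignment quality as a percentage correlation of lung volumes. As a hedged stand-in, assuming binary lung masks, a Jaccard-style overlap fraction captures the same before/after comparison (this is an assumed metric, not necessarily the one used in the study):

```python
import numpy as np

def overlap_fraction(mask_a, mask_b):
    """Fraction of voxels shared by two binary lung masks:
    |A ∩ B| / |A ∪ B| (Jaccard index)."""
    a = np.asarray(mask_a, bool)
    b = np.asarray(mask_b, bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0
```

A value near 1.0 after registration would correspond to the ≥95% alignment reported above.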


International Conference on Computer Vision | 1999

Independent motion detection in 3D scenes

Harpreet S. Sawhney; Yanlin Guo; Jane C. Asmuth; Rakesh Kumar

Presents an algorithmic approach to the problem of detecting independently moving objects in 3D scenes that are viewed under camera motion. There are two fundamental constraints that can be exploited for the problem: (i) a two- (or multi-)view camera motion constraint (for instance, the epipolar/trilinear constraint), and (ii) a shape constancy constraint. Previous approaches to the problem either used only partial constraints or relied on dense correspondences or flow. We employ both of these fundamental constraints in an algorithm that does not demand a priori availability of correspondences or flow. Our approach uses the plane-plus-parallax decomposition to enforce the two constraints. It is also demonstrated how, for a class of scenes called sparse 3D scenes in which genuine parallax and independent motions may be confounded, the plane-plus-parallax decomposition allows progressive introduction and verification of the fundamental constraints. The results of applying the algorithm to some difficult sparse 3D scenes look promising.


Proceedings IEEE Workshop on Multi-View Modeling and Analysis of Visual Scenes (MVIEW'99) | 1999

Multi-view 3D estimation and applications to match move

Harpreet S. Sawhney; Y. Guo; Jane C. Asmuth; Rakesh Kumar

Multiple views of a rigid 3D scene captured from a moving camera can be used to estimate the 3D poses (rotations and translations) and the 3D structure of the unmodeled scene. In this problem domain, there are two key problems addressed in this paper: automatic Euclidean pose estimation in extended sequences even when views of the scene rapidly change, that is features come in and go out of view relatively rapidly; and insertion of synthetic 3D objects in the real scene and their authentic projection into the given real views (also called match move). The paper presents a number of examples of different camera motions to highlight the versatility of 3D estimation. Examples of match move are also presented.
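Match move ultimately reduces to projecting synthetic 3D geometry into each real view with the recovered pose. A minimal pinhole-projection sketch, assuming known intrinsics K and a camera pose (R, t) of the kind the paper estimates:

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of Nx3 world points X into a view with
    intrinsics K and pose (R, t): X is rotated/translated into the
    camera frame, then mapped through K and divided by depth."""
    Xc = X @ R.T + t          # world -> camera coordinates
    uvw = Xc @ K.T            # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]  # perspective division
```

Rendering a synthetic object into each frame with the per-frame (R, t) recovered from the sequence is what keeps the insertion geometrically consistent as the camera moves.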


Archive | 2002

Automatic registration of thoracic FDG-PET and CT for diagnosis and staging of lung cancer

Jane C. Asmuth; Richard H. Moore; Luca Bogoni; Suzanne L. Aquino

We developed an application to support physicians in the diagnosis and staging of lung cancer. FDG-PET scans are very sensitive to the existence of cancer. However, due to the relatively low anatomic resolution of PET, clinical scans are usually interpreted accompanied by a separate thoracic CT. Our application registers the PET scan to a preexisting diagnostic quality helical CT scan of the same patient, making the interpretation of the PET easier and faster.


Medical Image Computing and Computer-Assisted Intervention | 2001

Ophthalmic Slitlamp-Based Computer-Aided Diagnosis: Image Processing Foundations

Luca Bogoni; Jane C. Asmuth; David J. Hirvonen; Bojidar Madjarov; Jeffrey W. Berger

Computer aided diagnosis and treatment of retinal disorders is enabled through an ophthalmic augmented reality environment being developed around the standard slitlamp biomicroscope. This system will allow the physician to view superimposed fundus photographic and angiographic data on the real-time slitlamp biomicroscopic image. The overlay capability requires real-time image acquisition, processing, mosaicking and comparison. Non-real time capabilities include the co-registration of similar or differing sources such as slitlamp biomicroscope, fundus photograph, or angiograms. Mosaicking enables the creation of montages from a collection of images and provides a context for robust registration and comparison. This paper outlines the image-processing architecture and controls to provide these functionalities.
