Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Andy Adler is active.

Publication


Featured research published by Andy Adler.


Physiological Measurement | 2009

GREIT: A unified approach to 2D linear EIT reconstruction of lung images

Andy Adler; John H. Arnold; Richard Bayford; Andrea Borsic; B H Brown; Paul Dixon; Theo J.C. Faes; Inéz Frerichs; Hervé Gagnon; Yvo Gärber; Bartłomiej Grychtol; G. Hahn; William R. B. Lionheart; Anjum Malik; Robert Patterson; Janet Stocks; Andrew Tizzard; Norbert Weiler; Gerhard K. Wolf

Electrical impedance tomography (EIT) is an attractive method for clinically monitoring patients during mechanical ventilation, because it can provide a non-invasive continuous image of pulmonary impedance which indicates the distribution of ventilation. However, most clinical and physiological research in lung EIT is done using older and proprietary algorithms; this is an obstacle to interpretation of EIT images because the reconstructed images are not well characterized. To address this issue, we develop a consensus linear reconstruction algorithm for lung EIT, called GREIT (Graz consensus Reconstruction algorithm for EIT). This paper describes the unified approach to linear image reconstruction developed for GREIT. The framework for the linear reconstruction algorithm consists of (1) detailed finite element models of a representative adult and neonatal thorax, (2) consensus on the performance figures of merit for EIT image reconstruction and (3) a systematic approach to optimize a linear reconstruction matrix to desired performance measures. Consensus figures of merit, in order of importance, are (a) uniform amplitude response, (b) small and uniform position error, (c) small ringing artefacts, (d) uniform resolution, (e) limited shape deformation and (f) high resolution. Such figures of merit must be attained while maintaining small noise amplification and small sensitivity to electrode and boundary movement. This approach represents the consensus of a large and representative group of experts in EIT algorithm design and clinical applications for pulmonary monitoring. All software and data to implement and test the algorithm have been made available under an open source license which allows free research and commercial use.
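
As a rough illustration of how a linear one-step reconstruction of this kind is applied in practice, the sketch below (Python/NumPy) multiplies a precomputed reconstruction matrix by a frame of difference data. The matrix R and the measurement vectors are assumed to come from elsewhere (for example, the openly released GREIT software mentioned above); all names are illustrative, not the authors' code.

```python
import numpy as np

def greit_difference_image(R, v, v_ref):
    """Apply a precomputed linear reconstruction matrix to one EIT frame.

    R     : (n_pixels, n_meas) reconstruction matrix, trained offline against
            figures of merit such as those listed in the abstract
    v     : (n_meas,) measurement frame
    v_ref : (n_meas,) baseline (reference) frame
    Returns the reconstructed conductivity-change image as a flat vector.
    """
    dv = np.asarray(v) - np.asarray(v_ref)   # time-difference data
    return R @ dv                            # one matrix-vector product per frame
```

Because reconstruction reduces to a single matrix-vector product per frame, this style of algorithm is well suited to continuous bedside monitoring.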


Journal of Clinical Investigation | 1999

The late, but not early, asthmatic response is dependent on IL-5 and correlates with eosinophil infiltration

Grzegorz Cieslewicz; Adrian Tomkinson; Andy Adler; Catherine Duez; Jürgen Schwarze; Katsuyuki Takeda; Kirsten A. Larson; James J. Lee; Charles G. Irvin; Erwin W. Gelfand

Early-phase reactions (EPRs) and late-phase reactions (LPRs) are characteristic features of bronchial asthma, although the pathogenetic mechanisms responsible for each of the responses are not fully defined. A murine model of EPRs and LPRs was developed to investigate the role of IL-5 and eosinophils in development of both responses. After initial intraperitoneal sensitization and airway challenge to ovalbumin (OVA), mice were provoked by additional exposure to OVA. An EPR, characterized by a transient increase in airway responsiveness, was observed 5-30 minutes after antigen provocation. This response was followed by an LPR that reached its maximum at 6 hours after challenge and was characterized by increased airway responsiveness and significant lung eosinophilia. The EPR was blocked by cromoglycate and albuterol, whereas the LPR was abolished by cromoglycate and hydrocortisone. Before provocation with allergen, administration of anti-IL-5 antibody prevented the influx of eosinophils into the lung tissue and abolished the LPR but not EPR. These results suggest that IL-5 and eosinophils are essential for development of the LPR, but not EPR, in this model.


IEEE Transactions on Medical Imaging | 1996

Electrical impedance tomography: regularized imaging and contrast detection

Andy Adler; Robert Guardo

Dynamic electrical impedance tomography (EIT) images changes in the conductivity distribution of a medium from low frequency electrical measurements made at electrodes on the medium surface. Reconstruction of the conductivity distribution is an under-determined and ill-posed problem, typically requiring either simplifying assumptions or regularization based on a priori knowledge. This paper presents a maximum a posteriori (MAP) approach to linearized image reconstruction using knowledge of the noise variance of the measurements and the covariance of the conductivity distribution. This approach has the advantage of an intuitive interpretation of the algorithm parameters as well as fast (near real time) image reconstruction. In order to compare this approach to existing algorithms, the authors develop figures of merit to measure the reconstructed image resolution, the noise amplification of the image reconstruction, and the fidelity of positioning in the image. Finally, the authors develop a communications systems approach to calculate the probability of detection of a conductivity contrast in the reconstructed image as a function of the measurement noise and the reconstruction algorithm used.
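
A minimal sketch of the linearized MAP estimate described above, assuming a Gaussian model with white measurement noise of known variance and a given prior covariance for the conductivity change. The Jacobian J and the data vector are placeholders; this is a generic formulation, not necessarily the authors' exact one.

```python
import numpy as np

def map_one_step(J, dv, noise_var, prior_cov):
    """One-step linearized MAP reconstruction (sketch).

    J         : (n_meas, n_elem) Jacobian of the forward model at a reference
    dv        : (n_meas,) measured voltage changes
    noise_var : scalar variance of the (assumed white) measurement noise
    prior_cov : (n_elem, n_elem) prior covariance of the conductivity change
    Returns the MAP estimate of the conductivity change.
    """
    n_meas = J.shape[0]
    Sn_inv = np.eye(n_meas) / noise_var              # inverse noise covariance
    A = J.T @ Sn_inv @ J + np.linalg.inv(prior_cov)  # posterior precision
    return np.linalg.solve(A, J.T @ Sn_inv @ dv)
```

Since A depends only on the model and the noise/prior statistics, it can be factorized once and reused for every measurement frame, which is what makes near real-time imaging feasible.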


Lecture Notes in Computer Science | 2005

Vulnerabilities in biometric encryption systems

Andy Adler

The goal of a biometric encryption system is to embed a secret into a biometric template in a way that can only be decrypted with a biometric image from the enrolled person. This paper describes a potential vulnerability in such systems that allows a less-than-brute-force regeneration of the secret and an estimate of the enrolled image. This vulnerability requires the biometric comparison to “leak” some information from which an analogue for a match score may be calculated. Using this match score value, a “hill-climbing” attack is performed against the algorithm to calculate an estimate of the enrolled image, which is then used to decrypt the code. Results are shown against a simplified implementation of the algorithm of Soutar et al. (1998).
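
The attack hinges on a comparison that leaks something usable as a match score. The sketch below shows the generic hill-climbing loop in that setting; the score_fn callable is a hypothetical stand-in for the leaked score, and the random perturbation scheme is illustrative rather than the paper's.

```python
import numpy as np

def hill_climb(score_fn, image_shape, n_iters=10000, step=8.0, seed=0):
    """Generic hill-climbing sketch against a score-leaking comparator.

    score_fn : callable(image) -> float; higher means closer to the enrolled
               biometric (hypothetical analogue of a leaked match score)
    """
    rng = np.random.default_rng(seed)
    estimate = rng.uniform(0.0, 255.0, image_shape)   # random starting image
    best = score_fn(estimate)
    for _ in range(n_iters):
        candidate = np.clip(estimate + rng.normal(0.0, step, image_shape),
                            0.0, 255.0)
        score = score_fn(candidate)
        if score > best:                              # keep only improvements
            estimate, best = candidate, score
    return estimate, best
```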


Canadian Conference on Electrical and Computer Engineering | 2003

Sample images can be independently restored from face recognition templates

Andy Adler

Biometrics promise the ability to automatically identify individuals from characteristics that are reasonably easy to measure and hard to falsify. They are increasingly being investigated for use in large-scale identification applications in the context of increased national security awareness. This paper addresses some of the security and privacy implications of biometric storage. Biometric systems record a sample image and calculate a template: a compact digital representation of the essential features of the image. To compare the individuals represented by two images, the corresponding templates are compared and a match score calculated, indicating the confidence level that the images represent the same individual. Biometrics vendors have uniformly claimed that it is impossible or infeasible to recreate an image from a template, and therefore templates are currently treated as nonidentifiable data. We describe a simple algorithm which allows recreation of a sample image from a face recognition template using only match score values. At each iteration, a candidate image is slightly modified by an eigenface image, and modifications which improve the match score are kept. The regenerated image matches the original with a high score and visually shows most of the essential features. This image could thus be used to fool the algorithm as the target person, or to visually identify that individual. Importantly, this algorithm is immune to template encryption: any system which allows access to match scores effectively allows sample images to be regenerated in this way.
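
A compact sketch of the iteration described above: nudge a candidate image along eigenface directions and keep only changes that raise the match score. The match_score callable, eigenface array and step size are all hypothetical placeholders; the real system's score quantization and stopping criteria are omitted.

```python
import numpy as np

def regenerate_from_scores(match_score, eigenfaces, mean_face,
                           n_rounds=50, step=0.05):
    """Eigenface-based image regeneration from match scores only (sketch).

    match_score : callable(image) -> float; confidence that `image` and the
                  stored template represent the same person (assumed leaked)
    eigenfaces  : (k, h, w) array of eigenface basis images
    mean_face   : (h, w) average face used to initialise the candidate
    """
    candidate = np.array(mean_face, dtype=float)
    best = match_score(candidate)
    for _ in range(n_rounds):
        for eigenface in eigenfaces:
            for delta in (step, -step):
                trial = candidate + delta * eigenface  # nudge along one eigenface
                score = match_score(trial)
                if score > best:                       # keep improving changes
                    candidate, best = trial, score
    return candidate, best
```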


IEEE Transactions on Medical Imaging | 2010

In Vivo Impedance Imaging With Total Variation Regularization

Andrea Borsic; Brad M. Graham; Andy Adler; William R. B. Lionheart

We show that electrical impedance tomography (EIT) image reconstruction algorithms with regularization based on the total variation (TV) functional are suitable for in vivo imaging of physiological data. This reconstruction approach helps to preserve discontinuities in reconstructed profiles, such as step changes in electrical properties at interorgan boundaries, which are typically smoothed by traditional reconstruction algorithms. The use of the TV functional for regularization leads to the minimization of a nondifferentiable objective function in the inverse formulation. This cannot be efficiently solved with traditional optimization techniques such as the Newton method. We explore two implementation methods for regularization with the TV functional: the lagged diffusivity method and the primal-dual interior point method (PD-IPM). First we clarify the implementation details of these algorithms for EIT reconstruction. Next, we analyze the performance of these algorithms on noisy simulated data. Finally, we show reconstructed EIT images of in vivo data for ventilation and gastric emptying studies. In comparison to traditional quadratic regularization, TV regularization shows improved ability to reconstruct sharp contrasts.
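
As a rough illustration of one of the two implementations discussed, here is a minimal dense-matrix sketch of lagged-diffusivity iterations for a linearized TV-regularized problem. L is assumed to be a discrete difference (gradient) operator and beta a small smoothing constant; the paper's PD-IPM implementation is considerably more involved.

```python
import numpy as np

def lagged_diffusivity_tv(J, dv, L, alpha=1e-2, beta=1e-6, n_iters=20):
    """Lagged-diffusivity iterations for a TV-regularized linear problem (sketch).

    Approximately minimises ||J x - dv||^2 + alpha * sum(sqrt((L x)^2 + beta)),
    where L is a discrete difference operator and beta keeps the TV term
    differentiable near zero.
    """
    x = np.zeros(J.shape[1])
    for _ in range(n_iters):
        g = L @ x
        w = 1.0 / np.sqrt(g * g + beta)          # lagged diffusivity weights
        A = J.T @ J + alpha * (L.T @ (w[:, None] * L))
        x = np.linalg.solve(A, J.T @ dv)         # weighted quadratic subproblem
    return x
```

Each iteration freezes the TV weights at the previous estimate, so the nondifferentiable objective is replaced by a sequence of solvable quadratic problems.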


Progress In Electromagnetics Research (PIER) | 2009

Four-dimensional electrical capacitance tomography imaging using experimental data

Manuchehr Soleimani; Cathryn N. Mitchell; Robert Banasiak; R. Wajman; Andy Adler

Electrical capacitance tomography (ECT) is a relatively mature non-invasive imaging technique that attempts to map the dielectric permittivity of materials. ECT has become a promising monitoring technique in industrial process tomography, especially in fast flow visualization. One of the most challenging tasks in the further development of ECT for real applications is the computational aspect of ECT imaging. Recently, 3D ECT has gained interest because of its potential to generate volumetric images. The computation time of image reconstruction in 3D ECT makes it more difficult to use in real-time applications. In this paper we present a robust and computationally efficient 4D image reconstruction algorithm applied to real ECT data. The method takes advantage of the temporal correlation between 3D ECT frames to reconstruct movies of dielectric maps. Image reconstruction results are presented for the proposed algorithm using experimental ECT data of a rapidly moving object.
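
One simple way to exploit temporal correlation between frames, in the spirit of the abstract though not necessarily the authors' formulation, is to reconstruct all frames jointly with a penalty coupling neighbouring frames. The sketch below assembles and solves the resulting block system densely, which is only practical for small examples; all names are placeholders.

```python
import numpy as np

def reconstruct_4d(J, V, alpha=1e-2, gamma=1e-1):
    """Jointly reconstruct a sequence of frames with temporal smoothing (sketch).

    J     : (n_meas, n_elem) linearized sensitivity matrix
    V     : (n_frames, n_meas) measured data, one row per frame
    alpha : spatial (Tikhonov) regularization weight
    gamma : temporal weight coupling neighbouring frames
    Solves  min sum_t ||J x_t - v_t||^2 + alpha ||x_t||^2
                + gamma sum_t ||x_t - x_{t-1}||^2
    by assembling the block normal equations densely.
    """
    n_frames, _ = V.shape
    n_elem = J.shape[1]
    JtJ, I = J.T @ J, np.eye(n_elem)
    A = np.zeros((n_frames * n_elem, n_frames * n_elem))
    b = np.zeros(n_frames * n_elem)
    for t in range(n_frames):
        rows = slice(t * n_elem, (t + 1) * n_elem)
        A[rows, rows] += JtJ + alpha * I
        b[rows] = J.T @ V[t]
        if t > 0:                                # couple frame t to frame t-1
            prev = slice((t - 1) * n_elem, t * n_elem)
            A[rows, rows] += gamma * I
            A[prev, prev] += gamma * I
            A[rows, prev] -= gamma * I
            A[prev, rows] -= gamma * I
    x = np.linalg.solve(A, b)
    return x.reshape(n_frames, n_elem)
```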


Archive | 2008

Biometric System Security

Andy Adler

Security is “freedom from risk or danger”, while computer and data security is “the ability of a system to protect information and system resources with respect to confidentiality and integrity”. Defining biometric system security is difficult, because of the ways biometric systems differ from traditional computer and cryptographic security [40]. Implicit in all definitions is the concept of an attacker; however, biometrics should always be assumed to operate in an (at least somewhat) hostile environment – after all, why should one test identity if all can be trusted? The ability of a biometric system to stand up to “zero-effort” attackers is measured by the false accept rate (FAR). Attackers may then change makeup, facial hair and glasses, or abrade and cut fingerprints in order to avoid being recognized; attackers prepared to try harder may use spoofing. This chapter deals with attacks which are not spoofing, but those that target processing within the biometric system. We define biometric system security by its absence. Since biometrics is “automated recognition of individuals based on their behavioral and biological characteristics”, a vulnerability in biometric security results in incorrect recognition or failure to correctly recognize individuals. This definition includes methods to falsely accept an individual (template regeneration), impact overall system performance (denial of service), or attack another system via leaked data (identity theft). Vulnerabilities are measured against explicit or implicit design claims.


Physiological Measurement | 2006

Objective selection of hyperparameter for EIT

Brad M. Graham; Andy Adler

An algorithm for objectively calculating the hyperparameter for linearized one-step electrical impedance tomography (EIT) image reconstruction algorithms is proposed and compared to existing strategies. EIT is an ill-conditioned problem in which regularization is used to calculate a stable and accurate solution by incorporating some form of prior knowledge into the solution. A hyperparameter is used to control the trade-off between conformance to data and conformance to the prior. A remaining challenge is to develop and validate methods of objectively selecting the hyperparameter. In this paper, we evaluate and compare five different strategies for hyperparameter selection. We propose a calibration-based method of objective hyperparameter selection, called BestRes, that leads to repeatable and stable image reconstructions that are indistinguishable from heuristic selections. Results indicate: (1) heuristic selections of hyperparameter are inconsistent among experts, (2) generalized cross-validation approaches produce under-regularized solutions, (3) L-curve approaches are unreliable for EIT and (4) BestRes produces good solutions comparable to expert selections. Additionally, we show that it is possible to reliably detect an inverse crime based on analysis of these parameters.
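
The proposed selection strategy can be pictured as a sweep over candidate hyperparameters, scoring each reconstruction of a small calibration target by a resolution figure of merit and keeping the sharpest. The sketch below is only a schematic of that idea; every callable and name is a placeholder rather than the paper's implementation.

```python
import numpy as np

def pick_hyperparameter(reconstruct, calib_data, blur_radius, lambdas):
    """BestRes-style hyperparameter sweep (schematic sketch).

    reconstruct : callable(data, lam) -> image, a one-step linear reconstructor
    calib_data  : measurements for a small calibration target (simulated or
                  from a tank phantom)
    blur_radius : callable(image) -> float, resolution figure of merit
    lambdas     : iterable of candidate hyperparameter values
    Returns the hyperparameter whose reconstruction of the calibration target
    has the smallest blur radius.
    """
    lambdas = list(lambdas)
    radii = [blur_radius(reconstruct(calib_data, lam)) for lam in lambdas]
    return lambdas[int(np.argmin(radii))]
```

Because the figure of merit is computed on a known target, the selection is repeatable and does not depend on an expert's visual judgement.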


Thorax | 2017

Chest electrical impedance tomography examination, data analysis, terminology, clinical use and recommendations: consensus statement of the TRanslational EIT developmeNt stuDy group

Inéz Frerichs; Marcelo B. P. Amato; Anton H. van Kaam; David G. Tingay; Zhanqi Zhao; Bartłomiej Grychtol; Marc Bodenstein; Hervé Gagnon; Stephan H. Bohm; Eckhard Teschner; O. Stenqvist; Tommaso Mauri; Vinicius Torsani; Luigi Camporota; Andreas Schibler; Gerhard K. Wolf; Diederik Gommers; Steffen Leonhardt; Andy Adler; Eddy Fan; William R. B. Lionheart; Thomas Riedel; Peter C. Rimensberger; Fernando Suarez Sipmann; Norbert Weiler; Hermann Wrigge

Electrical impedance tomography (EIT) has undergone 30 years of development. Functional chest examinations with this technology are considered clinically relevant, especially for monitoring regional lung ventilation in mechanically ventilated patients and for regional pulmonary function testing in patients with chronic lung diseases. As EIT becomes an established medical technology, it requires consensus examination, nomenclature, data analysis and interpretation schemes. Such consensus is needed to compare, understand and reproduce study findings from and among different research groups, to enable large clinical trials and, ultimately, routine clinical use. Recommendations of how EIT findings can be applied to generate diagnoses and impact clinical decision-making and therapy planning are required. This consensus paper was prepared by an international working group collaborating on the clinical promotion of EIT, called the TRanslational EIT developmeNt stuDy group. It addresses the stated needs by providing (1) a new classification of core processes involved in chest EIT examinations and data analysis, (2) a focus on clinical applications with structured reviews and outlooks (separately for adult and neonatal/paediatric patients), (3) a structured framework to categorise and understand the relationships among analysis approaches and their clinical roles, (4) consensus, unified terminology with clinical user-friendly definitions and explanations, (5) a review of all major work in thoracic EIT and (6) recommendations for future development (193 pages of online supplements systematically linked with the chief sections of the main document). We expect this information to be useful for clinicians and researchers working with EIT, as well as for industry producers of this technology.

Collaboration


Dive into Andy Adler's collaborations.

Top Co-Authors

Bartłomiej Grychtol

German Cancer Research Center

Josep Solà

Swiss Center for Electronics and Microtechnology

Inéz Frerichs

University of Göttingen

Robert Guardo

École Polytechnique de Montréal
