
Publication


Featured research published by Gari D. Clifford.


IEEE Transactions on Biomedical Engineering | 2003

A dynamical model for generating synthetic electrocardiogram signals

Patrick E. McSharry; Gari D. Clifford; Lionel Tarassenko; Leonard A. Smith

A dynamical model based on three coupled ordinary differential equations is introduced which is capable of generating realistic synthetic electrocardiogram (ECG) signals. The operator can specify the mean and standard deviation of the heart rate, the morphology of the PQRST cycle, and the power spectrum of the RR tachogram. In particular, both respiratory sinus arrhythmia at the high frequencies (HFs) and Mayer waves at the low frequencies (LFs), together with the LF/HF ratio, are incorporated in the model. Much of the beat-to-beat variation in morphology and timing of the human ECG, including QT dispersion and R-peak amplitude modulation, is shown to result. This model may be employed to assess biomedical signal processing techniques which are used to compute clinical statistics from the ECG.
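
As an illustration, a minimal numerical integration of the three coupled ODEs is sketched below, using the illustrative PQRST event parameters reported in the paper; heart-rate variability coupling and baseline wander are omitted, so this is a sketch of the core attractor rather than the full generator.

```python
# A minimal sketch of the three coupled ODEs (the quasi-periodic attractor
# described above), using the illustrative PQRST parameters reported in the
# paper. Heart-rate variability coupling and baseline wander are omitted for
# brevity, so the angular frequency omega is held fixed here.
import numpy as np
from scipy.integrate import solve_ivp

theta_i = np.array([-np.pi / 3, -np.pi / 12, 0.0, np.pi / 12, np.pi / 2])  # P,Q,R,S,T
a_i = np.array([1.2, -5.0, 30.0, -7.5, 0.75])    # event amplitudes
b_i = np.array([0.25, 0.1, 0.1, 0.1, 0.4])       # event widths
omega = 2 * np.pi                                 # one cycle per second (~60 bpm)

def ecg_ode(t, state):
    x, y, z = state
    alpha = 1.0 - np.sqrt(x**2 + y**2)            # pulls (x, y) onto the unit circle
    theta = np.arctan2(y, x)
    dtheta = (theta - theta_i + np.pi) % (2 * np.pi) - np.pi   # wrapped phase diffs
    dz = -np.sum(a_i * dtheta * np.exp(-dtheta**2 / (2 * b_i**2))) - z
    return [alpha * x - omega * y, alpha * y + omega * x, dz]

sol = solve_ivp(ecg_ode, (0.0, 10.0), [1.0, 0.0, 0.0], max_step=1 / 360)
synthetic_ecg = sol.y[2]    # the z-coordinate traces out the PQRST waveform
```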


Critical Care Medicine | 2011

Multiparameter Intelligent Monitoring in Intensive Care II: A public-access intensive care unit database

Mohammed Saeed; Mauricio Villarroel; Andrew T. Reisner; Gari D. Clifford; Li-wei H. Lehman; George B. Moody; Thomas Heldt; Tin H. Kyaw; Benjamin Moody; Roger G. Mark

Objective: We sought to develop an intensive care unit research database applying automated techniques to aggregate high-resolution diagnostic and therapeutic data from a large, diverse population of adult intensive care unit patients. This freely available database is intended to support epidemiologic research in critical care medicine and serve as a resource to evaluate new clinical decision support and monitoring algorithms. Design: Data collection and retrospective analysis. Setting: All adult intensive care units (medical intensive care unit, surgical intensive care unit, cardiac care unit, cardiac surgery recovery unit) at a tertiary care hospital. Patients: Adult patients admitted to intensive care units between 2001 and 2007. Interventions: None. Measurements and Main Results: The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database consists of 25,328 intensive care unit stays. The investigators collected detailed information about intensive care unit patient stays, including laboratory data, therapeutic intervention profiles such as vasoactive medication drip rates and ventilator settings, nursing progress notes, discharge summaries, radiology reports, provider order entry data, International Classification of Diseases, 9th Revision codes, and, for a subset of patients, high-resolution vital sign trends and waveforms. Data were automatically deidentified to comply with Health Insurance Portability and Accountability Act standards and integrated with relational database software to create electronic intensive care unit records for each patient stay. The data were made freely available in February 2010 through the Internet along with a detailed user's guide and an assortment of data processing tools. The overall hospital mortality rate was 11.7%, which varied by critical care unit. The median intensive care unit length of stay was 2.2 days (interquartile range, 1.1–4.4 days). According to the primary International Classification of Diseases, 9th Revision codes, the following disease categories each comprised at least 5% of the case records: diseases of the circulatory system (39.1%); trauma (10.2%); diseases of the digestive system (9.7%); pulmonary diseases (9.0%); infectious diseases (7.0%); and neoplasms (6.8%). Conclusions: MIMIC-II documents a diverse and very large population of intensive care unit patient stays and contains comprehensive and detailed clinical data, including physiological waveforms and minute-by-minute trends for a subset of records. It establishes a new public-access resource for critical care research, supporting a diverse range of analytic studies spanning epidemiology, clinical decision-rule development, and electronic tool development.
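
As a sketch of the kind of cohort summary the database supports, the snippet below recomputes length-of-stay statistics from a hypothetical flat export of ICU stays; the file and column names are illustrative assumptions, not the actual MIMIC-II schema.

```python
# A sketch of a cohort summary over a hypothetical flat export of ICU stays
# with a "los_days" column. The file and column names are illustrative only;
# they are not the MIMIC-II schema.
import pandas as pd

stays = pd.read_csv("icustays.csv")              # hypothetical export
q1, med, q3 = stays["los_days"].quantile([0.25, 0.5, 0.75])
print(f"{len(stays)} ICU stays; median LOS {med:.1f} days (IQR {q1:.1f}-{q3:.1f})")
```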


IEEE Transactions on Biomedical Engineering | 2007

A Nonlinear Bayesian Filtering Framework for ECG Denoising

Reza Sameni; Mohammad Bagher Shamsollahi; Christian Jutten; Gari D. Clifford

In this paper, a nonlinear Bayesian filtering framework is proposed for the filtering of single-channel noisy electrocardiogram (ECG) recordings. The necessary dynamic models of the ECG are based on a modified nonlinear dynamic model previously suggested for the generation of a highly realistic synthetic ECG. A modified version of this model is used in several Bayesian filters, including the Extended Kalman Filter, Extended Kalman Smoother, and Unscented Kalman Filter. An automatic parameter selection method is also introduced to facilitate the adaptation of the model parameters to a vast variety of ECGs. The approach is evaluated on several normal ECGs by artificially adding white and colored Gaussian noise to visually inspected clean ECG recordings and studying the SNR and morphology of the filter outputs. The results demonstrate superior performance compared with conventional ECG denoising approaches such as bandpass filtering, adaptive filtering, and wavelet denoising, over a wide range of ECG SNRs. The method is also successfully evaluated on real nonstationary muscle artifact. It may therefore serve as an effective framework for the model-based filtering of noisy ECG recordings.
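
All of the filters in the framework share the standard extended-Kalman predict/update recursion, sketched generically below; the transition and observation functions (and their Jacobians) are placeholders that, in the paper, would be the modified nonlinear ECG dynamics.

```python
# A generic extended Kalman filter recursion of the kind the framework builds
# on. The transition f, observation h, and their Jacobians F_jac, H_jac are
# placeholders; in the paper they would be the modified nonlinear ECG
# dynamics (phase and amplitude states), which are not reproduced here.
import numpy as np

def ekf(zs, f, h, F_jac, H_jac, x0, P0, Q, R):
    """Filter the measurement sequence zs; returns the state estimates."""
    x, P = x0.copy(), P0.copy()
    estimates = []
    for z in zs:
        # Predict: propagate state and covariance through the dynamics
        F = F_jac(x)                       # Jacobian of f at previous estimate
        x = f(x)
        P = F @ P @ F.T + Q
        # Update: correct with the new measurement
        H = H_jac(x)                       # Jacobian of h at predicted state
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ (z - h(x))
        P = (np.eye(len(x)) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```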


IEEE Transactions on Biomedical Engineering | 2005

Quantifying errors in spectral estimates of HRV due to beat replacement and resampling

Gari D. Clifford; Lionel Tarassenko

Spectral estimates of heart rate variability (HRV) often involve the use of techniques such as the fast Fourier transform (FFT), which require an evenly sampled time series. HRV is calculated from the variations in the beat-to-beat (RR) interval timing of the cardiac cycle, which are inherently irregularly spaced in time. In order to produce an evenly sampled time series prior to FFT-based spectral estimation, linear or cubic spline resampling is usually employed. In this paper, by using a realistic artificial RR interval generator, interpolation and resampling are shown to result in consistent overestimation of the power spectral density (PSD) compared with the theoretical solution. The Lomb-Scargle (LS) periodogram, a more appropriate spectral estimation technique for unevenly sampled time series that uses only the original data, is shown to provide a superior PSD estimate. Ectopy removal or replacement is shown to be essential regardless of the spectral estimation technique. Resampling and phantom beat replacement are shown to decrease the accuracy of PSD estimation, even at low levels of ectopy or artefact. A linear relationship between the frequency of ectopy/artefact and the error (mean and variance) of the PSD estimate is demonstrated. Comparisons of PSD estimation techniques performed on real RR interval data during minimally active segments (sleep) demonstrate that the LS periodogram provides a less noisy spectral estimate of HRV.
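
To illustrate the contrast, the sketch below estimates the spectrum of a toy unevenly sampled RR series both ways: cubic-spline resampling followed by an FFT, and the Lomb-Scargle periodogram applied directly to the uneven samples. The toy RR generator is a stand-in, not the paper's realistic artificial generator.

```python
# A minimal sketch contrasting the two estimators on a toy unevenly sampled
# RR series. The synthetic RR generator here is a simple stand-in for the
# paper's realistic artificial RR interval generator.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
n = 300
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(n))  # slow LF-band oscillation
rr += 0.01 * rng.standard_normal(n)                        # beat-to-beat jitter
t = np.cumsum(rr)                      # beat times: inherently unevenly spaced

# FFT route: cubic-spline resample onto an even grid first
fs = 4.0                               # a common HRV resampling rate
tt = np.arange(t[0], t[-1], 1 / fs)
rr_even = CubicSpline(t, rr)(tt)
psd_fft = np.abs(np.fft.rfft(rr_even - rr_even.mean())) ** 2 / len(rr_even)
f_fft = np.fft.rfftfreq(len(rr_even), 1 / fs)

# Lomb-Scargle route: works on the uneven samples directly, no resampling
f_ls = np.linspace(0.01, 0.5, 500)     # Hz, HRV band of interest
psd_ls = lombscargle(t, rr - rr.mean(), 2 * np.pi * f_ls)
```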


The Open Pacing, Electrophysiology & Therapy Journal | 2010

A Review of Fetal ECG Signal Processing Issues and Promising Directions

Reza Sameni; Gari D. Clifford

The field of electrocardiography has been in existence for over a century, yet despite significant advances in adult clinical electrocardiography, signal processing techniques, and fast digital processors, the analysis of fetal ECGs is still in its infancy. This is partly due to a lack of gold-standard databases, partly due to the relatively low signal-to-noise ratio of the fetal ECG compared to the maternal ECG (caused by the various media between the fetal heart and the measuring electrodes, and the fact that the fetal heart is simply smaller), and partly due to the less complete clinical knowledge concerning fetal cardiac function and development. In this paper, we review a range of promising recording and signal processing techniques for fetal ECG analysis that have been developed over the last forty years, and discuss both their shortcomings and advantages. Before doing so, however, we review fetal cardiac development and the etiology of the fetal ECG. A selection of relevant models for the fetal/maternal ECG mixture is also discussed. In light of current understanding of the fetal ECG, we then attempt to justify recommendations for promising future directions in signal processing and database creation.


Neural Computing and Applications | 2006

Application of independent component analysis in removing artefacts from the electrocardiogram

Taigang He; Gari D. Clifford; Lionel Tarassenko

Routinely recorded electrocardiograms (ECGs) are often corrupted by different types of artefacts, and many efforts have been made to enhance their quality by reducing the noise or artefacts. This paper addresses the problem of removing noise and artefacts from ECGs using independent component analysis (ICA). An ICA algorithm is tested on three-channel ECG recordings taken from human subjects, mostly in the coronary care unit. Results are presented showing that ICA can detect and remove a variety of noise and artefact sources in these ECGs. One difficulty with the application of ICA is the determination of the order of the independent components. A new technique based on simple statistical parameters is proposed to solve this problem in this application. The technique is successfully applied to the ECG data and offers potential for online processing of ECG using ICA.
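
A minimal version of this pipeline is sketched below using scikit-learn's FastICA, with components ranked by kurtosis as one plausible "simple statistical parameter" (the paper's exact ordering statistics may differ) before reconstructing the ECG from the retained components.

```python
# A minimal sketch of the pipeline described above: FastICA on a multichannel
# recording, components ranked by kurtosis (one plausible choice of simple
# statistical parameter; the paper's exact statistics may differ), then
# reconstruction from the retained components. The data here are random
# stand-ins for a three-channel ECG.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

X = np.random.default_rng(1).standard_normal((5000, 3))   # (samples, channels)

ica = FastICA(n_components=3, random_state=0)
S = ica.fit_transform(X)                  # estimated independent components

# QRS-dominated components tend to be strongly super-Gaussian (spiky), so
# rank the components by kurtosis and keep the most non-Gaussian ones.
order = np.argsort(kurtosis(S, axis=0))[::-1]

S_clean = np.zeros_like(S)
S_clean[:, order[:2]] = S[:, order[:2]]   # keep e.g. the top two components
X_clean = ica.inverse_transform(S_clean)  # artefact-reduced ECG channels
```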


EURASIP Journal on Advances in Signal Processing | 2007

Multichannel ECG and noise modeling: application to maternal and fetal ECG signals

Reza Sameni; Gari D. Clifford; Christian Jutten; Mohammad Bagher Shamsollahi

A three-dimensional dynamic model of the electrical activity of the heart is presented. The model is based on the single dipole model of the heart and is later related to the body surface potentials through a linear model which accounts for the temporal movements and rotations of the cardiac dipole, together with a realistic ECG noise model. The proposed model is also generalized to maternal and fetal ECG mixtures recorded from the abdomen of pregnant women in single and multiple pregnancies. The applicability of the model for the evaluation of signal processing algorithms is illustrated using independent component analysis. Considering the difficulties and limitations of recording long-term ECG data, especially from pregnant women, the model described in this paper may serve as an effective means of simulation and analysis of a wide range of ECGs, including adults and fetuses.
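
The core linear-projection idea can be sketched compactly: a time-varying dipole vector mapped to surface channels through a lead-field matrix, with additive noise. Everything below is an illustrative stand-in rather than the paper's calibrated model; a maternal/fetal mixture would add a second, weaker dipole with its own projection.

```python
# A sketch of the linear projection at the core of the model: a time-varying
# cardiac dipole d(t) mapped to body-surface channels through a lead-field
# matrix H, plus additive noise. The dipole trajectory and H are illustrative
# stand-ins, not the paper's calibrated model.
import numpy as np

fs, dur = 250, 10.0
t = np.arange(0.0, dur, 1.0 / fs)
theta = (2 * np.pi * 1.0 * t) % (2 * np.pi)     # cardiac phase at ~60 bpm

def gaussian_kernel(theta, center, amp, width):
    """One Gaussian bump over cardiac phase (cf. the synthetic ECG model)."""
    dphase = (theta - center + np.pi) % (2 * np.pi) - np.pi
    return amp * np.exp(-dphase**2 / (2 * width**2))

# Toy 3-D dipole: one dominant bump per coordinate
d = np.stack([gaussian_kernel(theta, 0.0, 1.0, 0.2),
              gaussian_kernel(theta, 0.1, 0.5, 0.3),
              gaussian_kernel(theta, -0.2, 0.3, 0.25)])   # shape (3, n)

rng = np.random.default_rng(2)
H = rng.standard_normal((8, 3))                  # 8 surface channels x 3 dipole axes
surface_ecg = H @ d + 0.05 * rng.standard_normal((8, len(t)))
```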


IEEE Transactions on Biomedical Engineering | 2013

ECG Signal Quality During Arrhythmia and Its Application to False Alarm Reduction

Joachim Behar; Julien Oster; Qiao Li; Gari D. Clifford

An automated algorithm to assess electrocardiogram (ECG) quality for both normal and abnormal rhythms is presented for false arrhythmia alarm suppression of intensive care unit (ICU) monitors. A particular focus is given to the quality assessment of a wide variety of arrhythmias. Data from three databases were used: the PhysioNet Challenge 2011 dataset, the MIT-BIH arrhythmia database, and the MIMIC II database. The quality of more than 33,000 single-lead 10 s ECG segments was manually assessed, and another 12,000 bad-quality single-lead ECG segments were generated using the PhysioNet noise stress test database. Signal quality indices (SQIs) were derived from the ECG segments and used as the inputs to a support vector machine classifier with a Gaussian kernel. This classifier was trained to estimate the quality of an ECG segment. Classification accuracies of up to 99% on the training and test sets were obtained for normal sinus rhythm and up to 95% for arrhythmias, although performance varied greatly depending on the type of rhythm. Additionally, the association between 4050 ICU alarms from the MIMIC II database and the signal quality, as evaluated by the classifier, was studied. Results suggest that the SQIs should be rhythm specific and that the classifier should be trained for each rhythm call independently. This would require a substantially increased set of labeled data in order to train an accurate algorithm.
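
The classifier stage reduces to a standard RBF-kernel SVM over per-segment signal quality indices, sketched below on stand-in features; in the paper, each row would hold the SQIs computed from one 10 s single-lead segment.

```python
# A minimal sketch of the classifier stage: an RBF-kernel SVM over
# per-segment signal quality indices (SQIs). The feature matrix is a random
# stand-in; in the paper each row would hold the SQIs of one 10 s
# single-lead ECG segment, labeled good or bad quality.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 6))    # 6 SQIs per segment (illustrative count)
y = rng.integers(0, 2, 1000)          # 1 = acceptable quality, 0 = unacceptable

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())   # ~0.5 on random stand-in data
```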


Physiological Measurement | 2012

Dynamic time warping and machine learning for signal quality assessment of pulsatile signals

Qiao Li; Gari D. Clifford

In this work, we describe a beat-by-beat method for assessing the clinical utility of pulsatile waveforms, primarily recorded from cardiovascular blood volume or pressure changes, concentrating on the photoplethysmogram (PPG). Physiological blood flow is nonstationary, with pulses changing in height, width, and morphology due to changes in heart rate, cardiac output, sensor type, and hardware or software pre-processing requirements. Moreover, considerable inter-individual and sensor-location variability exists. Simple template matching methods are therefore inappropriate, and a patient-specific adaptive initialization is required. We introduce dynamic time warping to stretch each beat to match a running template and combine it with several other features related to signal quality, including correlation and the percentage of the beat that appeared to be clipped. The features were then presented to a multi-layer perceptron neural network to learn the relationships between the parameters in the presence of good- and bad-quality pulses. An expert-labeled database of 1055 segments of PPG, each 6 s long, recorded from 104 separate critical care admissions during both normal and verified arrhythmic events, was used to train and test our algorithms. An accuracy of 97.5% on the training set and 95.2% on the test set was found. The algorithm could be deployed as a stand-alone signal quality assessment algorithm for vetting the clinical utility of PPG traces or any similar quasi-periodic signal.
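
The alignment step is ordinary dynamic time warping; a plain O(nm) version is sketched below, returning a length-normalized cost that could serve as one quality feature alongside the correlation and clipping features. This is a simplification, not the paper's exact implementation.

```python
# A plain O(n*m) dynamic time warping cost, as one might use for the
# template-matching step described above; a simplification, not the paper's
# exact implementation. The normalized cost can be fed to the classifier
# alongside the correlation and clipping features.
import numpy as np

def dtw_cost(beat, template):
    """Length-normalized DTW alignment cost between a beat and a template."""
    n, m = len(beat), len(template)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(beat[i - 1] - template[j - 1])
            D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m] / (n + m)
```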


IEEE Transactions on Biomedical Engineering | 2014

Ventricular Fibrillation and Tachycardia Classification Using a Machine Learning Approach

Qiao Li; Cadathur Rajagopalan; Gari D. Clifford

Correct detection and classification of ventricular fibrillation (VF) and rapid ventricular tachycardia (VT) is of pivotal importance for automatic external defibrillators and patient monitoring. In this paper, a VF/VT classification algorithm using a machine learning method, a support vector machine, is proposed. A total of 14 metrics were extracted from a specific window length of the electrocardiogram (ECG). A genetic algorithm was then used to select the optimal variable combinations. Three annotated public domain ECG databases (the American Heart Association Database, the Creighton University Ventricular Tachyarrhythmia Database, and the MIT-BIH Malignant Ventricular Arrhythmia Database) were used as training, test, and validation datasets. Different window sizes, varying from 1 to 10 s, were tested. An accuracy (Ac) of 98.1%, sensitivity (Se) of 98.4%, and specificity (Sp) of 98.0% were obtained on the in-sample training data with a 5 s window size and two selected metrics. On the out-of-sample validation data, an Ac of 96.3% ± 3.4%, Se of 96.2% ± 2.7%, and Sp of 96.2% ± 4.6% were obtained by fivefold cross-validation. The results surpass those of currently reported methods.
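
Because the best-performing model used only two of the 14 metrics, the selection step can be illustrated without a full genetic algorithm: the sketch below exhaustively scores metric pairs with cross-validated SVMs on stand-in data, while the GA in the paper searches the same space more efficiently.

```python
# The paper selects metric subsets with a genetic algorithm; since its best
# model used only two of the 14 metrics, the idea is sketched here with an
# exhaustive search over metric pairs instead, scored by cross-validated
# SVMs. The data are random stand-ins for the extracted ECG metrics.
import numpy as np
from itertools import combinations
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 14))    # 14 metrics per ECG window (stand-in)
y = rng.integers(0, 2, 500)           # 1 = VF/VT, 0 = other rhythm

def pair_score(idx):
    return cross_val_score(SVC(kernel="rbf"), X[:, list(idx)], y, cv=5).mean()

best_pair = max(combinations(range(14), 2), key=pair_score)
print("best metric pair:", best_pair, "score:", pair_score(best_pair))
```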

Collaboration


Gari D. Clifford's top co-authors.

Joachim Behar

Technion – Israel Institute of Technology

Roger G. Mark

Massachusetts Institute of Technology