Mark F. Tardiff
Pacific Northwest National Laboratory
Publications
Featured research published by Mark F. Tardiff.
Molecular & Cellular Proteomics | 2014
Bobbie Jo M Webb-Robertson; Melissa M. Matzke; Susmita Datta; Samuel H. Payne; Jiyun Kang; Lisa Bramer; Carrie D. Nicora; Anil K. Shukla; Thomas O. Metz; Karin D. Rodland; Richard D. Smith; Mark F. Tardiff; Jason E. McDermott; Joel G. Pounds; Katrina M. Waters
As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that, with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian Proteoform Quantification model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that fall outside the dominant pattern, or the existence of multiple overexpressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study of mouse plasma samples and demonstrate that BP-Quant achieves accuracy similar to current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages.
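A minimal sketch of the general idea behind signature-based peptide grouping is shown below; it is not the published BP-Quant implementation, and the function names, test, and thresholds are illustrative only.

```python
# Minimal sketch of signature-based peptide grouping (not the published
# BP-Quant code): each peptide gets a -1/0/+1 "signature" for one group
# comparison based on a simple t-test, and peptides sharing the dominant
# signature are averaged into a protein-level abundance estimate.
import numpy as np
from scipy import stats
from collections import Counter

def peptide_signature(pep_a, pep_b, alpha=0.05):
    """Return -1, 0, or +1 for one peptide across two sample groups."""
    t, p = stats.ttest_ind(pep_a, pep_b)
    if p >= alpha:
        return 0
    return 1 if t > 0 else -1

def protein_estimate(peptides_a, peptides_b):
    """peptides_a/b: (n_peptides, n_samples) log-abundance arrays."""
    sigs = [peptide_signature(a, b) for a, b in zip(peptides_a, peptides_b)]
    dominant, _ = Counter(sigs).most_common(1)[0]
    keep = [i for i, s in enumerate(sigs) if s == dominant]
    # Average only the peptides consistent with the dominant pattern.
    return np.vstack([peptides_a[keep].mean(axis=0),
                      peptides_b[keep].mean(axis=0)]), sigs

rng = np.random.default_rng(0)
pa = rng.normal(20, 1, size=(5, 6))
pb = pa + rng.normal(1, 0.3, size=(5, 6))
pb[4] -= 3  # one peptide behaving differently (e.g., a distinct proteoform)
est, sigs = protein_estimate(pa, pb)
print("peptide signatures:", sigs)
```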
Journal of Proteome Research | 2014
Brett G. Amidan; Daniel J. Orton; Brian L. Lamarche; Matthew E. Monroe; Ronald J. Moore; Alexander M. Venzin; Richard D. Smith; Landon H. Sego; Mark F. Tardiff; Samuel H. Payne
Ensuring data quality and proper instrument functionality is a prerequisite for scientific investigation. Manual quality assurance is time-consuming and subjective. Metrics for describing liquid chromatography mass spectrometry (LC–MS) data have been developed; however, the wide variety of LC–MS instruments and configurations precludes applying a simple cutoff. Using 1150 manually classified quality control (QC) data sets, we trained logistic regression classification models to predict whether a data set is in or out of control. Model parameters were optimized by minimizing a loss function that accounts for the trade-off between false positive and false negative errors. The classifier models detected bad data sets with high sensitivity while maintaining high specificity. Moreover, the composite classifier was dramatically more specific than single metrics. Finally, we evaluated the performance of the classifier on a separate validation set, where it performed comparably to the results for the testing/training data sets. Because we present the methods and software used to create the classifier, other groups can create classifiers for their own QC regimens, which vary widely from lab to lab. In total, this manuscript presents 3400 LC–MS data sets for the same QC sample (whole cell lysate of Shewanella oneidensis), deposited to the ProteomeXchange with identifiers PXD000320–PXD000324.
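The general recipe described above can be sketched as follows; the toy features, the cost values in the loss, and the threshold grid are placeholders, not the paper's actual QC metrics or tuning.

```python
# Hedged sketch of the approach described above: a logistic regression
# classifier over per-dataset QC metrics, with the decision threshold chosen
# to minimize a loss that weights false positives and false negatives
# differently. Data and costs are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Toy stand-in for QC metrics (e.g., peptide count, mass error, peak width).
X_good = rng.normal(0.0, 1.0, size=(200, 3))
X_bad = rng.normal(1.5, 1.0, size=(60, 3))
X = np.vstack([X_good, X_bad])
y = np.array([0] * 200 + [1] * 60)        # 1 = "out of control"

clf = LogisticRegression().fit(X, y)
probs = clf.predict_proba(X)[:, 1]

def weighted_loss(threshold, c_fp=1.0, c_fn=5.0):
    pred = probs >= threshold
    fp = np.sum(pred & (y == 0))           # good data flagged as bad
    fn = np.sum(~pred & (y == 1))          # bad data missed
    return c_fp * fp + c_fn * fn

thresholds = np.linspace(0.05, 0.95, 19)
best = min(thresholds, key=weighted_loss)
print("chosen threshold:", round(best, 2))
```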
Intelligence and Security Informatics | 2013
Nathan A. Baker; Jonathan L. Barr; George T. Bonheyo; Cliff Joslyn; Kannan Krishnaswami; Mark E. Oxley; Rich Quadrel; Landon H. Sego; Mark F. Tardiff; Adam S. Wynne
In its most general form, a signature is a unique or distinguishing measurement, pattern, or collection of data that identifies a phenomenon (object, action, or behavior) of interest. The discovery of signatures is an important aspect of a wide range of disciplines from basic science to national security for the rapid and efficient detection and/or prediction of phenomena. Current practice in signature discovery is typically accomplished by asking domain experts to characterize and/or model individual phenomena to identify what might compose a useful signature. What is lacking is an approach that can be applied across a broad spectrum of domains and information sources to efficiently and robustly construct candidate signatures, validate their reliability, measure their quality, and overcome the challenge of detection - all in the face of dynamic conditions, measurement obfuscation, and noisy data environments. Our research has focused on the identification of common elements of signature discovery across application domains and the synthesis of those elements into a systematic process for more robust and efficient signature development. In this way, a systematic signature discovery process lays the groundwork for leveraging knowledge obtained from signatures to a particular domain or problem area, and, more generally, to problems outside that domain. This paper presents the initial results of this research by discussing a mathematical framework for representing signatures and placing that framework in the context of a systematic signature discovery process. Additionally, the basic steps of this process are described with details about the methods available to support the different stages of signature discovery, development, and deployment.
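As a purely illustrative companion to this description, the sketch below shows one way a candidate signature might be represented and scored in code; the class and field names are hypothetical and are not taken from the paper's mathematical framework.

```python
# Illustrative-only sketch of a generic signature-discovery step
# (represent a candidate signature, then validate it on labeled data);
# names and fields are hypothetical, not the paper's formalism.
from dataclasses import dataclass, field
from typing import Callable, Sequence

@dataclass
class Signature:
    name: str
    features: Sequence[str]              # measurements composing the signature
    detector: Callable[[dict], bool]     # maps an observation to detect/no-detect
    quality: dict = field(default_factory=dict)

def validate(sig: Signature, labeled_obs: Sequence[tuple]) -> Signature:
    """Score a candidate signature against labeled observations."""
    hits = sum(sig.detector(obs) == truth for obs, truth in labeled_obs)
    sig.quality["accuracy"] = hits / len(labeled_obs)
    return sig

candidate = Signature(
    name="high-count-rate",
    features=["count_rate"],
    detector=lambda obs: obs["count_rate"] > 100.0,
)
data = [({"count_rate": 150.0}, True), ({"count_rate": 40.0}, False)]
print(validate(candidate, data).quality)
```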
Computational Science & Discovery | 2014
Dennis G. Thomas; Satish Chikkagoudar; Alejandro Heredia-Langner; Mark F. Tardiff; Zhixiang Xu; Dennis E. Hourcade; Christine T. N. Pham; Gregory M. Lanza; Kilian Q. Weinberger; Nathan A. Baker
Nanoparticles are potentially powerful therapeutic tools that have the capacity to target drug payloads and imaging agents. However, some nanoparticles can activate complement, a branch of the innate immune system, and cause adverse side-effects. Recently, we employed an in vitro hemolysis assay to measure the serum complement activity of perfluorocarbon nanoparticles that differed by size, surface charge, and surface chemistry, quantifying the nanoparticle-dependent complement activity using a metric called Residual Hemolytic Activity (RHA). In the present work, we have used a decision tree learning algorithm to derive the rules for estimating nanoparticle-dependent complement response based on the data generated from the hemolytic assay studies. Our results indicate that physicochemical properties of nanoparticles, namely, size, polydispersity index, zeta potential, and mole percentage of the active surface ligand of a nanoparticle, can serve as good descriptors for prediction of nanoparticle-dependent complement activation in the decision tree modeling framework.
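A hedged sketch of the decision-tree idea follows, using the four descriptors named above as features; the data and labeling rule are synthetic placeholders rather than the study's measurements.

```python
# Sketch of a decision tree learned from physicochemical descriptors
# (size, polydispersity index, zeta potential, mole % active ligand) to
# predict a complement-activation class. Data are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(2)
n = 120
X = np.column_stack([
    rng.uniform(100, 300, n),   # size (nm)
    rng.uniform(0.05, 0.4, n),  # polydispersity index
    rng.uniform(-40, 10, n),    # zeta potential (mV)
    rng.uniform(0, 20, n),      # mole % active surface ligand
])
# Toy labeling rule: highly charged, ligand-rich particles "activate".
y = ((X[:, 2] < -20) & (X[:, 3] > 10)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["size", "pdi", "zeta", "ligand_pct"]))
```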
Sensors | 2008
Stephen J. Walsh; Lawrence K. Chilton; Mark F. Tardiff; Candace N. Metoyer
Detecting and identifying weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based radiance model, which describes at-sensor observed radiance. The background emissivity and plume/ground temperatures are isolated, and their effects on the chemical signal are described. This analysis shows that the plume's physical state, emission or absorption, depends directly on the background emissivity and the plume/ground temperatures. It then describes the conditions under which the background emissivity and plume/ground temperatures inhibit or amplify the chemical signal. These claims are illustrated by analyzing synthetic hyperspectral imaging data with the adaptive matched filter using two chemicals and three distinct background emissivities.
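For reference, one common form of the adaptive matched filter can be sketched as below; the target signature, covariance estimate, and synthetic scene are illustrative, and the paper's exact detector formulation may differ.

```python
# Sketch of one common adaptive matched filter (AMF) form used for
# plume detection in hyperspectral imagery; the signature and scene here
# are synthetic stand-ins, not the paper's data.
import numpy as np

def amf_scores(pixels, target):
    """pixels: (n_pixels, n_bands); target: (n_bands,) chemical signature."""
    mu = pixels.mean(axis=0)
    centered = pixels - mu
    cov = np.cov(centered, rowvar=False)
    cov_inv = np.linalg.pinv(cov)              # pseudo-inverse for stability
    st_ci = target @ cov_inv
    # AMF(x) = (s^T C^-1 (x - mu))^2 / (s^T C^-1 s)
    return (centered @ st_ci) ** 2 / (st_ci @ target)

rng = np.random.default_rng(3)
bands, n = 30, 500
scene = rng.normal(0, 1, size=(n, bands))      # background-only pixels
s = rng.normal(0, 1, size=bands)               # chemical target signature
scene[:10] += 0.5 * s                          # weak plume in 10 pixels
print("top-scoring pixels:", np.argsort(amf_scores(scene, s))[-10:])
```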
IEEE International Conference on Technologies for Homeland Security | 2007
C. Lo Presti; B. Milbrath; Mark F. Tardiff; S. Hartley-McBride
Algorithms based on time-series analysis techniques were explored for maximizing the effectiveness of pass-through radiation portal monitors for detection of special nuclear material (SNM). Time-series properties of vehicle count profiles, such as stationarity and autocorrelation within energy windows, were characterized. Vehicle count profiles were nonstationary but could be made stationary by first-differencing. Autocorrelation functions showed consistent differences between naturally occurring radioactive material (NORM) alarm and non-alarm vehicles. Injection studies were performed to assess the performance of time-domain detection algorithms based on stationarity tests and on the CUSUM change-point detection test. Results indicated possible roles for detection algorithms based on statistical process control and on time-series concepts.
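A minimal sketch of the time-series steps mentioned above (first-differencing, autocorrelation, and a one-sided tabular CUSUM) is given below; the simulated profile and CUSUM parameters are illustrative, not the study's settings.

```python
# Hedged sketch: first-difference a nonstationary count profile, inspect its
# autocorrelation, and run a one-sided tabular CUSUM to flag a change point.
import numpy as np

rng = np.random.default_rng(4)
profile = np.cumsum(rng.normal(0, 1, 300)) + 50         # nonstationary counts
profile[200:] += np.cumsum(np.full(100, 1.2))           # injected ramp (simulated source)

diff = np.diff(profile)                                 # first-differencing

def acf(x, lag):
    """Simple lag-k autocorrelation estimate."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

def cusum(x, k=0.5, h=5.0):
    """One-sided tabular CUSUM; returns first index exceeding threshold h."""
    z = (x - x.mean()) / x.std()
    s = 0.0
    for i, zi in enumerate(z):
        s = max(0.0, s + zi - k)
        if s > h:
            return i
    return None

print("lag-1 autocorrelation of differences:", round(acf(diff, 1), 3))
print("CUSUM alarm index:", cusum(diff))
```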
IEEE International Conference on Technologies for Homeland Security | 2009
Nicholas J. Lombardo; Christa K. Knudson; Richard M. Ozanich; Frederick C. Rutz; Surya V. Singh; Mark F. Tardiff; Mike Kemp; Michael Tierney
A concept has been developed for a next-generation integrated countermeasure architecture to detect improvised explosive devices hidden on people or left behind in unstructured crowds. The work is part of the Standoff Technology Integration and Demonstration Program of the U.S. Department of Homeland Security's (DHS's) Science and Technology Directorate. The architecture uses a layered-defense approach that automates screening operations, prioritizes threats, and mobilizes resources accordingly. A system tracks people in motion, integrating and automating sensor control and scan acquisition to optimize threat-identification accuracy and the allocation of screening resources. A threat-based decision module prioritizes screening targets based on user-defined rules. Operators manage system-wide risk and mobilize field teams for interdiction. DHS is working with industry on technology development and testing to achieve the required level of system integration and economics in crowd conditions.
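Purely as an illustration of the threat-based decision module concept, the sketch below ranks tracked individuals using user-defined rules; all rule names, weights, and fields are hypothetical and not drawn from the program's actual design.

```python
# Illustrative-only sketch of rule-based threat prioritization for tracked
# people; fields, rules, and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    anomaly_score: float      # from standoff sensor screening
    near_crowd: bool
    time_in_scene_s: float

def priority(t: Track) -> float:
    score = t.anomaly_score
    if t.near_crowd:
        score *= 1.5          # user-defined rule: weight proximity to crowds
    if t.time_in_scene_s > 300:
        score += 0.2          # loitering raises priority
    return score

tracks = [Track(1, 0.4, False, 60), Track(2, 0.7, True, 400), Track(3, 0.6, True, 30)]
for t in sorted(tracks, key=priority, reverse=True):
    print(t.track_id, round(priority(t), 2))
```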
2016 IEEE Symposium on Technologies for Homeland Security (HST) | 2016
Mark F. Tardiff; George T. Bonheyo; Katherine A. Cort; Thomas W. Edgar; Nancy J. Hess; William J. Hutton; Erin A. Miller; Kathleen Nowak; Christopher S. Oehmen; Emilie Purvine; Gregory K. Schenter; D. Paul
The cyber environment has rapidly evolved from a curiosity to an essential component of the contemporary world. As the cyber environment has expanded and become more complex, so have the nature of adversaries and the styles of attacks. Today, cyber incidents are an expected part of life. As a result, cybersecurity research has emerged to address adversarial attacks that interfere with or prevent normal cyber activities. The historical response to cybersecurity attacks has been heavily skewed toward tactical responses with an emphasis on rapid recovery. While threat mitigation is important and can be time critical, a knowledge gap exists with respect to developing the science of cybersecurity. Such a science will enable the development and testing of theories that lead to understanding the broad sweep of cyber threats and the ability to assess trade-offs in sustaining network missions while mitigating attacks. The Asymmetric Resilient Cybersecurity Initiative at Pacific Northwest National Laboratory is a multi-year, multi-million dollar investment to develop approaches for shifting the advantage to the defender and sustaining the operability of systems under attack. The initiative established a Science Council to focus attention on the research process for cybersecurity. The Council shares science practices, critiques research plans, and aids in documenting and reporting reproducible research results. The Council members represent ecology, economics, statistics, physics, computational chemistry, microbiology and genetics, and geochemistry. This paper reports the initial work of the Science Council to implement the scientific method in cybersecurity research. The second section describes the scientific method, the third discusses scientific practices for cybersecurity research, and the fourth describes the initial impacts of applying those practices.
IEEE International Conference on Technologies for Homeland Security | 2013
Daniel M. Watkins; Landon H. Sego; Aimee E. Holmes; Bobbie-Jo M. Webb-Robertson; Amanda M. White; David S. Wunschel; Helen W. Kreuzer; Courtney D. Corley; Mark F. Tardiff
Chemical and biological forensic programs rely on laboratory measurements to determine how a threat agent may have been produced. In addition to laboratory analyses, it may also be useful to identify institutions where the same threat agent has been produced by the same (or a similar) process, since the producer of the agent may have learned methods at a university or similar institution. In this paper, we evaluate a Bayesian network that combines the results of laboratory measurements with evidence from scientific literature to probabilistically rank institutions that have published papers on the agent of interest. We apply techniques from multiattribute decision science to assess and compare the performance of various implementations of the Bayesian network in terms of three attributes: fidelity, consumption of the forensic sample, and document curation intensity. The mathematical approach we use to compare the various implementations is generalizable to the evaluation of other signature systems.
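A simple additive multiattribute comparison in the spirit of the evaluation described above might look like the sketch below; the candidate implementations, attribute scales, and weights are illustrative, and the paper's actual multiattribute model may differ.

```python
# Illustrative additive multiattribute scoring of candidate Bayesian-network
# implementations on fidelity, sample consumption, and curation intensity.
# All names, values, and weights are placeholders.
candidates = {
    # attribute values normalized to [0, 1], where 1 is always "better"
    "BN-full-curation":  {"fidelity": 0.90, "sample": 0.40, "curation": 0.20},
    "BN-light-curation": {"fidelity": 0.75, "sample": 0.60, "curation": 0.80},
    "BN-lab-only":       {"fidelity": 0.60, "sample": 0.70, "curation": 1.00},
}
weights = {"fidelity": 0.5, "sample": 0.2, "curation": 0.3}

def utility(attrs):
    """Weighted sum of normalized attribute values."""
    return sum(weights[a] * v for a, v in attrs.items())

for name, attrs in sorted(candidates.items(), key=lambda kv: -utility(kv[1])):
    print(f"{name}: {utility(attrs):.2f}")
```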
Archive | 2013
Aimee E. Holmes; Landon H. Sego; Bobbie-Jo M. Webb-Robertson; Helen W. Kreuzer; Richard M. Anderson; Stephen D. Unwin; Mark R. Weimar; Mark F. Tardiff; Courtney D. Corley
We demonstrate an approach for assessing the quality of a signature system designed to predict the culture medium used to grow a microorganism. The system comprised four chemical assays designed to identify various ingredients that could be used to produce the culture medium. The analytical measurements resulting from any combination of these four assays can be used in a Bayesian network to predict the probabilities that the microorganism was grown in one of eleven culture media. We evaluated combinations of the signature system by removing one or more of the assays from the Bayesian network. We measured and compared the quality of the various Bayesian networks in terms of fidelity, cost, risk, and utility, a method we refer to as Signature Quality Metrics.
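A minimal sketch of the assay-combination evaluation, assuming a toy scoring function in place of the paper's fidelity, cost, risk, and utility metrics, is shown below; assay names and numbers are hypothetical.

```python
# Minimal sketch: enumerate subsets of four assays and score each reduced
# system with a toy utility (placeholder for the paper's quality metrics).
from itertools import combinations

assays = ["assay_A", "assay_B", "assay_C", "assay_D"]   # hypothetical names
fidelity = {"assay_A": 0.30, "assay_B": 0.25, "assay_C": 0.20, "assay_D": 0.15}
cost = {"assay_A": 3.0, "assay_B": 2.0, "assay_C": 2.5, "assay_D": 1.0}

def score(subset, cost_weight=0.05):
    """Toy utility: summed fidelity contribution minus weighted cost."""
    return sum(fidelity[a] for a in subset) - cost_weight * sum(cost[a] for a in subset)

results = []
for k in range(1, len(assays) + 1):
    for subset in combinations(assays, k):
        results.append((score(subset), subset))
for s, subset in sorted(results, reverse=True)[:5]:
    print(f"{s:.2f}  {subset}")
```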