Catherine M. Grgicak
Boston University
Publications
Featured research published by Catherine M. Grgicak.
Journal of Forensic Sciences | 2013
Danielle Conklin; Elisse Coronado; Margaret Terrill; Robin W. Cotton; Catherine M. Grgicak
Determining appropriate analytical thresholds (ATs) for forensic DNA analysis is critical to maximize allele detection. In this study, six methods to determine ATs for forensic DNA purposes were examined and compared. Four of the methods rely on analysis of the baseline noise of a number of negatives, while two utilize the relationship between relative fluorescence unit signal and DNA input in the polymerase chain reaction (PCR) derived from a dilution series ranging from 1 to 0.06 ng. Results showed that when a substantial mass of DNA (i.e., >1 ng) was amplified, the baseline noise increased, suggesting the application of an AT derived from negatives should only be applied to samples with low levels of DNA. Further, the number and intensity of these noise peaks increased with increasing injection times, indicating that to maximize the ability to detect alleles, ATs should be validated for each post‐PCR procedure employed.
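A common concrete form of the negatives-based approach described above is a "mean plus k standard deviations" rule applied to noise peak heights harvested from negative-control runs. The sketch below is illustrative only: the function name, the RFU values, and the coverage factor k are assumptions, not values from the study.

```python
import statistics

def analytical_threshold(noise_rfu, k=3.0):
    """Distribution-free AT: mean + k standard deviations of baseline noise.

    noise_rfu: noise peak heights (RFU) collected from negative amplifications.
    k: coverage factor; k = 3 is a common starting point (assumed here).
    """
    mu = statistics.mean(noise_rfu)
    sigma = statistics.stdev(noise_rfu)
    return mu + k * sigma

# Hypothetical noise peak heights (RFU) from a set of negatives.
negatives = [4, 6, 5, 7, 3, 8, 5, 6, 4, 7]
at = analytical_threshold(negatives, k=3.0)
```

Consistent with the abstract's caution, an AT computed this way from low-noise negatives would be too low for high-template samples, whose baselines are noisier.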
Journal of Forensic Sciences | 2010
Catherine M. Grgicak; Zena M. Urban; Robin W. Cotton
Reproducibility of quantitative PCR results is dependent on the generation of consistent calibration curves via accurate volume transfers and instrument performance. A review of 14 standard curves, using two different QuantDuo® standard DNA lots, showed that the variability of cycle threshold values between assays was larger than that of the Internal PCR Control (IPC). This prompted a set of experiments designed to determine the source of variability. Results showed that error introduced during DNA addition to the plate resulted in little variation. A comparison of seven independent dilution series demonstrated that cycle threshold variation between dilutions was larger than the variation expected from repeated samples. Modeling the influence of pipette errors on dilution series accuracy indicated that a more rigorous approach to external calibration curve production is required and showed that improvement in calibration curve stability is expected if the pipette conditions are carefully chosen and/or a single validated curve is utilized as the calibrator.
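The influence of pipette error on a dilution series can be sketched as a small Monte Carlo simulation that propagates random volume-transfer errors through a serial 1:2 dilution. The nominal volumes, the 2% transfer CV, and the function name are invented for illustration and are not the paper's parameters.

```python
import random

def simulate_dilution_series(c0=1.0, steps=5, cv_pipette=0.02, seed=1):
    """Propagate random pipetting error through a serial 1:2 dilution.

    c0: starting DNA concentration (ng/uL); cv_pipette: relative SD of each
    volume transfer. Both values are illustrative assumptions.
    """
    rng = random.Random(seed)
    conc = c0
    series = [conc]
    for _ in range(steps):
        v_dna = 10.0 * (1 + rng.gauss(0, cv_pipette))  # transferred sample volume (uL)
        v_dil = 10.0 * (1 + rng.gauss(0, cv_pipette))  # added diluent volume (uL)
        conc = conc * v_dna / (v_dna + v_dil)          # realised (not nominal) dilution
        series.append(conc)
    return series

series = simulate_dilution_series()
```

Because each step's realised dilution factor deviates from the nominal 0.5, errors compound down the series, which is why the later points of an external calibration curve are the least trustworthy.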
Forensic Science International-genetics | 2015
Harish Swaminathan; Catherine M. Grgicak; Muriel Médard; Desmond S. Lun
Repetitive sequences in the human genome called short tandem repeats (STRs) are used in human identification for forensic purposes. Interpretation of DNA profiles generated using STRs is often problematic because of uncertainty in the number of contributors to the sample. Existing methods to identify the number of contributors work on the number of peaks observed and/or allele frequencies. We have developed a computational method called NOCIt that calculates the a posteriori probability (APP) on the number of contributors. NOCIt works on single source calibration data consisting of known genotypes to compute the APP for an unknown sample. The method takes into account signal peak heights, population allele frequencies, allele dropout and stutter, a commonly occurring PCR artifact. We tested the performance of NOCIt using 278 experimental and 40 simulated DNA mixtures consisting of one to five contributors with total DNA mass from 0.016 to 0.25 ng. NOCIt correctly identified the number of contributors in 83% of the experimental samples and in 85% of the simulated mixtures, while the accuracy of the best pre-existing method to determine the number of contributors was 72% for the experimental samples and 73% for the simulated mixtures. Moreover, NOCIt calculated the APP for the true number of contributors to be at least 1% in 95% of the experimental samples and in all the simulated mixtures.
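At its core, an a posteriori probability on the number of contributors is a normalised set of per-hypothesis likelihoods. The sketch below shows only that normalisation step, with invented log-likelihoods standing in for NOCIt's continuous evaluation of peak heights, dropout and stutter.

```python
import math

def posterior_n_contributors(log_likelihoods, prior=None):
    """Normalise per-hypothesis likelihoods into an APP over the number
    of contributors.

    log_likelihoods: {n: log P(evidence | n contributors)}. In NOCIt these
    come from a continuous signal model; here they are supplied directly.
    """
    ns = sorted(log_likelihoods)
    if prior is None:
        prior = {n: 1.0 / len(ns) for n in ns}   # uniform prior over n
    m = max(log_likelihoods[n] for n in ns)      # log-sum-exp stabilisation
    w = {n: math.exp(log_likelihoods[n] - m) * prior[n] for n in ns}
    z = sum(w.values())
    return {n: w[n] / z for n in ns}

# Illustrative log-likelihoods for 1-5 contributors (not real NOCIt output).
app = posterior_n_contributors({1: -120.0, 2: -100.0, 3: -104.0, 4: -110.0, 5: -118.0})
```

Reporting the full APP, rather than a single best n, is what lets an analyst see when two contributor numbers are nearly equally supported.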
Journal of Materials Chemistry | 2006
Catherine M. Grgicak; Richard G. Green; Javier B. Giorgi
Preparation of mixed metal oxide precursors for solid oxide fuel cells represents a very complex chemical process in which a metal may form oxides, hydroxides and various complex basic salts as intermediates. A detailed study to determine the relationship between synthesis strategies, morphology, sinterability and SOFC performance is a necessity. In this work, a direct relationship has been established between the precipitation agent, the calcination process, the final composition, particle sizes, sinterability and solid oxide fuel cell (SOFC) performance for nickel, copper and cobalt based anode materials. Nickel, copper and cobalt yttria stabilized zirconia (NiYSZ, CuYSZ and CoYSZ) anode materials were synthesized via hydrolysis of the corresponding chloride solutions with NH3, NH3 + NaOH and NaOH as precipitation agents. The formation pathway was established for the various products by the direct observation of intermediate species throughout the synthesis process. A comparison of the powders indicates that the choice of precipitation agent greatly influences the final characteristics. The cobalt anodes offered the highest SOFC performance, while within each metal system, the anodes with a crystalline precursor resulted in higher exchange current densities for the charge transfer portion of the impedance spectra.
Forensic Science International-genetics | 2016
Harish Swaminathan; Abhishek Garg; Catherine M. Grgicak; Muriel Médard; Desmond S. Lun
In forensic DNA interpretation, the likelihood ratio (LR) is often used to convey the strength of a match. Expanding on binary and semi-continuous methods that do not use all of the quantitative data contained in an electropherogram, fully continuous methods to calculate the LR have been created. These fully continuous methods utilize all of the information captured in the electropherogram, including the peak heights. Recently, methods that calculate the distribution of the LR using semi-continuous methods have also been developed. The LR distribution has been proposed as a way of studying the robustness of the LR, which varies depending on the probabilistic model used for its calculation. For example, the LR distribution can be used to calculate the p-value, which is the probability that a randomly chosen individual results in a LR greater than the LR obtained from the person-of-interest (POI). Hence, the p-value is a statistic that is different from, but related to, the LR; and it may be interpreted as the false positive rate resulting from a binary hypothesis test between the prosecution and defense hypotheses. Here, we present CEESIt, a method that combines the twin features of a fully continuous model to calculate the LR and its distribution, conditioned on the defense hypothesis, along with an associated p-value. CEESIt incorporates dropout, noise and stutter (reverse and forward) in its calculation. As calibration data, CEESIt uses single source samples with known genotypes and calculates a LR for a specified POI on a questioned sample, along with the LR distribution and a p-value. The method was tested on 303 files representing 1-, 2- and 3-person samples, injected using three injection times and containing between 0.016 and 1 ng of template DNA. These data allow us to evaluate changes in the LR and p-value with respect to the complexity of the sample and to facilitate discussions regarding complex DNA mixture interpretation.
We observed that the amount of template DNA from the contributor impacted the LR: small LRs resulted from contributors with low template masses. Moreover, as expected, we observed a decrease in p-values as the LR increased. A p-value of 10⁻⁹ or lower was achieved in all cases where the LR was greater than 10⁸. We tested the repeatability of CEESIt by running all samples in duplicate and found the results to be repeatable.
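The p-value defined above has a simple empirical form: the fraction of randomly chosen non-contributors whose LR is at least as large as the POI's. The sketch below assumes the per-individual LRs have already been computed (CEESIt obtains them from its fully continuous model; the distribution used here is invented).

```python
import random

def p_value_from_lr_samples(lr_random, lr_poi):
    """Empirical p-value: probability that a random non-contributor
    yields an LR at least as large as the POI's LR."""
    hits = sum(1 for lr in lr_random if lr >= lr_poi)
    return hits / len(lr_random)

# Illustrative LRs for 10,000 random individuals under the defense
# hypothesis -- invented log10-normal values, not CEESIt output.
rng = random.Random(7)
lr_random = [10 ** rng.gauss(-6, 2) for _ in range(10_000)]
p = p_value_from_lr_samples(lr_random, lr_poi=1e8)
```

With the POI's LR far in the upper tail of the non-contributor distribution, the empirical p-value collapses toward zero, mirroring the abstract's observation that large LRs coincide with very small p-values.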
Forensic Science International-genetics | 2015
Ullrich J. Mönich; Ken R. Duffy; Muriel Médard; Viveck R. Cadambe; Lauren E. Alfonse; Catherine M. Grgicak
There are three dominant contributing factors that distort short tandem repeat profile measurements, two of which, stutter and variations in the allelic peak heights, have been described extensively. Here we characterise the remaining component, baseline noise. A probabilistic characterisation of the non-allelic noise peaks is not only inherently useful for statistical inference but is also significant for establishing a detection threshold. We do this by analysing the data from 643 single person profiles for the Identifiler Plus kit and 303 for the PowerPlex 16 HS kit. This investigation reveals that although the dye colour is a significant factor, it is not sufficient to have a per-dye colour description of the noise. Furthermore, we show that at a per-locus basis, out of the Gaussian, log-normal, and gamma distribution classes, baseline noise is best described by log-normal distributions and provide a methodology for setting an analytical threshold based on that deduction. In the PowerPlex 16 HS kit, we observe evidence of significant stutter at two repeat units shorter than the allelic peak, which has implications for the definition of baseline noise and signal interpretation. In general, the DNA input mass has an influence on the noise distribution. Thus, it is advisable to study noise and, consequently, to infer quantities like the analytical threshold from data with a DNA input mass comparable to the DNA input mass of the samples to be analysed.
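The log-normal-based threshold-setting methodology can be sketched as fitting a log-normal distribution to per-locus noise peak heights and placing the analytical threshold at a high quantile of the fit. The quantile, RFU values, and function name below are illustrative assumptions, not the paper's calibrated values.

```python
import math
from statistics import NormalDist, mean, stdev

def lognormal_threshold(noise_rfu, quantile=0.999):
    """Analytical threshold at a high quantile of a log-normal fit to
    baseline-noise peak heights for one locus/dye channel."""
    logs = [math.log(x) for x in noise_rfu]
    mu, sigma = mean(logs), stdev(logs)
    z = NormalDist().inv_cdf(quantile)   # standard-normal quantile
    return math.exp(mu + z * sigma)      # log-normal quantile

# Hypothetical noise peak heights (RFU) for a single locus.
noise = [3.1, 4.0, 5.2, 4.4, 3.6, 6.1, 4.9, 3.8, 5.5, 4.2]
at = lognormal_threshold(noise, quantile=0.999)
```

As the abstract advises, such a fit should use calibration data whose DNA input mass matches the samples to be analysed, since the noise distribution shifts with input mass.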
Electrophoresis | 2017
Ken R. Duffy; Neil Gurram; Kelsey C. Peters; Genevieve Wellner; Catherine M. Grgicak
Short tandem repeat (STR) profiling from DNA samples has long been the bedrock of human identification. The laboratory process is composed of multiple procedures that include quantification, sample dilution, PCR, electrophoresis, and fragment analysis. The end product is a short tandem repeat electropherogram composed of signal from alleles, artifacts, and instrument noise. In order to optimize or alter laboratory protocols, a large number of validation samples must be created at significant expense. As a tool to support that process and to enable the exploration of complex scenarios without costly sample creation, a mechanistic stochastic model that incorporates each of the aforementioned processing features is described herein. The model allows rapid in silico simulation of electropherograms from multicontributor samples and enables detailed investigations of involved scenarios. An implementation of the model that is parameterized by extensive laboratory data is publicly available. To illustrate its utility, the model was employed to evaluate the effects of sample dilutions, injection time, and cycle number on peak height, and the nature of stutter ratios at low template. We verify the model's findings by comparison with experimentally generated data.
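A heavily simplified, single-locus caricature of such in silico simulation is sketched below: exponential PCR growth with multiplicative (log-normal) amplification noise and a stutter peak one repeat unit below each parent allele. All parameters (efficiency, cycle count, stutter ratio, RFU scaling) are invented and are not the published model's calibration.

```python
import random

def simulate_locus(alleles, mass_pg, rng, cycles=28, eff=0.85, stutter=0.06, cv=0.15):
    """Toy single-locus electropherogram simulator: expected PCR growth,
    log-normal peak-height noise, and fixed-ratio back stutter."""
    scale = 1e-6                                   # copies -> RFU, arbitrary
    peaks = {}
    for a in alleles:
        copies = (mass_pg / 3.3) / len(alleles)    # ~3.3 pg per haploid copy
        grown = copies * (1 + eff) ** cycles       # expected post-PCR copies
        height = grown * scale * rng.lognormvariate(0, cv)
        peaks[a] = peaks.get(a, 0.0) + height
        peaks[a - 1] = peaks.get(a - 1, 0.0) + height * stutter  # stutter peak
    return peaks

rng = random.Random(3)
epg = simulate_locus(alleles=(11, 14), mass_pg=250, rng=rng)
```

Even this toy version shows why simulation is useful for protocol design: parameters such as cycle number or input mass can be varied freely and their effect on peak heights read off without running a single plate.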
Journal of Forensic Sciences | 2013
Catherine M. Hennekens; Elyse S. Cooper; Robin W. Cotton; Catherine M. Grgicak
The purpose of this study was to determine the effect Proteinase K, sodium dodecyl sulfate (SDS), incubation times, and temperatures had on differential extraction efficiencies and the premature lysis of spermatozoa. The effect was measured using Quantifiler® Duo and Identifiler™ PCR Amplification kits, where the resultant male and female DNA concentrations and their ratios within the nonsperm and sperm fractions (SFs) were determined. Comparisons between expected and observed ratios illustrate that the quantity of female DNA in the SF increased when Proteinase K was absent during the initial incubation. Additionally, there is no indication of simultaneous sperm and epithelial cell lysis in the absence of DTT at Proteinase K concentrations ranging from 10 to 300 μg/mL. All other conditions exhibited minimal variation in DNA concentration. Therefore, despite the various protocols used for the differential lysis of cell mixtures encountered in casework, the method is robust and successful under most conditions.
Asilomar Conference on Signals, Systems and Computers | 2014
Ullrich J. Mönich; Catherine M. Grgicak; Viveck R. Cadambe; Jason Yonglin Wu; Genevieve Wellner; Ken R. Duffy; Muriel Médard
For forensic purposes, short tandem repeat allele signals are used as DNA fingerprints. The interpretation of signals measured from samples has traditionally been conducted by applying thresholding. More quantitative approaches have recently been developed, but not for the purposes of identifying an appropriate signal model. By analyzing data from 643 single person samples, we develop such a signal model. Three standard classes of two-parameter distributions, one symmetric (normal) and two right-skewed (gamma and log-normal), were investigated for their ability to adequately describe the data. Our analysis suggests that additive noise is well modeled via the log-normal distribution class and that variability in peak heights is well described by the gamma distribution class. This is a crucial step towards the development of principled techniques for mixed sample signal deconvolution.
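The distribution-class comparison described above can be sketched as fitting each candidate class to the same data and comparing log-likelihoods. The sketch below uses a closed-form MLE for the log-normal and moment matching for the gamma; the sample values are invented, and this is not the paper's full model-selection procedure.

```python
import math
from statistics import mean, variance

def loglik_lognormal(x):
    """Log-likelihood of x under an MLE-fitted log-normal distribution."""
    logs = [math.log(v) for v in x]
    mu = mean(logs)
    s2 = sum((l - mu) ** 2 for l in logs) / len(logs)   # MLE variance of logs
    return sum(-math.log(v) - 0.5 * math.log(2 * math.pi * s2)
               - (math.log(v) - mu) ** 2 / (2 * s2) for v in x)

def loglik_gamma(x):
    """Log-likelihood of x under a moment-matched gamma distribution."""
    m, v = mean(x), variance(x)
    k, theta = m * m / v, v / m                         # shape and scale
    return sum((k - 1) * math.log(val) - val / theta
               - k * math.log(theta) - math.lgamma(k) for val in x)

# Illustrative right-skewed sample, not kit data.
sample = [2.1, 2.5, 3.0, 2.2, 4.8, 2.9, 3.4, 2.6, 6.2, 3.1]
better = "log-normal" if loglik_lognormal(sample) > loglik_gamma(sample) else "gamma"
```

In the paper's analysis this comparison, run on real profile data, favoured the log-normal class for additive noise and the gamma class for peak-height variability.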
Journal of Forensic Sciences | 2016
Kayleigh E. Rowan; Genevieve Wellner; Catherine M. Grgicak
Impacts of validation design on DNA signal were explored, and the level of variation introduced by injection, capillary changes, amplification, and kit lot was surveyed by examining a set of replicate samples ranging in mass from 0.25 to 0.008 ng. The variations in peak height, heterozygous balance, dropout probabilities, and baseline noise were compared using common statistical techniques. Data indicate that amplification is the source of the majority of the variation observed in peak heights, followed by capillary lots. The use of different amplification kit lots did not introduce variability into the peak heights, heterozygous balance, dropout, or baseline. Thus, if data from case samples run over a significant time period are not available during validation, the validation must be designed to, at a minimum, include multiple samples of varying quantity, with known genotypes, amplified and run over an extended period of time using multiple pipettes and capillaries.
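One common statistical technique for attributing variation to a processing step is a one-way sum-of-squares decomposition: grouping replicate peak heights by, say, amplification and splitting total variation into between-group and within-group parts. The replicate numbers below are invented for illustration.

```python
from statistics import mean

def variance_components(groups):
    """One-way decomposition of peak-height variation into between-group
    (e.g. amplification-to-amplification) and within-group (e.g.
    injection-to-injection) sums of squares."""
    grand = mean(v for g in groups for v in g)
    between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return between, within

# Three hypothetical amplifications of the same 0.25 ng sample, each
# injected three times (peak heights in RFU).
amps = [[1040, 1010, 1065], [870, 905, 880], [1190, 1150, 1205]]
between_ss, within_ss = variance_components(amps)
```

A between-amplification sum of squares that dwarfs the within-amplification one is the pattern consistent with the study's conclusion that amplification dominates peak-height variation.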