Publications

Featured research published by David C. Hamilton.


Journal of Chemometrics | 1997

Maximum likelihood principal component analysis

Peter D. Wentzell; Darren T. Andrews; David C. Hamilton; Klaas Faber; Bruce R. Kowalski

The theoretical principles and practical implementation of a new method for multivariate data analysis, maximum likelihood principal component analysis (MLPCA), are described. MLPCA is an analog to principal component analysis (PCA) that incorporates information about measurement errors to develop PCA models that are optimal in a maximum likelihood sense. The theoretical foundations of MLPCA are initially established using a regression model and extended to the framework of PCA and singular value decomposition (SVD). An efficient and reliable algorithm based on an alternating regression method is described. Generalization of the algorithm allows its adaptation to cases of correlated errors provided that the error covariance matrix is known. Models with intercept terms can also be accommodated. Simulated data and near‐infrared spectra, with a variety of error structures, are used to evaluate the performance of the new algorithm. Convergence times depend on the error structure but are typically around a few minutes. In all cases, models determined by MLPCA are found to be superior to those obtained by PCA when non‐uniform error distributions are present, although the level of improvement depends on the error structure of the particular data set.
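As a concrete illustration of the alternating-regression idea, here is a minimal sketch of MLPCA for uncorrelated, heteroscedastic measurement errors. It is not the authors' code; the function name, defaults, and convergence rule are illustrative assumptions.

```python
import numpy as np

def mlpca(X, var, rank, n_iter=200, tol=1e-10):
    """Minimal MLPCA sketch for uncorrelated, heteroscedastic errors.
    `var` holds the error variance of each element of X. Returns a
    rank-`rank` estimate of X minimizing the maximum likelihood
    objective sum((X - Xhat)**2 / var)."""
    m, n = X.shape
    # Initialize loadings from an ordinary SVD of the data.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:rank].T                       # n x rank loadings
    obj_old = np.inf
    for _ in range(n_iter):
        # Step 1: loadings fixed, weighted regression for each row's scores.
        T = np.empty((m, rank))
        for i in range(m):
            w = 1.0 / var[i]              # weights for row i
            A = V * w[:, None]
            T[i] = np.linalg.solve(V.T @ A, A.T @ X[i])
        # Step 2: scores fixed, weighted regression for each column's loadings.
        for j in range(n):
            w = 1.0 / var[:, j]           # weights for column j
            A = T * w[:, None]
            V[j] = np.linalg.solve(T.T @ A, A.T @ X[:, j])
        Xhat = T @ V.T
        obj = np.sum((X - Xhat) ** 2 / var)
        if abs(obj_old - obj) < tol * obj:
            break
        obj_old = obj
    return Xhat, obj
```

Each alternating step is an exact weighted least squares solve, so the objective decreases monotonically; with uniform error variances the fixed point coincides with ordinary PCA.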


Neuron | 2009

Expression of Long-Term Plasticity at Individual Synapses in Hippocampus Is Graded, Bidirectional, and Mainly Presynaptic: Optical Quantal Analysis

Ryosuke Enoki; Yi-ling Hu; David C. Hamilton; Alan Fine

Key aspects of the expression of long-term potentiation (LTP) and long-term depression (LTD) remain unresolved despite decades of investigation. Alterations in postsynaptic glutamate receptors are believed to contribute to the expression of various forms of LTP and LTD, but the relative importance of presynaptic mechanisms is controversial. In addition, while aggregate synaptic input to a cell can undergo sequential and graded (incremental) LTP and LTD, it has been suggested that individual synapses may only support binary changes between initial and modified levels of strength. We have addressed these issues by combining electrophysiological methods with two-photon optical quantal analysis of plasticity at individual active (non-silent) Schaffer collateral synapses on CA1 pyramidal neurons in acute slices of hippocampus from adolescent rats. We find that these synapses sustain graded, bidirectional long-term plasticity. Remarkably, changes in potency are small and insignificant; long-term plasticity at these synapses is expressed overwhelmingly via presynaptic changes in reliability of transmitter release.


Technometrics | 1985

A Quadratic Design Criterion for Precise Estimation in Nonlinear Regression Models

David C. Hamilton; Donald G. Watts

D-optimal experimental designs for precise estimation in nonlinear regression models are obtained by minimizing the determinant of the approximate variance–covariance matrix of the parameter estimates. This determinant may not give a true indication of the volume of a joint inference region for the parameters, however, because of intrinsic and parameter-effects nonlinearity. In this article, we investigate experimental designs that minimize a second-order volume approximation. Unlike D-optimal designs, these designs depend on the noise and confidence levels, and on the parameterization used, and when used sequentially, quadratic designs depend on the residuals from previous experiments and on the type of inference. Quadratic designs appear to be less sensitive to variations in initial parameter values used for design.
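For context, the first-order D-optimality criterion that the quadratic criterion refines can be computed directly from the model Jacobian. The sketch below uses an illustrative two-parameter exponential decay model and a brute-force grid search; neither the model nor the grid is taken from the paper.

```python
import numpy as np
from itertools import combinations

def d_criterion(xs, theta):
    """D-optimality criterion det(F'F) for the illustrative decay model
    eta(x; theta) = theta[0] * exp(-theta[1] * x). Maximizing det(F'F)
    is equivalent to minimizing the determinant of the approximate
    variance-covariance matrix of the parameter estimates."""
    x = np.asarray(xs, dtype=float)
    # Jacobian of the model function with respect to the parameters.
    F = np.column_stack([np.exp(-theta[1] * x),
                         -theta[0] * x * np.exp(-theta[1] * x)])
    return np.linalg.det(F.T @ F)

# Exhaustive search for the best 2-point design on a coarse grid.
# As the abstract notes, the design depends on the initial guesses.
theta0 = (1.0, 0.5)
grid = np.linspace(0.1, 10, 100)
best = max(combinations(grid, 2), key=lambda d: d_criterion(d, theta0))
print(best)  # the locally D-optimal pair of design points
```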


The American Statistician | 1987

Sometimes R² > r²(y,x₁) + r²(y,x₂): Correlated Variables Are Not Always Redundant

David C. Hamilton

An extreme example of regression on two variables is presented in which there is almost no correlation between y and x₁ or between y and x₂, yet the coefficient of determination is 1. This example illustrates the often counterintuitive nature of multivariate relationships and is also relevant to discussions of multicollinearity and variable selection techniques.
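A small simulation in the spirit of the example (the data here are synthetic, not the paper's) makes the point concrete: let y be the small difference of two highly correlated predictors.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.05 * rng.normal(size=200)   # x1 and x2 are highly correlated
y = x1 - x2                             # y is driven by the small difference

# Individual correlations with y are tiny...
r1 = np.corrcoef(y, x1)[0, 1]
r2 = np.corrcoef(y, x2)[0, 1]

# ...yet regressing y on both predictors gives R^2 = 1,
# because y lies exactly in the span of (x1, x2).
X = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
R2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
print(r1**2 + r2**2, R2)   # e.g. ~0.003 versus 1.0
```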


The Journal of Pediatrics | 1989

Effect of high-dose vitamin D supplementation on radiographically detectable bone disease of very low birth weight infants

Jacquelyn Evans; Alexander C. Allen; Dora A. Stinson; David C. Hamilton; B. St. John Brown; Michael Vincer; May Raad; Caren M. Gundberg; David E. C. Cole

To test the hypothesis that high-dose vitamin D2 supplementation would result in a lower incidence of radiographically detectable bone disease, we randomly assigned 40 very low birth weight infants to a control group who received vitamin D2 in a dosage of 400 IU/day and 41 to an experimental group who received a dosage of 2000 IU/day. After 6 weeks, radiographs from all infants were scored blindly for degree of radiographic bone disease, and serum osteocalcin and 25-hydroxyvitamin D levels were measured. Mean vitamin D intake was 360 ± 141 (SD) IU/day in the control group and 2170 ± 144 (SD) IU/day in the experimental group. Median 6-week serum 25-hydroxyvitamin D levels were 24 ng/ml (range 3 to 60 ng/ml) in the control group and 68 ng/ml (range 9 to 150 ng/ml) in the experimental group (p < 0.001). Overall, 20% of the infants had evidence of moderate radiographic bone disease and only 2% were severely affected. The radiographic bone score (median = 2.5) and serum osteocalcin concentration (mean = 21.7 ± 8.7 ng/ml) in the control subjects did not differ significantly from those in the experimental group (median bone score = 2.0; mean osteocalcin level = 24.1 ± 7.9 ng/ml). Although there may be a subset of very low birth weight infants who would benefit from high doses of vitamin D, we conclude that no generalized clinical improvement can be attributed to this regimen alone.


Annals of Human Genetics | 2009

Heterogeneous Disease Modeling for Hardy‐Weinberg Disequilibrium in Case‐Control Studies: Application to Renal Stones and Calcium‐Sensing Receptor Polymorphisms

David C. Hamilton; Vaneeta K. Grover; C. A. Smith; David E. C. Cole

Renal stone formation due to hypercalciuria is a relatively common disorder with clear evidence for genetic predisposition, but cryptic phenotypic heterogeneity has hampered identification of candidate genes. The R990G single‐nucleotide polymorphism (SNP) of the calcium sensing receptor (CASR) gene has been associated with hypercalciuria in stone formers and shows the appropriate functional phenotype in cell culture. In our preliminary association analysis of a case‐control cohort, however, we observed significant Hardy‐Weinberg disequilibrium (HWD) for the cases (n = 223), but not controls (n = 676) at the R990G locus, pointing us toward the general disease model incorporating HWD. Because there is an adjacent CASR SNP, A986S, which is in negative linkage disequilibrium with R990G, we extended the general disease model to enable testing of a two‐site hypothesis. In our data set, there is no lack of fit (P = 0.345) for the single‐locus model for the R990G genotype, and likelihood ratio testing favors a recessive effect with an eight‐fold increase in risk (P < 0.001) for GG homozygotes, relative to wild‐type, based on a population prevalence of 2%. Addition of the A986S genotype provides no additional information either by itself or when included in our two‐site model.
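For readers unfamiliar with Hardy-Weinberg disequilibrium screening, the sketch below shows the generic one-degree-of-freedom chi-square test applied to biallelic genotype counts. It is the kind of check that would flag HWD in cases, not the paper's likelihood-based disease model, and the counts are invented for illustration.

```python
import numpy as np
from scipy.stats import chi2

def hwe_chisq(n_AA, n_Aa, n_aa):
    """Pearson chi-square test of Hardy-Weinberg equilibrium from
    genotype counts at a biallelic locus (1 degree of freedom)."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)      # allele frequency of A
    exp = np.array([n * p**2, 2 * n * p * (1 - p), n * (1 - p)**2])
    obs = np.array([n_AA, n_Aa, n_aa])
    stat = np.sum((obs - exp) ** 2 / exp)
    return stat, chi2.sf(stat, df=1)

# Run separately in cases and controls; HWD in cases but not controls
# is the pattern that motivates a disease model incorporating HWD.
print(hwe_chisq(30, 80, 113))            # illustrative counts only
```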


Technometrics | 1995

A comparison of methods for univariate and multivariate acceptance sampling by variables

David C. Hamilton; Mary Lesperance

We investigate acceptance-sampling methods for univariate and multivariate normal data in which the quality of the process relative to specification limits is measured by an estimate of the proportion nonconforming, and the mean and variance are unknown. A maximum likelihood method is developed and compared with existing approaches to acceptance sampling. This method is applied to a problem involving government regulation of the gas industry in Canada. The justification for basing national and international standards on the minimum variance unbiased estimator of the proportion nonconforming is examined.
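The quantity on which the comparison turns is the estimated proportion nonconforming. Below is a minimal sketch of the maximum likelihood (plug-in) estimate for univariate normal data with unknown mean and variance; acceptance constants and full sampling plans are beyond this sketch, and the numbers are illustrative.

```python
import numpy as np
from scipy.stats import norm

def ml_prop_nonconforming(x, lower=None, upper=None):
    """Maximum likelihood (plug-in) estimate of the proportion of a
    normal process falling outside specification limits, with mean
    and variance unknown."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    sigma = x.std(ddof=0)                # ML variance estimate uses 1/n
    p = 0.0
    if lower is not None:
        p += norm.cdf((lower - mu) / sigma)
    if upper is not None:
        p += norm.sf((upper - mu) / sigma)
    return p

# A plan would accept the lot if this estimate falls below the maximum
# allowable proportion nonconforming (all numbers illustrative).
sample = np.random.default_rng(1).normal(10.0, 0.4, size=30)
print(ml_prop_nonconforming(sample, lower=9.0, upper=11.0))
```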


Annals of Human Genetics | 2004

Standardizing a Composite Measure of Linkage Disequilibrium

David C. Hamilton; David E. C. Cole

The maximum and minimum are obtained for a composite measure of linkage disequilibrium used with genotypic data when the phase of double heterozygotes cannot be determined. These bounds are used to standardize the composite measure in the same way used for D′, the standardized gametic measure of linkage disequilibrium. Standardization produces a measure which lies between −1 and 1, and allows comparison of linkage disequilibrium between populations. The method is illustrated using two loci in the CASR gene.
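The standardization idea is easiest to see in the phased (gametic) case, where dividing D by its attainable extreme gives Lewontin's D′. The sketch below shows only that simpler case; the bounds for the composite genotypic measure derived in the paper are more involved and are not reproduced here.

```python
def lewontin_d_prime(p_AB, p_A, p_B):
    """Lewontin's D' for phased (gametic) data: divide D by its maximum
    attainable magnitude given the allele frequencies, so the result
    lies between -1 and 1 and is comparable across populations."""
    D = p_AB - p_A * p_B
    if D >= 0:
        bound = min(p_A * (1 - p_B), (1 - p_A) * p_B)
    else:
        bound = min(p_A * p_B, (1 - p_A) * (1 - p_B))
    return D / bound if bound > 0 else 0.0

print(lewontin_d_prime(0.40, 0.5, 0.6))  # D = 0.10, D' = 0.5
```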


Reliability Engineering & System Safety | 2010

Statistical analysis of competing risks models

Ammar M. Sarhan; David C. Hamilton; Bruce Smith

Statistical inference for the parameters in three competing risks models is considered in this paper. It is assumed that there are more than two causes of failure. The maximum likelihood procedure is used to derive point and asymptotic confidence interval estimates of the unknown parameters. The risks due to each cause of failure are investigated. Two sets of data are analyzed in order to (1) illustrate how the model can be applied and (2) test the hypothesis that the causes of failure follow the Chen distribution rather than the exponential or Weibull distributions.
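As a baseline, the exponential competing risks model referenced in the hypothesis test has closed-form maximum likelihood estimates: each cause-specific rate is the number of failures from that cause divided by the total time on test. A minimal sketch with invented data (the paper's richer Chen and Weibull fits require numerical optimization and are not shown):

```python
import numpy as np

def exp_competing_risks_mle(times, causes, n_causes):
    """Closed-form MLEs for competing risks with independent exponential
    latent failure times: the likelihood is prod_i lambda_{c_i} *
    exp(-sum_k lambda_k * t_i), so lambda_k = d_k / (total time on test),
    where d_k counts failures attributed to cause k."""
    times = np.asarray(times, dtype=float)
    causes = np.asarray(causes)
    total_time = times.sum()
    return np.array([(causes == k).sum() / total_time
                     for k in range(1, n_causes + 1)])

# Each unit fails at times[i] from cause causes[i] (1-based labels).
t = [2.1, 0.7, 3.4, 1.2, 5.0, 0.9]
c = [1, 2, 1, 3, 2, 1]
print(exp_competing_risks_mle(t, c, n_causes=3))
```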


Computational Statistics & Data Analysis | 2011

The bivariate generalized linear failure rate distribution and its multivariate extension

Ammar M. Sarhan; David C. Hamilton; Bruce Smith; Debasis Kundu

The two-parameter linear failure rate distribution has been used quite successfully to analyze lifetime data. Recently, a new three-parameter distribution, known as the generalized linear failure rate distribution, has been introduced by exponentiating the linear failure rate distribution. The generalized linear failure rate distribution is a very flexible lifetime distribution whose probability density function can take different shapes and whose hazard function can be increasing, decreasing, or bathtub shaped. The main aim of this paper is to introduce a bivariate generalized linear failure rate distribution, whose marginals are generalized linear failure rate distributions. It is obtained using the same approach as was adopted to obtain the Marshall-Olkin bivariate exponential distribution. Different properties of this new distribution are established. The bivariate generalized linear failure rate distribution has five parameters and the maximum likelihood estimators are obtained using the EM algorithm. A data set is analyzed for illustrative purposes. Finally, some generalizations to the multivariate case are proposed.
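The Marshall-Olkin construction mentioned in the abstract couples two lifetimes through a shared shock. The sketch below simulates the classic bivariate exponential case; the paper applies the analogous device to generalized linear failure rate components, which is not reproduced here.

```python
import numpy as np

def marshall_olkin_biv_exp(lam1, lam2, lam3, size, rng=None):
    """Simulate the Marshall-Olkin bivariate exponential distribution:
    three independent exponential 'shocks' U1, U2, U3, with
    X1 = min(U1, U3) and X2 = min(U2, U3). The shared shock U3 induces
    dependence, including a positive probability that X1 == X2."""
    rng = rng or np.random.default_rng()
    u1 = rng.exponential(1 / lam1, size)
    u2 = rng.exponential(1 / lam2, size)
    u3 = rng.exponential(1 / lam3, size)
    return np.minimum(u1, u3), np.minimum(u2, u3)

x1, x2 = marshall_olkin_biv_exp(1.0, 1.5, 0.5, size=10_000)
print(np.corrcoef(x1, x2)[0, 1])  # positive dependence from the shared shock
```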
