Publication


Featured research published by Charles Hagwood.


Journal of Aerosol Science | 1996

Novel method to classify aerosol particles according to their mass-to-charge ratio—Aerosol particle mass analyser

Kensei Ehara; Charles Hagwood; Kevin J. Coakley

A new method to classify aerosol particles according to their mass-to-charge ratio is proposed. This method works by balancing the electrostatic and centrifugal forces which act on particles introduced into a thin annular space formed between rotating cylindrical electrodes. Particles having a mass-to-charge ratio lying in a certain narrow range are taken out continuously as an aerosol suspension. A theoretical framework has been developed to calculate the transfer function which is defined as the ratio of the exiting particle flux to the entering particle flux. A similarity rule has been derived which states that a single nondimensional constant determines the shape of the transfer function. To examine the feasibility of the proposed principle, a prototype classifier was constructed, and the mass distribution of monodisperse particles nominally 0.309 μm in diameter was measured. The peak structures corresponding to singly, doubly, and triply charged particles were identified in the experimental spectra. The difference between theory and experiment in the peak location for the singly charged particles was about 6.5% in terms of mass, or 2.3% in terms of diameter.
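The force balance at the heart of the method can be sketched numerically. In the sketch below, all instrument parameters (voltage, electrode radii, rotation rate) are hypothetical round numbers, not specifications of the prototype; the classified specific mass follows from equating the electrostatic force qE with the centrifugal force m omega^2 r.

```python
# Force balance in an aerosol particle mass classifier: a particle of
# mass m and charge q at radius r in field E between rotating cylinders
# is transmitted when q*E ~ m*omega**2*r, so the selected specific mass
# is m/q = E / (omega**2 * r). All numbers below are illustrative.

import math

def selected_mass_to_charge(voltage, r_inner, r_outer, omega):
    """Specific mass m/q (kg/C) selected at the mid-gap radius (idealized)."""
    r_c = 0.5 * (r_inner + r_outer)                     # centerline radius (m)
    E = voltage / (r_c * math.log(r_outer / r_inner))   # cylindrical field (V/m)
    return E / (omega ** 2 * r_c)

# Example: hypothetical 1000 V across 50 mm / 52 mm electrodes at 500 rad/s
mq = selected_mass_to_charge(1000.0, 0.050, 0.052, 500.0)
```

Doubling the voltage doubles the field and hence the selected mass-to-charge ratio, which is the knob the classifier scans.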


Instrumentation and Measurement Technology Conference | 1989

The effects of timing jitter in sampling systems

T.M. Souders; Donald R. Flach; Charles Hagwood; G.L. Yang

Timing jitter generally causes a bias (systematic error) in the amplitude estimates of sampled waveforms. Equations are developed for computing the bias in both the time and frequency domains. Two principal estimators are considered: the sample mean and the so-called Markov estimator used in some equivalent-time sampling systems. Examples are given using both real and simulated data. It is shown that the bias that results from using the sample mean as an estimator can be approximated in the frequency domain by a simple filter function. The Markov estimator is shown to converge asymptotically to the population median. It is therefore an unbiased estimator for monotonic waveforms sampled with jitter distributions having a median of zero.
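The bias mechanism can be illustrated with a minimal simulation. The waveform, jitter distribution, and sample size below are hypothetical, and a plain sample median stands in for the Markov estimator (which the paper shows converges to the population median): the sample mean is biased for a curved waveform under timing jitter, while the median tracks the jitter-free value when the waveform is monotonic and the jitter has zero median.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(t):
    return t ** 2                      # curved waveform, monotonic for t > 0

t0 = 1.0                               # nominal sampling instant
jitter = rng.uniform(-0.1, 0.1, size=200_000)   # zero-median timing jitter

samples = f(t0 + jitter)
mean_est = samples.mean()              # biased up by ~ Var(jitter) * f''/2
median_est = np.median(samples)        # ~ f(t0) for monotonic f, zero-median jitter

bias_mean = abs(mean_est - f(t0))      # systematic error of the sample mean
bias_median = abs(median_est - f(t0))  # only small sampling error remains
```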


Journal of Research of the National Institute of Standards and Technology | 2006

Measurement of 100 nm and 60 nm Particle Standards by Differential Mobility Analysis

George W. Mulholland; Michelle K. Donnelly; Charles Hagwood; Scott R. Kukuck; Vincent A. Hackley; David Y.H. Pui

The peak particle size and expanded uncertainties (95 % confidence interval) for two new particle calibration standards are measured as 101.8 nm ± 1.1 nm and 60.39 nm ± 0.63 nm. The particle samples are polystyrene spheres suspended in filtered, deionized water at a mass fraction of about 0.5 %. The size distribution measurements of aerosolized particles are made using a differential mobility analyzer (DMA) system calibrated using SRM® 1963 (100.7 nm polystyrene spheres). An electrospray aerosol generator was used for generating the 60 nm aerosol to almost eliminate the generation of multiply charged dimers and trimers and to minimize the effect of non-volatile contaminants increasing the particle size. The testing for the homogeneity of the samples and for the presence of multimers using dynamic light scattering is described. The use of the transfer function integral in the calibration of the DMA is shown to reduce the uncertainty in the measurement of the peak particle size compared to the approach based on the peak in the concentration vs. voltage distribution. A modified aerosol/sheath inlet, recirculating sheath flow, a high ratio of sheath flow to the aerosol flow, and accurate pressure, temperature, and voltage measurements have increased the resolution and accuracy of the measurements. A significant consideration in the uncertainty analysis was the correlation between the slip correction of the calibration particle and the measured particle. Including the correlation reduced the expanded uncertainty from approximately 1.8 % of the particle size to about 1.0 %. The effect of non-volatile contaminants in the polystyrene suspensions on the peak particle size and the uncertainty in the size is determined. The full size distributions for both the 60 nm and 100 nm spheres are tabulated and selected mean sizes including the number mean diameter and the dynamic light scattering mean diameter are computed. 
The use of these particles for calibrating DMAs and for making deposition standards to be used with surface scanning inspection systems is discussed.
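The effect of correlation on the expanded uncertainty can be illustrated with the standard law of propagation of uncertainty for two correlated inputs whose errors partially cancel, as they do when the calibration particle and the measured particle share a slip correction. The 0.6 % component values and the 0.9 correlation below are hypothetical, not the paper's budget.

```python
import math

def combined_u(u_cal, u_meas, rho):
    """Combined relative standard uncertainty for a result depending on
    the difference of two correlated inputs (GUM propagation, in %)."""
    return math.sqrt(u_cal**2 + u_meas**2 - 2 * rho * u_cal * u_meas)

# Hypothetical 0.6 % components: strong positive correlation between the
# calibration-particle and measured-particle slip corrections shrinks u.
u_uncorr = combined_u(0.6, 0.6, 0.0)
u_corr   = combined_u(0.6, 0.6, 0.9)
```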


Aerosol Science and Technology | 1999

The DMA Transfer Function with Brownian Motion: A Trajectory/Monte-Carlo Approach

Charles Hagwood

The transfer function for the Differential Mobility Analyzer (DMA) is derived based on particle trajectories for both nondiffusing and diffusing particles. The effect of particle diffusion is assessed by using a Monte-Carlo method for particles of sizes 1, 3, 10, 30, and 100 nm. This approach includes both the effect of wall losses and axial diffusion. The range of validity of the Stolzenburg analysis is assessed by comparing his transfer function, the peak of his transfer function, and its dimensionless width with similar calculations based on the Monte-Carlo method. For particle sizes smaller than 10 nm, the Monte-Carlo method indicates large wall losses, which result in a reduction in the peak of the transfer function by as much as a factor of 10 to 30, sensitivity to the flow field, and skewness of the transfer function. It is shown that Stolzenburg's approximate formula for the standard deviation of the width of the transfer function agrees with Monte-Carlo simulations for particle sizes of 3 nm a...
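For reference, the nondiffusing transfer function that such Monte-Carlo results are compared against is triangular in the normalized mobility. A sketch of the simplified symmetric-flow special case (equal aerosol and sampling flows), with beta the aerosol-to-sheath flow ratio:

```python
def triangular_transfer(z_ratio, beta):
    """Nondiffusing DMA transfer function for balanced (symmetric) flows.
    beta: aerosol-to-sheath flow ratio; z_ratio: particle electrical
    mobility Z normalized by the centroid mobility Z*."""
    return max(0.0, 1.0 - abs(z_ratio - 1.0) / beta)

# Unit transmission at the centroid, zero outside |Z/Z* - 1| >= beta
vals = [triangular_transfer(z, 0.1) for z in (0.85, 0.95, 1.0, 1.05, 1.15)]
```

Diffusion broadens and lowers this triangle; the wall losses reported above for sub-10 nm particles suppress its peak far below the value of 1 shown here.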


Analytical Chemistry | 2013

Development of a Standard Reference Material for Metabolomics Research

Karen W. Phinney; Guillaume Ballihaut; Mary Bedner; Brandi S. Benford; Johanna E. Camara; Steven J. Christopher; W. Clay Davis; Nathan G. Dodder; Gauthier Eppe; Brian E. Lang; Stephen E. Long; Mark S. Lowenthal; Elizabeth A. McGaw; Karen E. Murphy; Bryant C. Nelson; Jocelyn L. Prendergast; Jessica L. Reiner; Catherine A. Rimmer; Lane C. Sander; Michele M. Schantz; Katherine E. Sharpless; Lorna T. Sniegoski; Susan S.-C. Tai; Jeanice M. Brown Thomas; Thomas W. Vetter; Michael J. Welch; Stephen A. Wise; Laura J. Wood; William F. Guthrie; Charles Hagwood

The National Institute of Standards and Technology (NIST), in collaboration with the National Institutes of Health (NIH), has developed a Standard Reference Material (SRM) to support technology development in metabolomics research. SRM 1950 Metabolites in Human Plasma is intended to have metabolite concentrations that are representative of those found in adult human plasma. The plasma used in the preparation of SRM 1950 was collected from both male and female donors, and donor ethnicity targets were selected based upon the ethnic makeup of the U.S. population. Metabolomics research is diverse in terms of both instrumentation and scientific goals. This SRM was designed to apply broadly to the field, not toward specific applications. Therefore, concentrations of approximately 100 analytes, including amino acids, fatty acids, trace elements, vitamins, hormones, selenoproteins, clinical markers, and perfluorinated compounds (PFCs), were determined. Value assignment measurements were performed by NIST and the Centers for Disease Control and Prevention (CDC). SRM 1950 is the first reference material developed specifically for metabolomics research.


MRS Proceedings | 1994

Interfacial zone percolation in concrete: effects of interfacial zone thickness and aggregate shape

Dale P. Bentz; Jeonghyun Hwang; Charles Hagwood; Edward J. Garboczi; Kenneth A. Snyder; N.R. Buenfeld; Karen L. Scrivener

Previously, a hard core/soft shell computer model was developed to simulate the overlap and percolation of the interfacial transition zones surrounding each aggregate in a mortar or concrete. The aggregate particles were modelled as spheres with a size distribution representative of a real mortar or concrete specimen. Here, the model has been extended to investigate the effects of aggregate shape on interfacial transition zone percolation, by modelling the aggregates as hard ellipsoids, which gives a dynamic range of shapes from plates to spheres, to fibers. For high performance concretes, the interfacial transition zone thickness will generally be reduced, which will also affect their percolation properties. This paper presents results from a study of the effects of interfacial transition zone thickness and aggregate shape on these percolation characteristics.
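The connectivity test at the core of such a model can be sketched with union-find. The sketch below is a 2D toy analogue (discs instead of the model's spheres or ellipsoids), and all geometry is hypothetical: cores are hard, shells of a given thickness overlap, and percolation here is read as all shells forming one cluster.

```python
def shells_connected(pts, core_r, shell_t):
    """True when the soft shells (thickness shell_t around hard cores of
    radius core_r) of all discs form a single connected cluster.
    2D toy analogue of the hard core/soft shell model."""
    n = len(pts)
    parent = list(range(n))

    def find(i):                        # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    reach = 2 * (core_r + shell_t)      # shells overlap within this distance
    for i in range(n):
        for j in range(i + 1, n):
            dx = pts[i][0] - pts[j][0]
            dy = pts[i][1] - pts[j][1]
            if dx * dx + dy * dy < reach * reach:
                parent[find(i)] = find(j)

    return len({find(i) for i in range(n)}) == 1

# Three collinear cores 0.1 apart: thick shells chain up, thin ones do not
line3 = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0)]
thick = shells_connected(line3, core_r=0.02, shell_t=0.04)
thin = shells_connected(line3, core_r=0.02, shell_t=0.005)
```

Shrinking the shell thickness, as in high performance concretes, breaks the cluster apart, which is the percolation effect the paper quantifies.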


Metrologia | 1996

Real-time control of a measurement process

Raghu N. Kacker; Nien Fan Zhang; Charles Hagwood

The conventional industrial practice of correcting (recalibrating) measuring instruments according to a fixed schedule (calibration interval) may waste money when the schedule is too tight or may provide a false sense of control when the schedule is too relaxed. Also, this approach may not generate data on real-time measurement errors that are crucial for drawing management's attention to measurement concerns. We propose that the measurement process be interrupted according to an economically sensible schedule to check (interim test) the real-time errors with well-characterized check standards. When the observed error with the check standard exceeds an economic control limit, the measuring instrument should be corrected; otherwise correction is not needed. The proposed approach to limiting the uncertainty of a measurement process is simple, sensible, and generic. More importantly, it saves money by limiting both the loss due to measurement error and the cost of control.
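The interim-test decision rule reduces to a threshold comparison. A sketch with hypothetical check-standard readings and control limit (the paper's contribution is choosing the limit and schedule economically, not this comparison itself):

```python
def needs_correction(observed_error, control_limit):
    """Interim test: correct the instrument only when the check-standard
    error exceeds the economic control limit (same units for both)."""
    return abs(observed_error) > control_limit

# Hypothetical check-standard errors against a control limit of 0.5 units
decisions = [needs_correction(e, 0.5) for e in (0.1, -0.3, 0.7, -0.6)]
```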


Aerosol Science and Technology | 1995

Stochastic modeling of a new spectrometer

Charles Hagwood; Kevin J. Coakley; Antoine Negiz; Kensei Ehara

A new spectrometer for classifying aerosol particles according to specific masses is being considered (Ehara et al. 1995). The spectrometer consists of concentric cylinders which rotate. The instrument is designed so that an electric field is established between the cylinders. Thus, aerosol particles injected into the spectrometer are subjected to a centrifugal force and an electric force. Depending on the balance between these two forces, as well as Brownian motion, charged particles either pass through the space between the cylinders or stick to either cylinder wall. Particles which pass through are detected. Given the rotation rate, voltage drop and physical dimensions of the device, we calculate the probability of detection in terms of particle density, diameter and charge. This is the transfer function. In this work, the focus is on situations where Brownian motion is significant. To solve for the transfer function, the trajectory of a particle in the spectrometer is modeled with a stochastic differential equation. Laminar flow is assumed. Further, attention is restricted to spherical particles with uniform density. The equation is solved using both numerical and Monte Carlo methods. The agreement between methods is excellent.
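The Monte Carlo side of such a calculation can be sketched as an Euler-Maruyama integration of the stochastic differential equation for a radial trajectory, counting particles that avoid both walls. The sketch collapses the geometry to a unit gap, and the drift and diffusion values are hypothetical, not derived from any instrument.

```python
import random

def detection_probability(drift, diff, r_in=0.0, r_out=1.0,
                          n_particles=2000, n_steps=200, dt=1e-3, seed=7):
    """Fraction of particles whose radial trajectory (net drift plus
    Brownian motion, Euler-Maruyama steps) stays between the walls."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(n_particles):
        r = 0.5 * (r_in + r_out)          # injected at mid-gap
        alive = True
        for _ in range(n_steps):
            r += drift * dt + (2 * diff * dt) ** 0.5 * rng.gauss(0, 1)
            if r <= r_in or r >= r_out:   # particle sticks to a wall
                alive = False
                break
        detected += alive
    return detected / n_particles

# Balanced forces (zero net drift): stronger Brownian motion loses
# more particles to the walls and lowers the transfer function.
p_weak = detection_probability(drift=0.0, diff=0.01)
p_strong = detection_probability(drift=0.0, diff=1.0)
```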


Computer Vision and Pattern Recognition | 2015

A fast algorithm for elastic shape distances between closed planar curves

Günay Doğan; Javier Bernal; Charles Hagwood

Effective computational tools for shape analysis are needed in many areas of science and engineering. We address this and propose a new fast iterative algorithm to compute the elastic geodesic distance between shapes of closed planar curves. The original algorithm for this has cubic time complexity with respect to the number of nodes per curve. Hence it is not suitable for large shape data sets. We aim for large-scale shape analysis and thus propose an iterative algorithm based on the original one but with quadratic time complexity. In practice, we observe subquadratic, almost linear running times, and that our algorithm scales very well with large numbers of nodes. The key to our algorithm is the decoupling of the optimization for the starting point and rotation from that of the reparametrization, and the development of fast dynamic programming and iterative nonlinear constrained optimization algorithms that work in tandem to compute optimal reparametrizations fast.
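The rotation half of the decoupling has a closed form once the parametrization is fixed: a planar Procrustes step. A sketch of that step alone (not the authors' implementation, which alternates it with a dynamic-programming reparametrization):

```python
import math

def best_rotation_angle(P, Q):
    """Angle of the planar rotation R minimizing sum |R p_i - q_i|^2
    over matched point sequences P, Q (closed-form Procrustes step)."""
    s = sum(px * qx + py * qy for (px, py), (qx, qy) in zip(P, Q))  # cos term
    t = sum(px * qy - py * qx for (px, py), (qx, qy) in zip(P, Q))  # sin term
    return math.atan2(t, s)

# A square rotated by 30 degrees is recovered exactly
sq = [(1, 0), (0, 1), (-1, 0), (0, -1)]
a = math.radians(30)
rot = [(x * math.cos(a) - y * math.sin(a),
        x * math.sin(a) + y * math.cos(a)) for x, y in sq]
theta = best_rotation_angle(sq, rot)
```

Because this step is closed-form and cheap, pulling it (and the starting-point search) out of the reparametrization loop is what drops the overall complexity from cubic to quadratic.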


IEEE Transactions on Medical Imaging | 2012

Evaluation of Segmentation Algorithms on Cell Populations Using CDF Curves

Charles Hagwood; Javier Bernal; Michael Halter; John T. Elliott

Cell segmentation is a critical step in the analysis pipeline for most imaging cytometry experiments, and evaluating the performance of segmentation algorithms is important for guiding algorithm selection. Four popular algorithms are evaluated based on their cell segmentation performance. Because segmentation involves classifying pixels as belonging to regions within the cell or to the background, these algorithms are evaluated based on their total misclassification error. Misclassification error is particularly relevant in the analysis of quantitative descriptors of cell morphology involving pixel counts, such as projected area, aspect ratio, and diameter. Because the cumulative distribution function completely captures the stochastic properties of a population of misclassification errors, it is used to compare segmentation performance.
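The CDF comparison can be sketched on synthetic data. The error values below are hypothetical; each is a per-cell fraction of misclassified pixels, and one algorithm's empirical CDF dominating another's means it puts more of the population under every error threshold.

```python
import bisect

def ecdf(samples):
    """Empirical CDF of a sample, returned as a callable F(x)."""
    xs = sorted(samples)
    n = len(xs)
    return lambda x: bisect.bisect_right(xs, x) / n

# Hypothetical per-cell misclassification errors for two segmentation
# algorithms evaluated on the same cell population:
errors_a = [0.02, 0.03, 0.05, 0.04, 0.06]
errors_b = [0.08, 0.10, 0.07, 0.12, 0.09]

F_a, F_b = ecdf(errors_a), ecdf(errors_b)

# Algorithm A stochastically dominates B: at every error threshold,
# a larger fraction of A's cells lies at or below the threshold.
dominance = all(F_a(x / 100) >= F_b(x / 100) for x in range(0, 21))
```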

Collaboration

Top co-authors of Charles Hagwood:

Javier Bernal | National Institute of Standards and Technology
Kevin J. Coakley | National Institute of Standards and Technology
Kensei Ehara | National Institute of Advanced Industrial Science and Technology
Günay Doğan | National Institute of Standards and Technology
Anirudha Sahoo | National Institute of Standards and Technology
Erica D. Kuligowski | National Institute of Standards and Technology
George W. Mulholland | National Institute of Standards and Technology
John T. Elliott | National Institute of Standards and Technology
Michael Halter | National Institute of Standards and Technology
Paul A. Reneke | National Institute of Standards and Technology